This paper discusses the stock size selection problem (Chambers and Dyson, 1976), which is of relevance in the float glass industry. Given a fixed integer k, the task is to select k blank (stock) sizes from which a given list of orders can be cut with minimum total wastage.
The cutting and packing of stock are important problems in the metal, paper, wood, and glass industries (amongst others). Consequently, many researchers have treated these as mathematical optimization problems and derived good algorithms for their solution. In particular, the stock size selection problem of Chambers and Dyson arises naturally in the glass industry.
A typical glass manufacturing plant receives hundreds of different sized orders per year for a single material and thickness of glass. A single order size will typically need to be cut hundreds, thousands, or tens of thousands of times to satisfy customer demand. In the production of float glass, a continuous ribbon of flat glass is produced in the manufacturing plant. This ribbon is cut online into large sizes (blanks) that are stored and cut as needed into specific order sizes. This two-stage cutting process is carried out for practical reasons: it is costly and sometimes impossible to cut the many different order sizes directly on the float line, and it is also sometimes infeasible to store the many different order sizes in advance. Given expected order sizes and quantities, the manufacturer must therefore decide which blank sizes to produce.
In this paper, we study the stock size selection problem as it applies to a local South African float glass manufacturing plant. Given a list of orders and a small positive integer k, the task is to select k blank sizes so that all orders can be cut from the chosen blanks with minimum total wastage.
The remainder of the paper is organised as follows. We first describe the problem and recast it as a discrete combinatorial optimization problem, drawing a parallel with the p-median problem. We then present three solution approaches (depth-first search, branch-and-bound, and NOMAD), analyse the size of the search space, and report computational results before concluding.
A float glass factory receives a roster of orders from numerous clients. These orders come in a wide variety of sizes and quantities: motor corporations may place large orders for windscreens, for example, while some clients may require only a single unique piece.
We make three key assumptions about the process of cutting orders from blanks in this paper.
(1) Each individual blank is only ever cut into copies of a single order size, as illustrated in the figure below.
(2) Each order will be cut out from only one blank type.
(3) The blank cannot be rotated before cutting the order.
The mapping relationship of many orders to one blank.
The partitioning of a blank into copies of an order.
The last point above is relevant in the industry, because for certain types of flat glass it is necessary to preserve the direction of the grain relative to the order dimensions.
Referring to the figure, the wastage incurred when an order is cut from a blank is the area of the blank left uncovered by the copies of the order.
Let there be a list of orders, each with a width, a height, and a demanded number of copies, and a set of candidate blank types, each with its own width and height. Cutting an order from a given blank type incurs a known wastage, and the total wastage is the sum, over all orders, of the wastage incurred by the blank type assigned to that order.
The above function must be minimized over all possible blank types as well as over assignments of orders to blanks for cutting. The latter values are in fact uniquely determined by the former. Notice that each term in the sum making up the objective function is independent of the others. The choice of blank for each order can therefore be made greedily: every order is simply assigned to whichever selected blank wastes the least glass.
Before considering methods to solve the problem, we give some thought to the search space, that is, the space of all possible dimensions of all the blanks. At first glance this appears to be a continuous, and hence infinite, space. However, since any leftover strip on a blank is pure waste, candidate blank dimensions can be restricted to those built up from the order dimensions. This set is discrete and bounded, and can easily be enumerated.
The figure below illustrates how candidate blanks are created from the orders.
Using orders to create possible blanks.
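One plausible reading of the blank generation procedure (an assumption on our part; the paper's exact rule is not stated here, and the machine limits `W_MIN` through `H_MAX` below are hypothetical values, not data from the paper): for each order, admit as candidate blanks all integer multiples of the order's width and height that fall within the machine's allowed blank dimensions.

```python
# Hypothetical machine limits on blank dimensions (mm); assumed, not from the paper.
W_MIN, W_MAX = 1000, 6000
H_MIN, H_MAX = 1000, 3210

def generate_blanks(orders):
    """For each order (w, h), collect blanks (a*w, b*h) within machine limits."""
    blanks = set()
    for (w, h) in orders:
        a = 1
        while a * w <= W_MAX:
            b = 1
            while b * h <= H_MAX:
                if a * w >= W_MIN and b * h >= H_MIN:
                    blanks.add((a * w, b * h))
                b += 1
            a += 1
    return sorted(blanks)

# Toy order list (widths and heights in mm); duplicates across orders collapse.
blanks = generate_blanks([(900, 600), (1500, 1000)])
```

Because each order contributes at most a constant number of admissible widths and heights, the candidate set grows at most linearly with the number of orders, which is consistent with the bound derived later in the paper.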
The problem is now fundamentally a combinatorial optimization problem: the objective is to optimally choose a finite, predefined number, k, of blanks from the generated candidates so that the total wastage is minimised.
It is possible to restructure the optimization problem introduced above into a suggestive form that is simple to write down and reason about. To achieve this, a 2-dimensional array, called the cost matrix, is constructed, with rows corresponding to orders and columns to candidate blanks.
We can write the matrix as C = [c_ij], where c_ij is the wastage incurred when order i is cut from blank j; if order i cannot be cut from blank j, c_ij is taken to be infinite.
A point in the search space corresponds to a selection of a set of k columns of the cost matrix; the objective value is the sum, over all rows, of the minimum entry within the selected columns.
Interestingly, this problem is similar to another combinatorial problem known as the p-median problem.
The objective of the p-median problem is to locate p facilities among a set of candidate sites so as to minimise the total cost of serving a set of demand points.
Like the glass cutting problem, which stores waste values in its cost matrix, the p-median problem stores in a cost matrix the cost of serving each demand point from each candidate facility site.
The p-median problem is typically stated as an integer program. One constraint fixes the number of located facilities at exactly p; further constraints require every demand point to be assigned, and only to a located facility.
The glass cutting problem in this paper shares many concepts with the p-median problem.
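For concreteness, a standard integer-programming statement of the p-median problem (notation ours, following the facility-location literature; d_ij denotes the cost of serving demand point i from site j) is:

```latex
\begin{align}
\min_{x,\,y}\quad & \sum_{i}\sum_{j} d_{ij}\, x_{ij} \\
\text{s.t.}\quad  & \sum_{j} x_{ij} = 1 \quad \forall i
                    && \text{(every demand point is served)} \\
                  & x_{ij} \le y_{j} \quad \forall i,\, j
                    && \text{(only located facilities may serve)} \\
                  & \sum_{j} y_{j} = p
                    && \text{(exactly $p$ facilities are located)} \\
                  & x_{ij},\, y_{j} \in \{0, 1\}
\end{align}
```

Here y_j = 1 if a facility is located at site j, and x_ij = 1 if demand point i is served from site j.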
Referring to the table below, the two problems can be compared concept by concept.
Concept comparison of the p-median problem and the glass cutting problem.

Concept                  p-median                                           Glass cutting
Objective                Minimise cost                                      Minimise wastage
Task                     Satisfy demand points with located facilities      Assign orders to be cut from generated blanks
Finite constraint        Can locate p facilities                            Can choose k blanks
Hard constraint          Every demand must be satisfied (see Constraint 8)  Every order must be satisfied
Integrality constraint   A demand point is served by one facility           An order is cut within one blank
                         and only one facility                              and only one blank
Recognizing its likeness to the p-median problem, we formulate the glass cutting problem analogously: exactly k blanks (columns) are selected, every order must be satisfied, and each order is assigned to exactly one selected blank.
It may be prudent to visualise the forthcoming optimization procedure for the glass cutting problem. Selection and assignment proceed via the cost matrix. Consider the following cost matrix instance, with 5 orders (seen by the number of rows) and 7 blanks (seen by the number of columns), and suppose that 3 blanks may be selected.
Looking at this matrix, the mathematical problem can be stated as follows: given the 5 x 7 cost matrix, choose 3 columns such that the sum, over all rows, of the minimum entry within the chosen columns is minimised.
The table below illustrates one candidate solution, in which columns 2, 4, and 5 are selected; the entry chosen in each row is marked with asterisks.
Graphical representation of cost matrix (chosen entries marked **).

Order   Blank 1   Blank 2   Blank 3   Blank 4   Blank 5   Blank 6   Blank 7   Result
1       5         10**      3         14        12        6         13        10**
2       4         3**       12        11        8         5         10        3**
3       2         9**       7         11        14        6         2         9**
4       1         12        5         20        11**      3         7         11**
5       7         11        10        2**       6         9         3         2**
Total                                                                         35**
Clearly, the choice of columns (i.e., blanks) determines the objective function value of the problem. The assignment component is performed automatically once the columns have been chosen, since the minimum coefficient in each row is taken over the selected columns. The next section explains the three optimization techniques used to choose the columns effectively.
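As a quick sanity check on the worked example (a sketch in Python on our part, not code from the paper), the objective for the selection of columns 2, 4, and 5 can be recomputed directly:

```python
# Cost matrix from the worked example: 5 orders (rows) x 7 blanks (columns).
C = [
    [5, 10,  3, 14, 12,  6, 13],
    [4,  3, 12, 11,  8,  5, 10],
    [2,  9,  7, 11, 14,  6,  2],
    [1, 12,  5, 20, 11,  3,  7],
    [7, 11, 10,  2,  6,  9,  3],
]
selected = [1, 3, 4]  # 0-indexed: columns 2, 4, and 5 of the table
# Each order (row) is cut from the cheapest selected blank.
row_choices = [min(row[j] for j in selected) for row in C]
total = sum(row_choices)  # 10 + 3 + 9 + 11 + 2 = 35
```

This reproduces the marked entries and the total of 35 from the table.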
Depth-first search (DFS) is an algorithm that explores a tree or graph data structure. The search begins at the root node of the tree, which usually represents the starting state of a problem. Its strategy is to constantly seek to branch "deeper" from the current node. If the current node has no unexplored edges, the algorithm "backtracks" to the current node's predecessor and attempts to branch again in the depth-first manner. This process of backtracking and branching continues until all nodes reachable from the root node have been explored.
It is important to note that DFS will visit every node reachable from the root; it therefore always finds the optimal solution, but only by exhaustively enumerating the search space.
The root node of the tree is the empty set: no columns selected. The children of a node are obtained by appending to the set a single new column lying to the right of all the columns already in the set. The leaf nodes are sets of size k, each corresponding to a complete selection of blanks.
(1) Initialize cost matrix, C
(2) Initialize number of blanks to select, k
(3) Set best to infinity
(4) Set S to the empty set (the selected columns)
(5) Call DFS(S, 1)
(6) Return best
(7) function DFS(S, next)
(8)   if |S| = k then
(9)     cost = sum over rows i of (min over j in S of C[i][j])
(10)    if cost < best then
(11)      best = cost
(12)    return
(13)  for j = next to (number of columns of C) do
(14)    DFS(S with j appended, j + 1)
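The depth-first enumeration above can be sketched as a minimal, self-contained program (our own sketch and naming, not the authors' implementation), applied here to the 5 x 7 example matrix from earlier:

```python
from itertools import combinations  # used only in the brute-force check below

def row_min_cost(C, cols):
    """Objective value: each order (row) uses its cheapest selected blank."""
    return sum(min(row[j] for j in cols) for row in C)

def dfs_select(C, k):
    """Exhaustively enumerate all k-subsets of columns by depth-first search."""
    n_cols = len(C[0])
    best = {"cost": float("inf"), "cols": None}

    def dfs(selected, nxt):
        if len(selected) == k:
            cost = row_min_cost(C, selected)
            if cost < best["cost"]:
                best["cost"], best["cols"] = cost, tuple(selected)
            return
        # Only branch to columns right of the last chosen one, so each
        # subset of columns is generated exactly once.
        for j in range(nxt, n_cols):
            selected.append(j)
            dfs(selected, j + 1)
            selected.pop()

    dfs([], 0)
    return best["cost"], best["cols"]

# The 5x7 example matrix from the text (orders as rows, blanks as columns).
C = [
    [5, 10,  3, 14, 12,  6, 13],
    [4,  3, 12, 11,  8,  5, 10],
    [2,  9,  7, 11, 14,  6,  2],
    [1, 12,  5, 20, 11,  3,  7],
    [7, 11, 10,  2,  6,  9,  3],
]
cost, cols = dfs_select(C, 3)
```

On this matrix the exhaustive search returns a cost of 12 using columns 1, 3, and 4 (in the table's 1-indexed numbering), so the illustrative selection of columns 2, 4, and 5 with cost 35 is a candidate solution rather than the optimum.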
As previously mentioned, depth-first search needs to visit every point in the search space in order to find the optimal solution. This quickly becomes prohibitive as the search space grows. The branch-and-bound method is a modified depth-first search that exploits known bounds on the objective function value to prune the search tree down to a more manageable size. The method dates back to the work of Land and Doig.
The basic idea behind the branch-and-bound method is to keep track of an upper bound on the optimal objective value, together with a lower bound computed at each node of the tree; whenever a node's lower bound meets or exceeds the upper bound, the subtree rooted at that node cannot contain a better solution and is pruned.
We describe below how we constructed our upper and lower bounds for the branch-and-bound method, as well as some useful preprocessing.
Any feasible point provides an upper bound on the minimization problem. To begin with, we made use of two heuristics to generate "good" feasible points that could be used as initial upper bounds for the branch-and-bound algorithm. The first heuristic begins with all the columns included in the selection and successively discards columns until only the allowed number remains.
As the algorithm progresses and finds new solutions, it keeps a running record of the best solution found so far. This best value serves as the upper bound when deciding whether to prune a node.
Before proceeding we point out a notational convenience used when discussing the lower bounds. At a specific node in the DFS tree we have already selected a set S of columns, while a set R of columns remains available for selection; we write cost(S) for the objective value obtained using only the columns in S.
Let us assume that the algorithm is at a node in the tree with selected set S and remaining set R. Adding a column can never increase any row minimum, so the cost of every descendant of this node is at least cost(S together with R). This is the first lower bound: the cost at any node is bounded below by the cost of including every remaining column.
Next, we ask by how much the remaining columns could possibly reduce the current cost. For a single remaining column j, the reduction it can achieve is at most the sum over rows i of max(0, m_i - c_ij), where m_i is the current minimum of row i over the selected columns. This generalizes: the combined reduction achieved by any set of additional columns is at most the sum of their individual maximum reductions. Now, let r be the number of columns that may still be added; the sum of the r largest individual reductions over the remaining columns is an upper bound on the maximum reduction possible. This yields the lower bound on the objective function: cost(S) minus this sum.
This is the second lower bound. It can only be applied when there are no infinite values among the current row minima, that is, when cost(S) is finite.
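The two bounds can be sketched as follows (our reading of the description above; the names `lower_bound_1` and `lower_bound_2` are ours, and the example values are computed on the 5 x 7 matrix from earlier):

```python
INF = float("inf")

def cost(C, cols):
    """Objective: every order uses its cheapest blank among `cols`."""
    return sum(min(row[j] for j in cols) for row in C)

def lower_bound_1(C, selected, remaining):
    # Adding columns can only decrease the objective, so the cost with ALL
    # remaining columns included bounds every descendant node from below.
    return cost(C, list(selected) + list(remaining))

def lower_bound_2(C, selected, remaining, k):
    # Current row minima over the selected columns; the bound needs them finite.
    current = [min(row[j] for j in selected) for row in C]
    if any(v == INF for v in current):
        return -INF  # bound not applicable at this node

    def reduction(j):
        # Most that column j alone could shave off the current row minima.
        return sum(max(0, current[i] - C[i][j]) for i in range(len(C)))

    # At most k - |selected| more columns may be added; their combined
    # reduction is at most the sum of the largest individual reductions.
    budget = k - len(selected)
    best = sorted((reduction(j) for j in remaining), reverse=True)[:budget]
    return sum(current) - sum(best)

C = [
    [5, 10,  3, 14, 12,  6, 13],
    [4,  3, 12, 11,  8,  5, 10],
    [2,  9,  7, 11, 14,  6,  2],
    [1, 12,  5, 20, 11,  3,  7],
    [7, 11, 10,  2,  6,  9,  3],
]
lb1 = lower_bound_1(C, [0], range(1, 7))     # all columns included: 11
lb2 = lower_bound_2(C, [0], range(1, 7), 3)  # 19 - (5 + 4) = 10
```

Both values are valid lower bounds at the node where only column 1 has been selected, since the best completion from that node costs 12.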
Because of the way we set up the branch-and-bound algorithm, a column towards the right of the cost matrix will never appear in the tree as an ancestor of a column towards its left. This means that if we wish to prune a large number of nodes, we should ensure that the most important columns are towards the left of the matrix and the less important columns towards the right. We presorted the columns of the cost matrix as follows.
Rank the entries in each row of the matrix from lowest cost to highest cost. Place a 1 against the lowest entry in each row, a 2 against the second lowest, and so on.
Find the column with the most first places (break ties with 2s, then 3s, etc.) and move it to the front of the matrix.
Repeat the previous step on the remaining columns until all columns have been ordered.
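The presorting steps above can be sketched as follows (our implementation; tie handling, where two entries in a row are equal, is our assumption: equal entries share the same rank):

```python
def presort_columns(C):
    """Order columns so those with the most low ranks come first."""
    n_rows, n_cols = len(C), len(C[0])
    # Rank of entry j in row i: 1 + number of strictly smaller entries.
    ranks = [[1 + sum(1 for x in row if x < row[j]) for j in range(n_cols)]
             for row in C]

    def rank_counts(j):
        # counts[r-1] = how many rows give column j rank r.
        counts = [0] * (n_cols + 1)
        for i in range(n_rows):
            counts[ranks[i][j]] += 1
        return counts[1:]

    # Sorting by the count vector puts the column with the most first
    # places first; ties are broken by second places, then third, etc.
    return sorted(range(n_cols), key=rank_counts, reverse=True)

C = [
    [5, 10,  3, 14, 12,  6, 13],
    [4,  3, 12, 11,  8,  5, 10],
    [2,  9,  7, 11, 14,  6,  2],
    [1, 12,  5, 20, 11,  3,  7],
    [7, 11, 10,  2,  6,  9,  3],
]
order = presort_columns(C)
```

On the example matrix, column 1 (1-indexed) has two first places (rows 3 and 4) and is moved to the front.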
The cost matrix for our data; dark entries are high cost and light entries are low cost.
Nonlinear optimization by mesh adaptive direct search (NOMAD) is a software package applicable to a wide range of optimization problems. NOMAD provides a C++ implementation of the mesh adaptive direct search (MADS) algorithm of Audet and Dennis.
Each MADS iteration comprises three steps: the search, the poll, and finally the update. The search step may generate trial points anywhere on the mesh, while the poll step is more strictly defined because the convergence analysis relies on it. Under suitable assumptions, the algorithm converges to a locally optimal point.
Loosely speaking, the technique repeatedly selects a starting point and finds a nearby local minimum; given enough iterations, a good approximation of the global minimum can be selected from the resulting set of local minima. For the problem at hand, a glass manufacturer may be dissatisfied with the time taken to find an exact solution using the DFS and branch-and-bound algorithms. NOMAD offers speed as an alternative, albeit at the risk of an inexact solution. The results highlight this trade-off.
All algorithms and methods were implemented and run on an Intel Core i7-3930K (6 cores, 12 threads) at 3.9 GHz with 64 GB of 1600 MHz RAM.
As was mentioned previously, the number of blanks that can be used to satisfy the orders is established a priori. It would be appropriate to investigate how this decision might affect the size of the solution space and hence the performance of the algorithms.
Being a combinatorial optimization problem, each solution is made up of a number of different choices. Increasing either the number of blanks available to choose from or the number we allow to be selected increases the number of possible combinations, and hence the problem size.
Using our blank generation procedure and a hypothetical data set of orders, imagine that a total of 151 blanks are produced. Referring to the table below, the number of possible solutions grows combinatorially with the number of blanks we may select.
Problem size as a function of blanks.

Blanks to choose (k)   Number of choices (151 choose k)
2                      11,325
3                      562,475
4                      20,811,575
5                      611,860,305
6                      14,888,600,755
7                      308,406,729,925
8                      5,551,321,138,650
A log scale of solution size against choices of blanks.
As we can see from the above analysis, the number of blanks that can be chosen dramatically increases the number of possible solutions. Obviously, a higher number of possible blanks to start with will also increase the size of the search space. Recall that the entity that determines how many blanks are generated is the order list, since it is from these orders that possible blanks are derived. We now attempt to find a theoretical upper bound in an effort to get some idea of the order list's influence on the problem size.
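The counts in the table above are just binomial coefficients and can be reproduced directly (for example with Python's `math.comb`):

```python
import math

# Number of ways to choose k blanks from the 151 generated candidates.
sizes = {k: math.comb(151, k) for k in range(2, 9)}
```

Already at k = 8 there are over five trillion possible selections, which is why exhaustive enumeration becomes impractical.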
First, let us assume that there are n orders in the list. There are a minimum width, a maximum width, a minimum height, and a maximum height that any blank may take, dictated by the machinery. Considering the width component alone, the number of admissible blank widths generated from a single order is bounded by a constant that depends only on these machine limits, and the same holds for the heights. To obtain an upper bound on the number of blanks generated from an order list we sum, over all orders, the product of the number of admissible widths and the number of admissible heights for that order. Since each factor is bounded by a constant independent of n, the total number of generated blanks is bounded by a constant multiple of n; that is, the number of candidate blanks grows at most linearly in the number of orders.
The table below reports the execution times of the three algorithms on our data set.
Algorithm execution times (seconds).

Selectable blanks   DFS       Branch-and-bound   NOMAD
2                   0.1448    0.3362             0.5622
3                   6.9312    3.4673             0.9184
4                   272.97    64.0814            9.7329
5                   8343.2    978.5097           1.7527
6                   —         12813.420          2.8240
7                   —         —                  5.7266
8                   —         —                  1.9721
9                   —         —                  5.3426
10                  —         —                  8.2825
Scatter plot of execution times for algorithms.
We note that branch and bound, an improvement on the brute-force enumeration of DFS, keeps a relatively low computation time up to about 4 selectable blanks. The figure below shows the region of interest at higher resolution.
Scatter plot of execution times for algorithms higher resolution over smaller range (2–4 blanks).
NOMAD is consistently fast, with less than 10 seconds of computation time in all cases; however, a heuristic often sacrifices solution quality for speed, as we will see later.
The execution time results indicate that the branch-and-bound algorithm is significantly better than the DFS algorithm for this problem, even though it does more work at each node. This indicates that a significant portion of the search space is being pruned. The figure below shows the fraction of the search space actually visited.
Fraction of search space visited by branch and bound, as a function of the number of selectable blanks.
In the code for branch-and-bound we first implemented lower bound 1 and then, for nodes that did not get pruned, we tried lower bound 2. It was not at all clear that this second step would successfully prune any nodes after the first step failed. The figure below shows the fraction of nodes pruned by each bound.
Fraction of nodes pruned due to lower bound 1 (left) and lower bound 2 (right).
The table below compares the wastage of the NOMAD solutions with the optimal (minimum) wastage, where the latter is known.
NOMAD solution quality.

Selectable blanks   Minimum wastage   NOMAD        NOMAD optimality error
2                   114693.440        165075.128   43.927%
3                   70775.004         70811.179    0.051%
4                   52261.038         54371.510    4.038%
5                   36800.455         60078.314    63.254%
6                   29048             36451.511    25.486%
7                   —                 29177.763    —
8                   —                 19118.633    —
9                   —                 15719.224    —
10                  —                 10721.666    —
Percentage error from optimal values for NOMAD.
A trend that we notice in the results of all algorithms is decreasing wastage with an increasing number of selectable blanks. This is not surprising: allowing more blanks to be selected enhances our capability to cater for all the orders, reducing wastage. This can be seen in the figure below.
Glass wastage versus number of blanks used.
This trend of decreasing wastage is not so extraordinary; practitioners in the glass manufacturing industry are aware of the prospect of further reducing wastage in this manner. It is rather a question of simplicity versus optimality, since operating the machinery at the factory becomes more complicated as one increases the number of blanks involved in satisfying the orders.
The selection of blanks to satisfy orders holds great significance in the glass manufacturing industry. Random and heuristic methods for guessing the best blank sizes tend to result in relatively high wastage. Minimising this loss translates to a meaningful benefit for a glass manufacturing enterprise. Furthermore, it is possible that these ideas will find application in the metal, paper, and wood industries (amongst others).
Making the transition to a discrete combinatorial problem proved worthwhile. Presented with any order list, we are able to generate a set of feasible blanks. This provided the flexibility to apply tried and tested combinatorial optimization algorithms, such as branch and bound, to optimally select these blanks. Identifying the similarity with the p-median problem further clarified the structure of the problem and its constraints.
It was shown that the number of blanks in the search space is bounded linearly by the number of orders, so the blank generation step scales gracefully with the size of the order list.
The branch-and-bound implementation performed well on larger problem sizes whilst still providing an optimal solution. The upper and lower bound estimates and the presorting of columns were effective at trimming the solution space, so that for the larger instances only a small fraction of the search space had to be visited.
The authors declare that there is no conflict of interests regarding the publication of this paper.
The authors wish to thank Riaan Von Wielligh from PFG Building Glass for his assistance in the problem description as well as the members of the 2013 MISG South Africa study group.