In recent years Grammatical Evolution (GE) has been used as a representation of Genetic Programming (GP) and applied to many optimization problems, such as symbolic regression, classification, Boolean functions, constructed problems, and algorithmic problems. GE can use a diversity of search strategies, including Swarm Intelligence (SI). Particle Swarm Optimisation (PSO) is an SI algorithm with two main problems: premature convergence and poor diversity. Particle Evolutionary Swarm Optimization (PESO) is a recent SI algorithm that uses two perturbations to avoid these problems. In this paper we propose using PESO and PSO within the GE framework as search strategies to generate heuristics that solve the Bin Packing Problem (BPP); the methodology can also be applied to other kinds of problems by designing a suitable Grammar for each problem. A comparison between PESO, PSO, and the BPP heuristics is performed through the nonparametric Friedman test. The main contributions of this paper are a Grammar that generates online and offline heuristics, depending on the test instance, aiming to improve on the heuristics generated by other grammars and by humans, and a way to implement different algorithms, such as PESO, as search strategies in GE to obtain better results than those obtained with PSO.
1. Introduction
Developing a methodology to solve a specific problem entails studying the problem and analyzing instances of it. For many problems [1] there is no methodology that can provide the exact solution, because the size of the problem search space makes them intractable in time; this makes it necessary to search for, and improve, methodologies that can give a solution in finite time. There are methodologies based on Artificial Intelligence which do not yield exact solutions but provide an approximation; among them we can find the following.
Heuristics are defined as “a type of strategy that dramatically limits the search for solutions” [2, 3]. One important characteristic of heuristics is that they can obtain a result for a problem instance in polynomial time [1], although heuristics are developed for a specific problem instance.
Metaheuristics are defined as “a master strategy that guides and modifies other heuristics to obtain solutions generally better than the ones obtained with a local search optimization” [4]. Metaheuristics can work over several instances of a given problem, or over various problems, but it is necessary to adapt them to each problem.
It has been shown that the metaheuristic Genetic Programming [5] can generate a heuristic that can be applied to a problem instance [6]. There also exist metaheuristics based on Genetic Programming’s paradigm [7], such as Grammatical Differential Evolution [8], Grammatical Swarm [9], Particle Swarm Programming [10], and Geometric Differential Evolution [11].
The Bin Packing Problem (BPP) has been widely studied because of its many industrial applications, such as wood and glass cutting, packing in transportation and warehousing [12], and job scheduling on uniform processors [13, 14]. This is an NP-Hard problem [1], and due to its complexity many heuristics have been developed attempting to give an approximation [15–19]. Some metaheuristics have also been applied to try to obtain better results than those obtained by heuristics [20–22]. Some exact algorithms have been developed [23–25]; however, given the nature of the problem, the running time reported for these algorithms grows with the instance and, depending on the instance, may grow exponentially.
The contribution of this paper is a generic methodology to generate heuristics using GE with interchangeable search strategies. It is shown that this methodology can generate BPP heuristics using PESO and PSO as search strategies; it is also shown that the heuristics generated with the proposed Grammar perform better than the classical BPP heuristics, which were designed by experts in Operational Research. Those results were obtained by comparing the results of the GE-generated heuristics against the BPP heuristics by means of the Friedman nonparametric test [26].
GE is described in Section 2, including PSO and PESO. Section 3 describes the Bin Packing Problem, the state-of-the-art heuristics, the instances used, and the fitness function. We describe the experiments performed in Section 4 and their results in Section 5. Finally, general conclusions about the present work are presented in Section 6, including future perspectives of this work.
2. Grammatical Evolution
Grammatical Evolution (GE) [7] is a grammar-based form of Genetic Programming (GP) [27]. GE joins the principles of molecular biology, which are used by GP, and the power of formal grammars. Unlike GP, GE adopts a population of linear genotypic integer strings, or binary strings, which are transformed into functional phenotypes through a genotype-to-phenotype mapping process [28]; this process is also known as Indirect Representation [29]. The genotype strings evolve with no knowledge of their phenotypic equivalent, using only the fitness measure.
The transformation is governed by a Backus-Naur Form (BNF) grammar, which is made up of the tuple (N, T, P, S), where N is the set of all nonterminal symbols, T is the set of terminals, P is the set of production rules that map N → T, and S is the start symbol, with S ∈ N. When several production rules can be applied to a nonterminal, the options are separated by the “∣” (or) symbol.
Even though GE commonly uses the Genetic Algorithm (GA) [7, 28, 30] as its search strategy, it is possible to use another search strategy such as Particle Swarm Optimization, a combination called Grammatical Swarm (GS) [8].
In GE each individual is mapped into a program using the BNF, using (1) proposed in [28] to choose the next production based on the nonterminal symbol. An example of the mapping process employed by GE is shown in Figure 1. Consider
Rule = c mod r, (1)
where c is the codon value and r is the number of production rules available for the current nonterminal.
An example of a transformation from genotype to phenotype using a BNF Grammar. The process begins with the start symbol; if there is only one production rule for this symbol, that production rule replaces the start symbol, and the process continues choosing production rules based on the current genotype. At each step the leftmost nonterminal symbol is rewritten, using the current codon and (1) to select the production, until all the codons have been mapped or no nonterminals remain in the phenotype.
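The mapping process described above can be sketched in a few lines of Python. The grammar below is a hypothetical miniature used only for illustration (it is not one of the paper's grammars), and the wrapping of the codon string when it is exhausted is a common GE convention:

```python
# Miniature BNF grammar (hypothetical, for illustration only).
GRAMMAR = {
    "<expr>": ["(<expr><op><expr>)", "<var>"],
    "<var>": ["F", "C", "S"],
    "<op>": ["+", "-", "*", "/"],
}

def map_genotype(codons, start="<expr>", max_wraps=2):
    """Genotype-to-phenotype mapping: rewrite the leftmost nonterminal
    choosing the production with Rule = c mod r, as in equation (1)."""
    phenotype = start
    i, wraps = 0, 0
    while "<" in phenotype:
        if i == len(codons):            # wrap the codon string if exhausted
            i, wraps = 0, wraps + 1
            if wraps > max_wraps:
                return None             # invalid individual
        nt_start = phenotype.index("<")               # leftmost nonterminal
        nt_end = phenotype.index(">", nt_start) + 1
        rules = GRAMMAR[phenotype[nt_start:nt_end]]
        choice = rules[codons[i] % len(rules)]        # Rule = c mod r
        phenotype = phenotype[:nt_start] + choice + phenotype[nt_end:]
        i += 1
    return phenotype

print(map_genotype([2, 1, 0, 1, 1, 1, 2, 3]))  # -> (F-C)
```

Each codon consumes one decision; an individual that keeps producing nonterminals after a few wraps is marked invalid, as is usual in GE implementations.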
The GE can use different search strategies; our proposed model is shown in Figure 2. This model includes the problem instance and the search strategy as an input. In [28] the search strategy is part of the process; however it can be seen as an additional element that can be chosen to work with GE. The GE will generate a solution through the search strategy selected and it will be evaluated in the objective function using the problem instance.
GE’s methodology used in the present work: the presented methodology can be used with different search strategies.
2.1. Particle Swarm Optimization
Particle Swarm Optimization (PSO) [31–35] is a metaheuristic bioinspired by flocks of birds and schools of fish. It was developed by Kennedy and Eberhart based on a concept called social metaphor. This metaheuristic simulates a society where all individuals contribute their knowledge to obtain a better solution. Three factors influence the change of status or behavior of an individual.
The knowledge of the environment or adaptation: the importance given to the individual’s own exploration of the environment.

Its experience or local memory: the importance given to the best result found by the individual.

The experience of its neighbors or global memory: the importance given to the best result obtained by its neighbors or by other individuals.
In this metaheuristic each individual is considered a particle that moves through a multidimensional space representing the social space; the dimension of the search space depends on the variables used to represent the problem.
For the update of each particle we use the velocity vector which tells how fast the particle will move in each of the dimensions; the method for updating the speed of PSO is given by (2), and its position is updated by (3). Algorithm 1 shows the complete PSO algorithm:
(2) vi = w vi + ϕ1 (Bglobal − xi) + ϕ2 (Blocal − xi),
(3) xi = xi + vi,
where

vi is the velocity of the ith particle,

w is the adjustment factor to the environment,

ϕ1 is the neighborhood (global) memory coefficient,

ϕ2 is the individual (local) memory coefficient,

xi is the position of the ith particle,

Bglobal is the best position found so far by all particles,

Blocal is the best position found by the ith particle.
Algorithm 1: PSO Algorithm.
Require: w adaptation to environment coefficient, ϕ1 neighborhood memory coefficient, ϕ2 memory coefficient, n swarm size.
(1) Initialize the swarm particles.
(2) Initialize the velocity vector for each particle in the swarm.
(3) while stopping criterion not met do
(4)   for i = 1 to n do
(5)     If the ith particle’s fitness is better than the local best, replace the local best with the ith particle.
(6)     If the ith particle’s fitness is better than the global best, replace the global best with the ith particle.
(7)     Update the velocity vector by (2).
(8)     Update the particle’s position with the velocity vector by (3).
(9)   end for
(10) end while
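Algorithm 1 and equations (2)-(3) can be sketched as a minimal continuous-domain PSO in Python. This is an illustrative minimization sketch, not the paper's implementation: the random weighting of the two memory terms is a standard PSO ingredient, and in GE the particle dimensions would encode codons rather than a plain real vector:

```python
import random

def pso(fitness, dim, n=20, iters=100, w=0.7, phi1=1.5, phi2=1.5,
        lo=-5.0, hi=5.0, seed=0):
    """Minimal PSO sketch following Algorithm 1; minimizes `fitness`
    over the box [lo, hi]^dim. Parameter names follow the text."""
    rnd = random.Random(seed)
    x = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    best_local = [p[:] for p in x]
    f_local = [fitness(p) for p in x]
    g = min(range(n), key=lambda i: f_local[i])
    best_global, f_global = best_local[g][:], f_local[g]
    for _ in range(iters):
        for i in range(n):
            fi = fitness(x[i])
            if fi < f_local[i]:                      # local memory (step 5)
                f_local[i], best_local[i] = fi, x[i][:]
            if fi < f_global:                        # global memory (step 6)
                f_global, best_global = fi, x[i][:]
            for d in range(dim):                     # equations (2) and (3)
                v[i][d] = (w * v[i][d]
                           + phi1 * rnd.random() * (best_global[d] - x[i][d])
                           + phi2 * rnd.random() * (best_local[i][d] - x[i][d]))
                x[i][d] += v[i][d]
    return best_global, f_global

# Example: minimize the sphere function in three dimensions.
best, value = pso(lambda p: sum(z * z for z in p), dim=3)
```

With w near 0.7 and memory coefficients near 1.5 the update is in the usual stable (constriction-like) regime; larger values tend to produce the divergence and premature-convergence behavior discussed below.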
2.2. Particle Evolutionary Swarm Optimization
Particle Evolutionary Swarm Optimization (PESO) [36–38] is based on PSO but introduces two perturbations in order to avoid two problems observed in PSO [39]:
premature convergence,
poor diversity.
Algorithm 2 shows the PESO Algorithm, which adds the two perturbations given in Algorithms 3 and 4. The C-Perturbation has the advantage of keeping the self-organization potential of the flock, since no separate probability distribution needs to be computed, while the M-Perturbation helps maintain diversity in the population.
Algorithm 2: PESO Algorithm.
Require: w adaptation to environment coefficient, ϕ1 neighborhood memory coefficient, ϕ2 memory coefficient, n swarm size.
(1) Initialize the swarm particles.
(2) Initialize the velocity vector for each particle in the swarm.
(3) while stopping criterion not met do
(4)   for i = 1 to n do
(5)     If the ith particle’s fitness is better than the local best, replace the local best with the ith particle.
(6)     If the ith particle’s fitness is better than the global best, replace the global best with the ith particle.
(7)     Update the velocity vector by (2).
(8)     Update the particle’s position with the velocity vector by (3).
(9)     Apply the C-Perturbation (Algorithm 3).
(10)    Apply the M-Perturbation (Algorithm 4).
(11)  end for
(12) end while
Algorithm 3: C-Perturbation.
(1) for all particles do
(2)   Generate r uniformly between 0 and 1.
(3)   Generate p1, p2, and p3 as random numbers between 1 and the number of particles.
(4)   Generate the ith new particle applying the following equation to each particle dimension: newi = p1 + r(p2 − p3).
(5) end for
(6) for all particles do
(7)   If the ith new particle is better than the ith particle, replace the ith particle with the ith new particle.
(8) end for
Algorithm 4: M-Perturbation.
(1) for all particles do
(2)   for all dimensions do
(3)     Generate r uniformly between 0 and 1.
(4)     if r ≤ 1/dimension then
(5)       newid = random(LowerBound, UpperBound)
(6)     else
(7)       newid = Particleid
(8)     end if
(9)   end for
(10) end for
(11) for all particles do
(12)  If the ith new particle is better than the ith particle, replace the ith particle with the ith new particle.
(13) end for
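The two perturbations can be sketched as follows. This is an illustrative sketch under the assumption that particles are plain lists of floats and that a smaller fitness value is better; the replace-if-better selection mirrors the final loops of Algorithms 3 and 4:

```python
import random

def c_perturbation(swarm, fitness, rnd):
    """C-Perturbation (Algorithm 3): a differential-style move
    new_i = p1 + r*(p2 - p3) built from three randomly chosen particles,
    applied per dimension; keeps the winner of (new, old)."""
    new = []
    for _ in swarm:
        r = rnd.random()
        p1, p2, p3 = (rnd.choice(swarm) for _ in range(3))
        new.append([a + r * (b - c) for a, b, c in zip(p1, p2, p3)])
    return [n if fitness(n) < fitness(o) else o for n, o in zip(new, swarm)]

def m_perturbation(swarm, fitness, lower, upper, rnd):
    """M-Perturbation (Algorithm 4): each coordinate is reset uniformly
    in the bounds with probability 1/dimension, keeping diversity."""
    dim = len(swarm[0])
    new = [[rnd.uniform(lower, upper) if rnd.random() <= 1.0 / dim else xd
            for xd in particle] for particle in swarm]
    return [n if fitness(n) < fitness(o) else o for n, o in zip(new, swarm)]

# Example on a small random swarm with the sphere fitness.
rnd = random.Random(1)
swarm = [[rnd.uniform(-5, 5) for _ in range(4)] for _ in range(10)]
sphere = lambda p: sum(z * z for z in p)
swarm = c_perturbation(swarm, sphere, rnd)
swarm = m_perturbation(swarm, sphere, -5, 5, rnd)
```

Because both perturbations only ever keep a new particle when it improves the old one, they cannot worsen the swarm; the C-Perturbation recombines information already present in the flock, while the M-Perturbation injects fresh coordinates.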
3. Bin Packing Problem
The Bin Packing Problem (BPP) [40] can be described as follows: given n items that need to be packed into the lowest possible number of bins, each item has a weight wj, where j indexes the item, and the maximum capacity c of the bins is also given. The objective is to minimize the number of bins used to pack all the items, subject to each item being assigned to exactly one bin and the sum of the weights of the items in a bin not exceeding the bin capacity.
This problem has been widely studied, including the following:
proposing new theorems [41, 42],
developing new heuristic algorithms based on Operational Research concepts [18, 43],
characterizing the problem instances [44–46],
implementing metaheuristics [20, 47–49].
This problem has been shown to be an NP-Hard optimization problem [1]. A mathematical definition of the BPP is as follows:
Minimize

z = ∑_{i=1}^{n} y_i, (4)

subject to the following constraints and conditions:

∑_{j=1}^{n} w_j x_{ij} ≤ c y_i,  i ∈ N = {1, …, n},
∑_{i=1}^{n} x_{ij} = 1,  j ∈ N,
y_i ∈ {0, 1},  i ∈ N,
x_{ij} ∈ {0, 1},  i ∈ N, j ∈ N, (5)
where

wj is the weight of item j,

yi is a binary variable indicating whether bin i contains items,

xij indicates whether item j is placed in bin i,

n is the number of available bins,

c is the capacity of each bin.
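The objective (4) and constraints (5) can be checked directly on a candidate packing. The sketch below uses an illustrative encoding in which assignment[j] gives the bin index of item j (equivalent to the 0/1 variables x_ij, with each item in exactly one bin by construction):

```python
def packing_cost(assignment, weights, capacity):
    """Returns the objective (4), the number of bins used, after
    verifying the capacity constraint from (5); `assignment[j]` is the
    bin index of item j (illustrative encoding, not from the paper)."""
    bins = {}
    for j, b in enumerate(assignment):   # each item goes to exactly one bin
        bins[b] = bins.get(b, 0) + weights[j]
    if any(load > capacity for load in bins.values()):
        raise ValueError("capacity constraint violated")
    return len(bins)                     # z = sum_i y_i

# Example: five items packed into bins of capacity 10.
print(packing_cost([0, 0, 1, 1, 2], [4, 6, 7, 3, 5], 10))  # -> 3
```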
The algorithms for the BPP instances can be classified as online or offline [46]. An algorithm is considered online if the items are not known before the packing process starts and offline if all the items are known in advance. In this research we worked with both kinds of algorithms.
3.1. Test Instances
Beasley [50] proposed a collection of test data sets, known as the OR-Library and maintained by Beasley, which were studied by Falkenauer [21]. This collection contains test data sets for a variety of Operational Research problems, including the BPP in several dimensions. For the one-dimensional BPP case the collection contains eight data sets that can be classified in two classes.
Uniform. The data sets binpack1 to binpack4 consist of items of sizes uniformly distributed in (20, 100) to be packed into bins of size 150. The number of bins in the currently known solution was found by [21].
Triplets. The data sets binpack5 to binpack8 consist of items of sizes in (24, 50) to be packed into bins of size 100. The number of bins can be obtained by dividing the number of items in the data set by three.
Scholl et al. [23] proposed another collection of data sets, of which only 1184 problems were initially solved optimally. Alvim et al. [51] reported the optimal solutions for the remaining 26 problems. The collection contains three data sets.
Set 1. It has 720 instances with items drawn from a uniform distribution on three intervals [1,100], [20,100], and [30,100]. The bin capacity is C=100, 120, and 150 and n=50, 100, 200, and 500.
Set 2. It has 480 instances with C=1000 and n=50, 100, 200, and 500. Each bin has an average of 3–9 items.
Set 3. It has 10 instances with C=100,000, n=200, and items are drawn from a uniform distribution on [20000,35000]. Set 3 is considered the most difficult of the three sets.
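For experimentation outside these benchmark files, an instance in the style of the Uniform class can be synthesized directly from its description (a sketch, not an actual OR-Library file; the simple lower bound shown is the ceiling of total weight over capacity):

```python
import math
import random

def uniform_instance(n_items, rnd):
    """Sketch of a Uniform-class instance (binpack1-binpack4 style):
    item sizes uniformly distributed in (20, 100), bin capacity 150."""
    return [rnd.randint(20, 100) for _ in range(n_items)], 150

items, capacity = uniform_instance(120, random.Random(7))
lower_bound = math.ceil(sum(items) / capacity)  # simple lower bound on bins
```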
3.2. Classic Heuristics
Heuristics have been used to solve the BPP, obtaining good results. Reference [18] lists the following classical heuristics; they can be used as online heuristics, if the items must be packed as they arrive, or as offline heuristics, if the items can be sorted before the packing process starts.
Best Fit [17] puts the piece in the fullest bin that has room for it and opens a new bin if the piece does not fit in any existing bin.
Worst Fit [18] puts the piece in the emptiest bin that has room for it and opens a new bin if the piece does not fit in any existing bin.
Almost Worst Fit [18] puts the piece in the second emptiest bin if that bin has room for it and opens a new bin if the piece does not fit in any open bin.
Next Fit [15] puts the piece in the right-most bin and opens a new bin if there is not enough room for it.
First Fit [15] puts the piece in the left-most bin that has room for it and opens a new bin if it does not fit in any open bin.
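Two of these classic heuristics can be sketched compactly; the offline variants simply sort the pieces (typically in descending order) before calling the same routine. The item list and capacity below are illustrative:

```python
def first_fit(items, capacity):
    """First Fit: put each piece in the left-most open bin with room,
    else open a new bin. Returns the list of bin loads."""
    bins = []
    for w in items:
        for i, load in enumerate(bins):
            if load + w <= capacity:
                bins[i] += w
                break
        else:                       # no open bin had room
            bins.append(w)
    return bins

def best_fit(items, capacity):
    """Best Fit: put each piece in the fullest bin that still has room,
    else open a new bin."""
    bins = []
    for w in items:
        fits = [i for i, load in enumerate(bins) if load + w <= capacity]
        if fits:
            i = max(fits, key=lambda i: bins[i])   # fullest feasible bin
            bins[i] += w
        else:
            bins.append(w)
    return bins

items = [4, 8, 1, 4, 2, 1]
print(len(first_fit(items, 10)), len(best_fit(items, 10)))  # -> 2 2
# Offline variant: best_fit(sorted(items, reverse=True), 10)
```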
Even though there are heuristics with better performance than those shown in the present section [16, 19, 42, 52, 53], such heuristics have resulted from research on lower and upper bounds for the minimal number of bins.

3.3. Fitness Measure
There are many fitness measures used to discern the results obtained by heuristic and metaheuristic algorithms. In [54] two fitness measures are shown: the first (see (6)) measures the difference between the number of bins used and the theoretical lower bound on the bins needed; the second (see (7)) was proposed in [47] and rewards full or almost full bins; the objective is to fill each bin, minimizing the free space:
Fitness = B − (∑_{i=1}^{n} w_i) / C, (6)

Fitness = 1 − (∑_{i=1}^{n} ((∑_{j=1}^{m} w_j x_{ij}) / C)^2) / n, (7)
where B is the number of bins used, n is the number of bins, m is the number of items, w is the item weight, x_{ij} indicates whether item j is placed in bin i, and C is the bin capacity.
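Measure (7) is the one used in the experiments below and is easy to compute from the bin loads. In the sketch, bin_loads[i] stands for the inner sum of item weights in bin i:

```python
def fitness_fullness(bin_loads, capacity):
    """Fitness measure (7): rewards full bins; `bin_loads[i]` is the
    sum of the weights already placed in bin i (the inner sum in (7))."""
    n = len(bin_loads)
    return 1.0 - sum((load / capacity) ** 2 for load in bin_loads) / n

# Perfectly full bins give fitness 0; emptier bins push it toward 1.
print(fitness_fullness([10, 10], 10))  # -> 0.0
print(fitness_fullness([5, 5], 10))    # -> 0.75
```

Because each bin contributes the square of its fullness ratio, two half-full bins score worse than one full and one empty-leaning bin, which is exactly the pressure toward filled bins that the measure is meant to encode.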
Algorithm 5 shows the proposed approach; this approach allows the use of different fitness functions and search strategies to generate heuristics automatically.
Algorithm 5: Proposed approach.
Require: SS search strategy, FF fitness function, BNF-G grammar, IS instances set.
(1) for all instance sets in IS do
(2)   Select randomly an instance from the instance set.
(3)   Create an initial population.
(4)   while stopping criterion not met do
(5)     Apply the mapping process using the Grammar BNF-G, as seen in Figure 1, to obtain a heuristic for each element of the population.
(6)     Calculate the fitness value, using FF, for each element of the population by applying the generated heuristic to the selected instance.
(7)     Apply the search strategy SS to optimize the elements of the population.
(8)   end while
(9)   Apply the heuristic found to all instances in the instance set.
(10) end for
(11) return a heuristic for each instance set.
To improve the Bin Packing heuristics it was necessary to design a grammar that represents the Bin Packing Problem. Grammar 1, shown in [55], is based on heuristic elements taken from [6]; however, the results obtained in [1] show that 10% of the generated solutions cannot be applied to the instance, and for this reason this approach was not included in the comparison against the results obtained here.
That grammar was improved as Grammar 2 [56], obtaining results similar to those of the BestFit heuristic. However, Grammar 2 cannot be applied to offline Bin Packing Problems because it does not sort the pieces. Grammar 3 is proposed to improve on the results obtained by Grammar 2, since it can generate both online and offline heuristics:
⟨inicio⟩ ::= (⟨expr⟩) <= (⟨expr⟩)
⟨expr⟩ ::= (⟨expr⟩ ⟨op⟩ ⟨expr⟩) ∣ ⟨var⟩ ∣ abs(⟨expr2⟩)
⟨expr2⟩ ::= (⟨expr2⟩ ⟨op⟩ ⟨expr2⟩) ∣ ⟨var⟩
⟨var⟩ ::= F ∣ C ∣ S
⟨op⟩ ::= + ∣ * ∣ − ∣ /    (9)
Grammar 1.
This grammar, based on the FirstFit heuristic, was proposed in [55]; we use the heuristic components shown in [57]:
⟨inicio⟩ ::= ⟨exprs⟩ · (⟨expr⟩) <= (⟨expr⟩)
⟨exprs⟩ ::= Sort(⟨exprk⟩, ⟨order⟩) ∣ λ
⟨exprk⟩ ::= Bin ∣ Content
⟨order⟩ ::= Asc ∣ Des
⟨expr⟩ ::= (⟨expr⟩ ⟨op⟩ ⟨expr⟩) ∣ ⟨var⟩ ∣ abs(⟨expr2⟩)
⟨expr2⟩ ::= (⟨expr2⟩ ⟨op⟩ ⟨expr2⟩) ∣ ⟨var⟩
⟨var⟩ ::= F ∣ C ∣ S
⟨op⟩ ::= + ∣ * ∣ − ∣ /    (10)
Grammar 2.
This grammar, proposed in [56], is based on the BestFit heuristic:
⟨begin⟩ ::= ⟨exproff⟩ ⟨exprsort⟩ (⟨expr⟩) <= (⟨expr⟩)
⟨exproff⟩ ::= Sort(Elements, ⟨order⟩) ∣ λ
⟨exprsort⟩ ::= Sort(⟨exprkind⟩, ⟨order⟩) ∣ λ
⟨exprkind⟩ ::= Bins ∣ SumElements
⟨order⟩ ::= Asc ∣ Des
⟨expr⟩ ::= (⟨expr⟩ ⟨op⟩ ⟨expr⟩) ∣ ⟨var⟩ ∣ abs(⟨expr2⟩)
⟨expr2⟩ ::= (⟨expr2⟩ ⟨op⟩ ⟨expr2⟩) ∣ ⟨var⟩
⟨var⟩ ::= F ∣ C ∣ S
⟨op⟩ ::= + ∣ * ∣ − ∣ /    (11)
Grammar 3.
This grammar, proposed to generate online and offline heuristics, is based on Grammar 2, where
S is the size of the current piece,

C is the bin capacity,

F is the sum of the pieces already in the bin,

Elements sorts the items,

Bin sorts the bins by bin number,

Cont sorts the bins by bin content,

Asc sorts in ascending order,

Des sorts in descending order.
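A heuristic produced by Grammar 3 is therefore an optional item sort, an optional bin sort, and an inequality over F, C, and S that decides where a piece goes. The following is a hypothetical interpreter for such heuristics (an illustrative sketch, not the authors' implementation; bins are represented by their current content F):

```python
def pack(items, capacity, rule, sort_items_desc=False, bin_order=None):
    """Pack items using a Grammar 3-style heuristic: `rule(F, C, S)` is
    the generated inequality; `sort_items_desc` enables offline behaviour;
    `bin_order` ('Asc'/'Des'/None) sorts bins by content before each piece."""
    if sort_items_desc:
        items = sorted(items, reverse=True)   # Sort(Elements, Des)
    bins = []
    for s in items:
        if bin_order == "Des":
            bins.sort(reverse=True)           # Sort(Content, Des)
        elif bin_order == "Asc":
            bins.sort()                       # Sort(Content, Asc)
        for i, f in enumerate(bins):
            if rule(F=f, C=capacity, S=s):    # the grammar's inequality
                bins[i] += s
                break
        else:
            bins.append(s)                    # open a new bin
    return bins

# Example: the binpack2 heuristic Sort(Content, Des) · ((F+S)) <= (C),
# with illustrative items and capacity.
loads = pack([4, 8, 1, 4, 2, 1], 10,
             rule=lambda F, C, S: F + S <= C,
             bin_order="Des")
```

Note that sorting the bins by content in descending order and testing F + S <= C reproduces BestFit-like behavior, which is consistent with Grammar 2 having been modeled on BestFit.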
4. Experiments

In order to generate the heuristics, Grammar 3 was used. The search strategies applied to the GE were PESO and PSO. The number of function calls was taken from [56], where it was explained that this number is only 10% of the number of function calls used in [6]. To obtain the parameters shown in Table 1, a fine-tuning process based on Covering Arrays (CA) [58] was applied; in this case the CA was generated using the Covering Array Library (CAS) [59] from the National Institute of Standards and Technology (NIST) (http://csrc.nist.gov/groups/SNS/acts/index.html).
Table 1: PESO and PSO parameters.

Parameter       | Value
Population size | 50
w               | 1.0
ϕ1              | 0.8
ϕ2              | 0.5
Function calls  | 1500
In order to generate the heuristics, one instance from each set was used. Once the heuristic was obtained for an instance set, it was applied to all the instances of the set to obtain the heuristic’s fitness. The instance sets used were detailed in Section 3.1. 33 independent experiments were performed, and the median was used to compare the results against those obtained with the heuristics described in Section 3. The comparison was implemented through the nonparametric Friedman test [26, 60]; this test uses a post hoc analysis to discern the performance differences between the experiments and produces a ranking of them.
The method to apply the heuristics generated by Grammar 3 for an instance set is described below.
For each instance in the instance set the generated heuristic will be applied.
The generated heuristic has the option to sort the items before starting the packing process, to treat the instances like offline instances.
The next part of the generated heuristic says how to sort the bins; many heuristics require sorting the bins before packing an item.
The last part, the inequality, determines the rule to pack an item.
Sometimes the generated heuristic does not sort the items, which makes it work like an online heuristic. If it does not sort the bins, all the items are packed into the bins in the order the bins were created.

5. Results
Table 2 shows the results obtained with the online and offline heuristics described in Section 3.2. Results obtained by an exact algorithm, the MTP algorithm [40], are included, and the values of the fitness function from Section 3.3 are shown together with the number of bins used. A row was added showing the difference in bins with respect to the optimum. These results were obtained by applying each heuristic to every instance and adding up the results over each instance set.
Table 2: Results obtained by each heuristic using fitness (7).

Instance | Metric | MTP | BestFit (online) | FirstFit (online) | NextFit (online) | WorstFit (online) | AlmostWorstFit (online) | BestFit (offline) | FirstFit (offline) | NextFit (offline) | AlmostWorstFit (offline) | WorstFit (offline)
bin1data | Eq. (7) | 64.307365 | 316.106050 | 316.106020 | 316.106020 | 316.106000 | 321.880250 | 68.090770 | 68.167305 | 314.939820 | 86.119570 | 76.842740
bin1data | Bins used | 78378 | 101097 | 101097 | 101097 | 101097 | 101705 | 78660 | 78661 | 101097 | 79314 | 78843
bin1data | Leftover bins | — | 22719 | 22719 | 22719 | 22719 | 23327 | 282 | 283 | 22719 | 465 | 936
bin2data | Eq. (7) | 24.725332 | 93.025230 | 93.388504 | 109.447590 | 97.613075 | 115.927666 | 44.560112 | 44.561455 | 110.053760 | 67.597020 | 47.655037
bin2data | Bins used | 20246 | 23085 | 23094 | 23518 | 23158 | 23580 | 20994 | 20994 | 23615 | 21446 | 21030
bin2data | Leftover bins | — | 2839 | 2848 | 3272 | 2912 | 3334 | 748 | 748 | 3369 | 784 | 1200
bin3data | Eq. (7) | 0.390077 | 1.885825 | 1.885825 | 2.586886 | 2.094760 | 2.253316 | 1.390289 | 1.390289 | 2.699653 | 1.518808 | 1.397016
bin3data | Bins used | 562 | 613 | 613 | 642 | 622 | 631 | 596 | 596 | 650 | 603 | 596
bin3data | Leftover bins | — | 51 | 51 | 80 | 60 | 69 | 34 | 34 | 88 | 34 | 41
binpack1 | Eq. (7) | 0.421557 | 2.425894 | 2.604965 | 7.941016 | 5.089604 | 5.365787 | 0.913949 | 0.914034 | 9.504668 | 1.479180 | 1.233465
binpack1 | Bins used | 981 | 1038 | 1044 | 1279 | 1131 | 1147 | 995 | 995 | 1372 | 1016 | 1003
binpack1 | Leftover bins | — | 57 | 63 | 298 | 150 | 166 | 14 | 14 | 391 | 22 | 35
binpack2 | Eq. (7) | 0.180042 | 2.259154 | 2.396851 | 7.989118 | 4.916397 | 4.992059 | 0.705970 | 0.706004 | 9.459408 | 1.043219 | 0.846457
binpack2 | Bins used | 2032 | 2154 | 2162 | 2669 | 2342 | 2353 | 2062 | 2062 | 2851 | 2087 | 2068
binpack2 | Leftover bins | — | 122 | 130 | 637 | 310 | 321 | 30 | 30 | 819 | 36 | 55
binpack3 | Eq. (7) | 0.097509 | 2.014334 | 2.133326 | 7.994689 | 4.725924 | 4.769628 | 0.591541 | 0.591543 | 9.412333 | 0.746746 | 0.666438
binpack3 | Bins used | 4024 | 4240 | 4255 | 5300 | 4614 | 4626 | 4078 | 4078 | 5647 | 4100 | 4085
binpack3 | Leftover bins | — | 216 | 231 | 1276 | 590 | 602 | 54 | 54 | 1623 | 61 | 76
binpack4 | Eq. (7) | 0.047260 | 1.838669 | 1.935710 | 7.961214 | 4.597339 | 4.622161 | 0.495512 | 0.495522 | 9.404397 | 0.622414 | 0.572282
binpack4 | Bins used | 8011 | 8407 | 8430 | 10548 | 9154 | 9167 | 8108 | 8108 | 11253 | 8141 | 8123
binpack4 | Leftover bins | — | 396 | 419 | 2537 | 1143 | 1156 | 97 | 97 | 3242 | 112 | 130
binpack5 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.324353 | 4.830872 | 4.830872 | 6.420693 | 5.453772 | 4.848756
binpack5 | Bins used | 400 | 400 | 400 | 400 | 400 | 420 | 464 | 464 | 491 | 479 | 464
binpack5 | Leftover bins | — | 0 | 0 | 0 | 0 | 20 | 64 | 64 | 91 | 64 | 79
binpack6 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.682040 | 4.553619 | 4.553619 | 6.197923 | 4.994128 | 4.557339
binpack6 | Bins used | 800 | 800 | 800 | 800 | 800 | 820 | 916 | 916 | 971 | 936 | 916
binpack6 | Leftover bins | — | 0 | 0 | 0 | 0 | 20 | 116 | 116 | 171 | 116 | 136
binpack7 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.334901 | 4.556952 | 4.556953 | 6.078939 | 4.776067 | 4.558652
binpack7 | Bins used | 1660 | 1660 | 1660 | 1660 | 1660 | 1680 | 1900 | 1900 | 2002 | 1919 | 1900
binpack7 | Leftover bins | — | 0 | 0 | 0 | 0 | 20 | 240 | 240 | 342 | 240 | 259
binpack8 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.167147 | 4.400591 | 4.400588 | 6.059548 | 4.527734 | 4.400708
binpack8 | Bins used | 3340 | 3340 | 3340 | 3340 | 3340 | 3360 | 3801 | 3801 | 4024 | 3823 | 3801
binpack8 | Leftover bins | — | 0 | 0 | 0 | 0 | 20 | 461 | 461 | 684 | 461 | 483
hard28 | Eq. (7) | 0.167276 | 0.655480 | 0.655350 | 13.133352 | 1.547310 | 1.927801 | 0.655480 | 0.655350 | 13.133352 | 1.927801 | 1.547310
hard28 | Bins used | 1972 | 1995 | 1995 | 2755 | 2024 | 2050 | 1995 | 1995 | 2755 | 2050 | 2024
hard28 | Leftover bins | — | 23 | 23 | 783 | 52 | 78 | 23 | 23 | 783 | 52 | 78
Remaining bins | | — | 26423 | 26484 | 31602 | 27936 | 29133 | 2163 | 2164 | 34322 | 2447 | 3508
Table 3 shows examples of heuristics generated with the proposed Grammar and GE for each instance set; some of the heuristics could be simplified, but that is not part of the present work.
Table 3: Example of heuristics obtained for each instance set using Grammar 3.

Instance set | Generated heuristic
— | Sort(Elements, Des) · Sort(Cont, Des) · (S) ≤ ((C-F))
binpack1 | Sort(Content, Des) · (abs(F)) ≤ ((C − abs(S)))
binpack2 | Sort(Content, Des) · ((F+S)) ≤ (C)
binpack3 | Sort(Content, Des) · (F) ≤ (abs((C-S)))
binpack4 | Sort(Content, Asc) · (S) ≤ ((C-F))
binpack5 | ((S+F)) ≤ (C)
binpack6 | (F) ≤ ((abs(C) − S))
binpack7 | (abs(F)) ≤ (abs((S-C)))
binpack8 | (abs((S+F))) ≤ (C)
hard28 | Sort(Cont, Des) · (F) ≤ (abs((C-S)))
The results obtained by PSO and PESO with the three grammars are shown in Table 4; these results are the medians of 33 independent experiments. Using the results obtained by the heuristics and by GE with PESO and PSO, the Friedman nonparametric test was performed to compare them. The statistic obtained by the Friedman test is 85.789215 with a P value of 6.763090E-11; this means that the tested heuristics have different performance. It was therefore necessary to apply a post hoc procedure to obtain the ranking shown in Table 5.
Table 4: Results obtained by GE with each search strategy and grammar over the instance sets.

Instance | Metric | PSO Grammar 1 | PSO Grammar 2 | PSO Grammar 3 | PESO Grammar 1 | PESO Grammar 2 | PESO Grammar 3
bin1data | Eq. (7) | 316.106020 | 316.106050 | 68.167305 | 316.106020 | 316.106050 | 68.090770
bin1data | Bins used | 101097 | 101097 | 101097 | 101097 | 101097 | 78660
bin1data | Leftover bins | 22719 | 22719 | 22719 | 22719 | 22719 | 282
bin2data | Eq. (7) | 93.388510 | 93.025230 | 44.561455 | 93.388510 | 93.007030 | 44.560112
bin2data | Bins used | 23094 | 23156 | 23583 | 23097 | 23156 | 20994
bin2data | Leftover bins | 2848 | 2910 | 3337 | 2851 | 2910 | 748
bin3data | Eq. (7) | 1.885825 | 1.885825 | 1.390289 | 1.885825 | 1.885825 | 1.390289
bin3data | Bins used | 613 | 622 | 650 | 613 | 622 | 596
bin3data | Leftover bins | 51 | 60 | 88 | 51 | 60 | 34
binpack1 | Eq. (7) | 2.604965 | 2.425894 | 0.914034 | 2.604965 | 2.425894 | 0.913949
binpack1 | Bins used | 1044 | 1131 | 1372 | 1044 | 1131 | 995
binpack1 | Leftover bins | 63 | 150 | 391 | 63 | 150 | 14
binpack2 | Eq. (7) | 2.396851 | 2.259154 | 0.706004 | 2.396851 | 2.259154 | 0.705970
binpack2 | Bins used | 2162 | 2342 | 2851 | 2162 | 2342 | 2062
binpack2 | Leftover bins | 130 | 310 | 819 | 130 | 310 | 30
binpack3 | Eq. (7) | 2.133326 | 2.014334 | 0.591543 | 2.133326 | 2.014334 | 0.591541
binpack3 | Bins used | 4255 | 4614 | 5647 | 4255 | 4614 | 4078
binpack3 | Leftover bins | 231 | 590 | 1623 | 231 | 590 | 54
binpack4 | Eq. (7) | 1.935710 | 1.838669 | 0.495522 | 1.935710 | 1.838669 | 0.495512
binpack4 | Bins used | 8430 | 9154 | 11253 | 8430 | 9154 | 8108
binpack4 | Leftover bins | 419 | 1143 | 3242 | 419 | 1143 | 97
binpack5 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000
binpack5 | Bins used | 400 | 400 | 400 | 400 | 400 | 400
binpack5 | Leftover bins | 0 | 0 | 0 | 0 | 0 | 0
binpack6 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000
binpack6 | Bins used | 800 | 800 | 800 | 800 | 800 | 800
binpack6 | Leftover bins | 0 | 0 | 0 | 0 | 0 | 0
binpack7 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000
binpack7 | Bins used | 1660 | 1660 | 1660 | 1660 | 1660 | 1660
binpack7 | Leftover bins | 0 | 0 | 0 | 0 | 0 | 0
binpack8 | Eq. (7) | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000
binpack8 | Bins used | 3340 | 3340 | 3340 | 3340 | 3340 | 3340
binpack8 | Leftover bins | 0 | 0 | 0 | 0 | 0 | 0
hard28 | Eq. (7) | 0.655350 | 0.655480 | 0.655350 | 0.655350 | 0.655480 | 0.655480
hard28 | Bins used | 1995 | 2024 | 2755 | 1995 | 2024 | 1995
hard28 | Leftover bins | 23 | 52 | 783 | 23 | 52 | 23
Remaining bins | | 26484 | 27934 | 33002 | 26487 | 27934 | 1282
Table 5: Rankings of the algorithms.

Algorithm | Friedman ranking
Exact | 2.666667
PESO-Grammar 3 | 4.375000
PSO-Grammar 3 | 4.791667
BestFit-Offline | 6.916667
FirstFit-Offline | 7.250000
PESO-Grammar 2 | 8.666667
BestFit | 8.791667
PSO-Grammar 2 | 8.791667
FirstFit | 8.958333
PESO-Grammar 1 | 9.083333
PSO-Grammar 1 | 9.083333
WorstFit-Offline | 9.541667
AlmostWorstFit-Offline | 10.625000
WorstFit | 10.791667
NextFit | 12.250000
AlmostWorstFit | 14.291667
NextFit-Offline | 16.125000
Both Tables 2 and 4 have an extra row at the bottom with the total remaining bins. The results show that the heuristic generated automatically by PESO using Grammar 3 uses fewer bins than the other, classic heuristics.
6. Conclusions and Future Works
In the present work a Grammar was proposed to generate online and offline heuristics in order to improve on the heuristics generated by other grammars and by humans. We also proposed using PESO, a search strategy based on Swarm Intelligence, to avoid the problems observed in PSO.
From the results obtained in Section 5 it was concluded that it is possible to generate good heuristics with the proposed Grammar. Additionally, it can be seen that the quality of these heuristics strongly depends on the grammar used in the evolution.
The grammar proposed in the present work shows that it is possible to generate heuristics with better performance than the well-known BestFit, FirstFit, NextFit, WorstFit, and AlmostWorstFit heuristics from Section 3.2, regardless of whether the heuristics are online or offline. While those heuristics are designed to work with all the instance sets, GE adjusts its heuristics automatically to one instance set, which makes it possible for GE to generate offline or online heuristics. GE can generate as many heuristics as there are instance sets, trying to adapt the best heuristic that can be generated with the given Grammar.
The results obtained by PESO are better than those obtained by PSO by using Grammars 2 and 3, but with Grammar 1 PESO and PSO have the same performance.
The current investigation is based on the one-dimensional Bin Packing Problem, but due to the generality of the approach this methodology can be used to solve other problems. It remains to apply heuristic generation to other problems and to investigate whether GE with PESO as its search strategy gives better results than GP or GE with other search strategies.
It will be necessary to find a methodology to choose the instance or instances used for the training process, as well as to determine whether the instances share the same features, or to classify the instances into groups with the same features so as to generate only one heuristic per group.
It will also be necessary to research metaheuristics that do not require parameter tuning, because the metaheuristics shown in the present paper had to be tuned using Covering Arrays.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors want to thank the Instituto Tecnológico de León (ITL) for the support provided for this research. Additionally, the authors want to acknowledge the generous support of the Consejo Nacional de Ciencia y Tecnología (CONACyT) of Mexico for this research project.
References

M. R. Garey and D. S. Johnson.
E. A. Feigenbaum and J. Feldman.
M. H. J. Romanycia and F. J. Pelletier, "What is a heuristic?"
F. Glover, "Future paths for integer programming and links to artificial intelligence."
J. R. Koza, "Hierarchical genetic algorithms operating on populations of computer programs," in Proceedings of the 11th International Joint Conference on Artificial Intelligence, San Mateo, Calif, USA, 1989, pp. 768–774.
E. K. Burke, M. Hyde, and G. Kendall, "Evolving bin packing heuristics with genetic programming" (T. Runarsson, H.-G. Beyer, J. Merelo-Guervós, L. Whitley, and X. Yao, Eds.).
C. Ryan, J. Collins, and M. O'Neill, "Grammatical evolution: evolving programs for an arbitrary language," in Proceedings of the 1st European Workshop on Genetic Programming, Lecture Notes in Computer Science, vol. 1391, Springer, 1998, pp. 83–95.
M. O'Neill and A. Brabazon, "Grammatical differential evolution," in Proceedings of the International Conference on Artificial Intelligence (ICAI '06), Las Vegas, Nev, USA, CSEA Press, 2006.
M. O'Neill and A. Brabazon, "Grammatical swarm: the generation of programs by social programming."
J. Togelius, R. de Nardi, and A. Moraglio, "Geometric PSO + GP = particle swarm programming," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), Hong Kong, June 2008, pp. 3594–3600.
A. Moraglio and S. Silva, "Geometric differential evolution on the space of genetic programs" (A. Esparcia-Alcazar, A. Ekart, S. Silva, S. Dignum, and A. Uyar, Eds.).
A. Lodi, S. Martello, and D. Vigo, "Recent advances on two-dimensional bin packing problems."
H. van de Vel and S. Shijie, "Application of the bin-packing technique to job scheduling on uniform processors."
B. T. Han, G. Diehr, and J. S. Cook, "Multiple-type, two-dimensional bin packing problems: applications and algorithms."
D. S. Johnson, A. Demers, J. D. Ullman, M. R. Garey, and R. L. Graham, "Worst-case performance bounds for simple one-dimensional packing algorithms."
A. C. C. Yao, "New algorithms for bin packing."
W. T. Rhee and M. Talagrand, "On-line bin packing with items of random size."
E. Coffman Jr., G. Galambos, S. Martello, and D. Vigo.
K. Fleszar and K. S. Hindi, "New heuristics for one-dimensional bin-packing."
T. Kämpke, "Simulated annealing: use of a new tool in bin packing."
E. Falkenauer, "A hybrid grouping genetic algorithm for bin packing."
A. Ponee-Pérez, A. Pérez-Garcia, and V. Ayala-Ramirez, "Bin-packing using genetic algorithms," in Proceedings of the 15th International Conference on Electronics, Communications and Computers (CONIELECOMP '05), IEEE Computer Society, Los Alamitos, Calif, USA, March 2005, pp. 311–314.
A. Scholl, R. Klein, and C. Jürgens, "Bison: a fast hybrid procedure for exactly solving the one-dimensional bin packing problem."
J. M. V. de Carvalho, "Exact solution of bin-packing problems using column generation and branch-and-bound."
J. Puchinger and G. Raidl, "Combining metaheuristics and exact algorithms in combinatorial optimization: a survey and classification" (J. Mira and J. Alvarez, Eds.).
J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms."
J. R. Koza and R. Poli, "Genetic programming" (E. K. Burke and G. Kendall, Eds.).
I. Dempsey, M. O'Neill, and A. Brabazon, Foundations in Grammatical Evolution.
H.-L. Fang, P. Ross, and D. Corne, "A promising genetic algorithm approach to job-shop scheduling, rescheduling, and open-shop scheduling problems," in Proceedings of the 5th International Conference on Genetic Algorithms, Morgan Kaufmann, Burlington, Mass, USA, 1993, pp. 375–382.
J. H. Holland.
J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, December 1995, pp. 1942–1948.
C. Maurice.
R. Poli, J. Kennedy, and T. Blackwell, "Particle swarm optimization."
M. F. Tasgetiren, P. N. Suganthan, and Q. Pan, "A discrete particle swarm optimization algorithm for the generalized traveling salesman problem," in Proceedings of the 9th Annual Genetic and Evolutionary Computation Conference (GECCO '07), New York, NY, USA, July 2007, pp. 158–167.
T. Gong and A. L. Tuson, "Binary particle swarm optimization: a forma analysis approach," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO '07), ACM, New York, NY, USA, July 2007, p. 172.
A. E. M. Zavala, A. H. Aguirre, and E. R. Villa Diharce, "Constrained optimization via Particle Evolutionary Swarm Optimization algorithm (PESO)," in Proceedings of the Conference on Genetic and Evolutionary Computation (GECCO '05), New York, NY, USA, June 2005, pp. 209–216.
A. E. M. Zavala, A. H. Aguirre, and E. R. Villa Diharce, "Particle evolutionary swarm optimization algorithm (PESO)," in Proceedings of the 6th Mexican International Conference on Computer Science (ENC '05), Puebla, Mexico, September 2005, pp. 282–289.
A. E. Muñoz-Zavala, A. Hernández-Aguirre, E. R. Villa-Diharce, and S. Botello-Rionda, "PESO+ for constrained optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '06), July 2006, pp. 231–238.
G. T. Pulido and C. A. C. Coello, "A constraint-handling mechanism for particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '04), vol. 2, Portland, Ore, USA, June 2004, pp. 1396–1403.
S. Martello and P. Toth.
E. G. Coffman Jr., C. Courcoubetis, M. R. Garey, P. W. Shor, and R. R. Weber, "Bin packing with discrete item sizes. I. Perfect packing theorems and the average case behavior of optimal packings."
T. G. Crainic, G. Perboli, M. Pezzuto, and R. Tadei, "New bin packing fast lower bounds."
J. Coffman, G. Galambos, S. Martello, and D. Vigo, "Bin packing approximation algorithms: combinatorial analysis."
S. P. Fekete and J. Schepers, "New classes of fast lower bounds for bin packing problems."
S. S. Seiden, R. van Stee, and L. Epstein, "New bounds for variable-sized online bin packing."
E. Coffman.
G.Jr.CsirikJ.A classification scheme for bin packing theoryFalkenauerE.DelchambreA.A genetic algorithm for bin packing and line balancing2Proceedings of the IEEE International Conference on Robotics and AutomationMay 1992118611922-s2.0-0027005303LodiA.MartelloS.VigoD.Heuristic and metaheuristic approaches for a class of two-dimensional bin packing problemsHopperE.TurtonB. C. H.A review of the application of meta-heuristic algorithms to 2D strip packing problemsBeasleyJ. E.OR-Library: distributing test problems by electronic mailAlvimA. C. F.RibeiroC. C.GloverF.AloiseD. J.A hybrid improvement heuristic for the one-dimensional bin packing problemSuárezC. D. T.GonzlezE. P.RendónM. V.A heuristic algorithm for the offline one-dimensional bin packing problem inspired by the point Jacobi matrix iterative methodProceedings of the 5th Mexican International Conference on Artificial Intelligence (MICAI '06)November 2006Mexico City, Mexico28128610.1109/MICAI.2006.42-s2.0-34547676299TamS.TamH.TamL.ZhangT.A new optimization method, the algorithm of changes, for Bin Packing ProblemProceedings of the IEEE 5th International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA '10)September 201099499910.1109/BICTA.2010.56451222-s2.0-78650593552HydeM.Sotelo-FigueroaM. A.Puga SoberanesH. J.Martin CarpioJ.Fraire HuacujaH. J.ReyesC. L.Soria-AlcarazJ. A.CastilloO.MelinP.KacprzykJ.Evolving bin packing heuristic using micro-differential evolution with indirect representationSotelo-FigueroaM.Puga SoberanesH.Martin CarpioJ.Fraire HuacujaH.Cruz ReyesL.Soria-AlcarazJ.Evolving and reusing bin packing heuristic through grammatical differential evolutionProceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '13)August 2013Fargo, ND, USA929810.1109/NaBIC.2013.6617844BurkeE. 
K.KendallG.Rodriguez-CristernaA.Torres-JimenezJ.Rivera-IslasI.Hernandez-MoralesC.Romero-MonsivaisH.Jose-GarciaA.BatyrshinI.SidorovG.A mutation-selection algorithm for the problem of minimum brauer chainsKackerR. N.Richard KuhnD.LeiY.LawrenceJ. F.Combinatorial testing for software: an adaptation of design of experimentsSheskinD. J.