Many optimization problems (from academia or industry) require the use of a local search to find a satisfying solution in a reasonable amount of time, even if optimality is not guaranteed. Usually, local search algorithms operate in a search space containing complete solutions (feasible or not) to the problem. In contrast, in
An
A
Within a local search context, the usual approach consists of working with complete solutions, that is, each variable has a value and the solution may or may not be feasible. In the infeasible case, a penalty function is often used, typically depending on the number of violated constraints. In contrast, in
The paper is organized as follows. In Section
In this section, we introduce the CNS methodology and situate it among existing optimization methods.
Let
Therefore, three search spaces are possible: (1) the complete and feasible search space
An important feature of CNS is the definition of the neighborhood structure in
In contrast, any move
Therefore, the distance between
In most local search algorithms, the selected neighbor solution
In contrast, in CNS,
We now have all the ingredients to formulate a pseudo-code of CNS in Algorithm
1. initialize the value of the best move: set the best candidate move to "undefined";
2. generate the best move: for each nonassigned variable and each value of its domain:
(a) tentatively assign the value to the variable;
(b) if the assignment can be repaired without augmenting the number of violations, do it and remove the assigned variables in conflict with it;
(c) let the resulting tentative partial solution be evaluated;
(d) update the best candidate move: if the tentative move is better than the best candidate move, record it;
3. perform the best candidate move, which gives the resulting new current solution;
4. update the record: if the current solution is better than the best solution encountered so far, record it.
In summary, CNS is an approach dealing with feasible partial solutions, which can explore the whole neighborhood of the current solution at each iteration because a straightforward incremental computation can be designed. Many local search methods (e.g., tabu search, simulated annealing, random walk, threshold algorithms) can be adapted within the framework of CNS.
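To make the iteration above concrete, here is a minimal Python sketch of one CNS step. The interface is illustrative and not taken from the paper: `domains` maps each variable to its candidate values, `conflicts(x, v, y, w)` tells whether the assignments x = v and y = w are incompatible, and "fewest repairs" is used as one simple possible move-quality criterion.

```python
def cns_step(assign, unassigned, domains, conflicts):
    """One CNS iteration on a partial feasible solution.

    A move gives a value to a nonassigned variable; the repair phase then
    unassigns every already-assigned variable in conflict with that value.
    The whole neighborhood is scanned and the move needing the fewest
    repairs is selected (illustrative criterion, not the paper's).
    """
    best = None  # (repair cost, variable, value, variables to unassign)
    for x in sorted(unassigned):
        for v in domains[x]:
            removed = {y for y, w in assign.items() if conflicts(x, v, y, w)}
            if best is None or len(removed) < best[0]:
                best = (len(removed), x, v, removed)
    if best is None:          # nothing left to assign
        return assign, unassigned
    _, x, v, removed = best
    for y in removed:         # repair: drop the conflicting variables
        del assign[y]
    assign[x] = v             # perform the move
    return assign, (unassigned - {x}) | removed
```

Starting from an empty assignment, repeated calls greedily build a feasible partial solution, then keep trading removed variables for newly assigned ones.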
The adaptation of tabu search within the framework of CNS is now discussed. A generic and standard version of tabu search can be described as follows, assuming that
Tabu search adapted within the framework of CNS has the following specificities: working in
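The tabu mechanism needed by such an adaptation amounts to a few lines of bookkeeping. The class below is an illustrative sketch (the names and the fixed tenure are assumptions): a variable unassigned during a repair phase may not be reassigned for a given number of iterations, which prevents the search from immediately undoing its last move.

```python
class CnsTabu:
    """Tabu bookkeeping for CNS: a variable removed by the repair phase
    cannot be reassigned for `tenure` iterations (anticycling)."""

    def __init__(self, tenure=7):
        self.tenure = tenure
        self.expires = {}      # variable -> iteration at which the ban ends
        self.iteration = 0

    def forbid(self, removed_vars):
        """Call with the variables dropped by the repair phase of a move."""
        for y in removed_vars:
            self.expires[y] = self.iteration + self.tenure

    def allowed(self, x):
        """May variable x be reassigned at the current iteration?"""
        return self.expires.get(x, -1) <= self.iteration

    def tick(self):
        self.iteration += 1
```

In a full implementation, the move-evaluation loop would simply skip the (variable, value) candidates for which `allowed` returns False, unless an aspiration criterion applies.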
We now compare the general strategy of three kinds of optimization methods: tree search, standard local search, and CNS. These methods visit the search tree in very different ways, where the
Tree search algorithms visit neighbor nodes in the search tree. The visited subtree
A very different strategy characterizes standard local search methods: as illustrated in Figure
Even if CNS can be considered as a local search method, it mainly explores nodes which are close to the leaves, as illustrated in Figure
CNS can start its search from the root, that is, with no variable having a value. In such a case, its first iterations basically consist of greedily assigning a value to a variable until the current solution becomes
Therefore, CNS does not encounter the above-described drawbacks associated with tree search and standard local search methods. Notice however that there exists an implicit enumeration method able to perform jumps over
The main reference associated with this section is [
Given a graph
The best
In contrast, in
It is shown in [
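As an illustration of CNS applied to the decision version of graph coloring (find a legal coloring with k colors), the toy sketch below maintains a partial legal coloring and repeatedly colors an uncolored vertex, uncoloring its same-colored neighbors as the repair. It is a simplified stand-in for the CNS-GCP idea, not the tuned algorithm evaluated in this section.

```python
def cns_k_coloring(adj, k, max_iters=10000):
    """Try to build a complete legal k-coloring of the graph `adj`
    (a dict: vertex -> list of neighbors, vertices numbered 0..n-1).

    The current solution is always a partial legal coloring: a move colors
    an uncolored vertex, and the repair uncolors its same-colored
    neighbors. The move needing the fewest repairs is chosen greedily."""
    color = {}                            # partial legal coloring
    uncolored = set(range(len(adj)))
    for _ in range(max_iters):
        if not uncolored:
            return color                  # complete feasible solution found
        v, c = min(((v, c) for v in sorted(uncolored) for c in range(k)),
                   key=lambda vc: sum(1 for u in adj[vc[0]]
                                      if color.get(u) == vc[1]))
        for u in adj[v]:                  # repair phase
            if color.get(u) == c:
                del color[u]
                uncolored.add(u)
        color[v] = c
        uncolored.discard(v)
    return None                           # no complete coloring found
```

Decreasing k after each success turns this decision procedure into an optimization scheme for the chromatic number, in the spirit described above.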
Comparisons between CNS-GCP and state-of-the-art coloring algorithms.
Graph | n | (χ, k*) | CNS-GCP | Tabucol | MMT | GH | MOR |
---|---|---|---|---|---|---|---|
DSJC1000.1 | 1000 | ?, 20 | 21 | 20 | 20 | 20 | 21 |
DSJC1000.5 | 1000 | ?, 83 | 89 | 89 | 83 | 83 | 88 |
DSJC1000.9 | 1000 | ?, 224 | 226 | 227 | 226 | 224 | 226 |
DSJC500.1 | 500 | ?, 12 | 12 | 12 | 12 | 12 | 12 |
DSJC500.5 | 500 | ?, 48 | 49 | 49 | 48 | 48 | 49 |
DSJC500.9 | 500 | ?, 126 | 127 | 127 | 127 | 126 | 128 |
DSJR500.1c | 500 | ?, 85 | 85 | 85 | 85 | — | 85 |
DSJR500.5 | 500 | ?, 122 | 126 | 126 | 122 | — | 123 |
flat1000_50_0 | 1000 | 50, 50 | 50 | 50 | 50 | 50 | 50 |
flat1000_60_0 | 1000 | 60, 60 | 60 | 60 | 60 | 60 | 60 |
flat1000_76_0 | 1000 | 76, 82 | 88 | 88 | 82 | 83 | 89 |
flat300_28_0 | 300 | 28, 28 | 28 | 31 | 31 | 31 | 31 |
le450_15c | 450 | 15, 15 | 15 | 16 | 15 | 15 | 15 |
le450_15d | 450 | 15, 15 | 15 | 15 | 15 | 15 | 15 |
le450_25c | 450 | 25, 25 | 27 | 26 | 25 | 26 | 25 |
le450_25d | 450 | 25, 25 | 27 | 26 | 25 | 26 | 25 |
A CPU time limit of 60 minutes on a Pentium 4 (2 GHz, 512 MB of RAM) was considered for CNS-GCP. The first two columns of Table
The main reference associated with this section is [
The FAPP concerns a Hertzian telecommunication network made up of antennae located at a set of geographical sites. A Hertzian liaison joins two sites by one or more paths. Hence, a
Let
Let
Consequently, the objective function of the problem is, in order of priority, to (1) minimize the lowest relaxation level
The strategy adopted for the resolution consists of transforming the FAPP optimization problem into 11 decision problems, according to the relaxation level on the
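This division into decision problems can be organized as a simple driver loop. The sketch below is generic and its interface hypothetical: `solve_decision(level)` stands for a full CNS run on the decision problem at a given relaxation level, returning a feasible solution or None.

```python
def solve_by_levels(solve_decision, levels):
    """Driver for the 'series of decision problems' strategy: try the
    relaxation levels from the most constrained to the least constrained,
    and return the first (hence lowest) feasible level with its solution.
    `solve_decision(level)` is a stand-in for a CNS run at that level."""
    for level in levels:
        solution = solve_decision(level)
        if solution is not None:
            return level, solution
    return None                # no level could be satisfied
```

For the FAPP, `levels` would be the 11 relaxation levels in increasing order, so that the first feasible level found is the minimal one, matching the first objective-priority stated above.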
A solution
A tabu list is needed to prevent cycling, which occurs when the search attempts to reinstantiate the variables most recently deleted from the current partial solution. Indeed, all the values (
The considered problem was the subject of the Challenge ROADEF 2001 (organized by the French Society of Operations Research and Decision Analysis), involving 27 research teams (see
Comparison of CNS-FAPP with other methods.

FAPP | | | | | | | | | | | | | CNS-FAPP | | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
01-0200 | 4 | 4 | 56 | 4 | 6 | 279 | 4 | 14 | 165 | 5 | 1 | 281 | 4 | 14 | 233 |
02-0250 | 2 | 7 | 86 | 2 | 18 | 248 | 2 | 21 | 160 | 11 | 1 | 1274 | 2 | 20 | 195 |
03-0300 | 7 | 10 | 341 | 7 | 27 | 1076 | 7 | 16 | 420 | 7 | 13 | 589 | 7 | 32 | 892 |
04-0300 | 1 | 31 | 0 | 1 | 164 | 0 | 3 | 9 | 224 | 7 | 1 | 3678 | 1 | 184 | 0 |
05-0350 | 11 | 1 | 372 | 11 | 892 | 12364 | 11 | 1 | 1467 | 11 | 7 | 2284 | 11 | 364 | 5694 |
06-0500 | 5 | 12 | 246 | 5 | 53 | 1029 | 7 | 15 | 879 | 7 | 15 | 1210 | 5 | 31 | 811 |
07-0600 | 9 | 22 | 714 | 9 | 132 | 4419 | 10 | 28 | 3070 | 9 | 33 | 1585 | 9 | 106 | 3375 |
08-0700 | 5 | 16 | 266 | 5 | 53 | 1359 | 5 | 37 | 691 | 5 | 26 | 625 | 5 | 73 | 1225 |
09-0800 | 3 | 28 | 195 | 3 | 63 | 937 | 4 | 24 | 573 | 10 | 1 | 3678 | 3 | 104 | 846 |
10-0900 | 6 | 18 | 475 | 6 | 82 | 2365 | 6 | 39 | 1146 | 8 | 5 | 2871 | 6 | 103 | 2003 |
11-1000 | 8 | 8 | 1015 | 8 | 119 | 5206 | 9 | 30 | 3736 | 10 | 1 | 5108 | 8 | 119 | 4191 |
12-1500 | 3 | 83 | 1698 | 7 | 180 | 6538 | 11 | 17 | 2634 | 9 | 70 | 7682 | 2 | 62 | 1310 |
13-2000 | 3 | 49 | 2003 | 7 | 229 | 7503 | 11 | 59 | 6164 | 10 | 13 | 9651 | 5 | 132 | 3645 |
14-2500 | 4 | 35 | 3485 | 8 | 18 | 10661 | 11 | 3 | 5574 | 10 | 101 | 15718 | 5 | 217 | 5045 |
15-3000 | 5 | 15 | 1569 | 7 | 333 | 9988 | 11 | 46 | 9523 | 10 | 61 | 14010 | 5 | 192 | 4727 |
16-0260 | 11 | 5 | 56 | 11 | 572 | 5779 | 11 | 67 | 913 | 11 | 5 | 57 | 11 | 514 | 5189 |
17-0300 | 4 | 4 | 34 | 4 | 4 | 36 | 4 | 4 | 35 | 4 | 4 | 34 | 4 | 4 | 36 |
18-0350 | 8 | 4 | 55 | 8 | 4 | 55 | 8 | 4 | 57 | 8 | 4 | 55 | 8 | 4 | 59 |
19-0350 | 6 | 2 | 51 | 6 | 3 | 79 | 6 | 2 | 53 | 6 | 2 | 51 | 6 | 3 | 70 |
20-0420 | 10 | 5 | 97 | 10 | 6 | 145 | 10 | 5 | 99 | 10 | 5 | 97 | 10 | 7 | 142 |
21-0500 | 4 | 2 | 10 | 4 | 2 | 12 | 4 | 2 | 11 | 4 | 2 | 10 | 4 | 2 | 12 |
22-1750 | 7 | 15 | 187 | 7 | 16 | 356 | 7 | 16 | 194 | 7 | 15 | 187 | 7 | 25 | 503 |
23-1800 | 9 | 16 | 187 | 9 | 17 | 197 | 9 | 16 | 189 | 9 | 16 | 187 | 9 | 17 | 197 |
24-2000 | 7 | 6 | 71 | 7 | 7 | 90 | 7 | 7 | 79 | 7 | 6 | 71 | 7 | 9 | 91 |
25-2230 | 3 | 7 | 32 | 3 | 7 | 33 | 3 | 7 | 33 | 3 | 7 | 32 | 3 | 7 | 33 |
26-2300 | 7 | 9 | 74 | 7 | 10 | 81 | 7 | 9 | 74 | 7 | 9 | 74 | 7 | 10 | 86 |
27-2550 | 11 | 4 | 64 | 5 | 7 | 46 | 5 | 8 | 37 | 5 | 4 | 20 | 5 | 11 | 54 |
28-2800 | 3 | 13 | 32 | 3 | 32 | 129 | 3 | 25 | 58 | 3 | 13 | 32 | 3 | 42 | 142 |
29-2900 | 6 | 25 | 239 | 6 | 28 | 351 | 6 | 25 | 212 | 6 | 25 | 212 | 6 | 25 | 310 |
30-3000 | 11 | 1166 | 12029 | 7 | 17 | 602 | 7 | 16 | 190 | 7 | 13 | 148 | 7 | 48 | 1045 |
31-0400 | 5 | 4 | 1180 | 5 | 161 | 2131 | 5 | 34 | 1151 | 5 | 16 | 1400 | 5 | 117 | 1896 |
32-0550 | 10 | 52 | 1739 | 6 | 16 | 388 | 6 | 5 | 71 | 11 | 25 | 2166 | 6 | 10 | 235 |
33-0650 | 5 | 7 | 66 | 5 | 16 | 332 | 5 | 7 | 77 | 11 | 5 | 1310 | 5 | 10 | 235 |
34-0750 | 4 | 2 | 46 | 4 | 35 | 767 | 4 | 6 | 213 | 10 | 1 | 1701 | 4 | 22 | 565 |
35-1500 | 7 | 3 | 1280 | 6 | 74 | 1919 | 6 | 16 | 431 | 11 | 24 | 5870 | 6 | 62 | 1375 |
36-2000 | 7 | 99 | 2153 | 9 | 3 | 2478 | 8 | 25 | 970 | 11 | 16 | 4652 | 7 | 63 | 1643 |
37-2250 | 11 | 3 | 12229 | 5 | 56 | 1745 | 8 | 13 | 975 | 11 | 14 | 10353 | 5 | 51 | 1288 |
38-2500 | 11 | 79 | 14058 | 3 | 39 | 572 | 3 | 14 | 174 | 11 | 53 | 13355 | 9 | 125 | 6717 |
39-2750 | 3 | 356 | 2844 | 3 | 2567 | 10470 | 3 | 747 | 4603 | 11 | 36 | 13267 | 11 | 3947 | 40473 |
40-3000 | 11 | 39 | 16755 | 4 | 77 | 1562 | 8 | 20 | 1261 | 11 | 867 | 13684 | 4 | 64 | 1252 |
The first approach, developed by Bisaillon’s team and referred to as
We can observe the efficiency of CNS-FAPP when compared to the other methods. Care is needed when interpreting the results, because the indicated values are the best over 10 runs. CNS-FAPP finds the optimal
The main reference associated with this section is [
The problem retained for the Challenge ROADEF 1999 was an inventory management problem (see
Two types of maintenance constraints make the problem difficult: (1) a maximum time of use without maintenance is given for each car type (each maintenance operation has a duration, a cost, and a required number of workers); (2) the company has a fixed number of maintenance workers, which means that maintenance operations must be scheduled so that the capacity of the workshop is never exceeded. In addition, the following costs are also known: the costs (fixed and time dependent) associated with the assignment of a car to a request, and the inventory cost per day of a car in stock (rented or not). The goal is to satisfy all the requests while minimizing the total cost.
The general pseudocode of the method, denoted CNS-CAR, is summarized in Algorithm
1. try to improve the current solution;
2. update the best solution encountered so far (all the requests are initially subcontracted).
In TS1-CAR, a solution
TS2-CAR is an extension of TS1-CAR in the following sense: (1) it works on several car types during the same move; (2) it tries to reduce the total cost not only by assigning cars to subcontracted requests, but also by avoiding upgrades; (3) a reassignment phase is performed; (4) the repairing phase has more options to validate the move proposed in the assignment phase. A neighbor
Diversification procedures were also used, based on the following idea: the requests which have not been subcontracted for a large number of iterations are subcontracted, in order to make room for other requests in the schedule.
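Such a diversification step can be sketched as follows, assuming a hypothetical data layout in which `assigned_since[r]` stores the iteration at which request `r` entered the schedule.

```python
def diversify(schedule, assigned_since, iteration, threshold):
    """Subcontract again every request that has stayed in the schedule
    for more than `threshold` iterations, making room for other requests.
    Returns the list of removed (re-subcontracted) requests."""
    stale = [r for r in schedule
             if iteration - assigned_since[r] > threshold]
    for r in stale:
        schedule.remove(r)
    return stale
```

The removed requests become unassigned again, so subsequent CNS moves may reinsert them elsewhere or serve other requests in the freed slots.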
In Tables
Results for the instances without purchase.
Instance | Best | BB | AGHKU | B | DD | CNS-CAR |
---|---|---|---|---|---|---|
(80, 8, 2, …) | | 0.00% | 0.05% | 1.31% | 8.47% | 0.00% |
(150, 7, 2, …) | | 0.87% | 5.85% | 5.03% | 9.73% | 0.00% |
(160, 12, 2, …) | | 14.63% | 19.68% | 29.56% | 20.51% | 0.81% |
(200, 12, 2, …) | | 7.77% | 22.56% | 25.52% | 26.02% | 2.59% |
(200, 7, 2, …) | | 6.36% | 12.61% | 21.02% | 31.93% | 3.62% |
(200, 7, 4, …) | | 0.00% | 0.66% | 3.26% | 14.45% | 0.00% |
(210, 9, 2, …) | | 5.67% | 10.76% | 18.48% | 27.11% | 2.42% |
(210, 9, 4, …) | | 1.82% | 1.44% | 3.46% | 13.09% | 0.12% |
Average | 4236008 | 4.64% | 9.20% | 13.46% | 18.91% | 1.20% |
Results for the instances with purchase.
Instance | Best | BB | AGHKU | B | DD | CNS-CAR |
---|---|---|---|---|---|---|
(80, 8, 2, …) | | 0.00% | 1.55% | 2.82% | 7.23% | 0.04% |
(150, 7, 2, …) | | 0.00% | 9.40% | 3.98% | 13.23% | 0.01% |
(160, 12, 2, …) | | 11.99% | 29.40% | 26.08% | 21.98% | 1.47% |
(200, 12, 2, …) | | 12.42% | 34.97% | 35.93% | 38.13% | 4.88% |
(200, 7, 2, …) | | 6.88% | 15.36% | 27.76% | 31.38% | 4.98% |
(200, 7, 4, …) | | 0.00% | 3.00% | 3.46% | 12.76% | 0.01% |
(210, 9, 2, …) | | 7.38% | 18.91% | 29.31% | 34.15% | 3.52% |
(210, 9, 4, …) | | 8.71% | 10.92% | 9.19% | 14.91% | 0.01% |
Average | 3787302 | 5.92% | 15.44% | 17.32% | 21.72% | 1.86% |
The algorithm was run with a time limit equivalent to one hour on a Pentium Pro (200 MHz, 64 MB of RAM), as imposed by the organizers of the challenge. The results are shown in Table
The main reference associated with this section is [
The considered satellite range scheduling problem can be described as follows [
The SAT is to find a subset
The capacity constraint is the following: a size is associated with each photograph
Binary constraints, involving the nonoverlapping of two trials and the minimal transition time between two successive trials of a camera, as well as some constraints involving limitations on instantaneous data flow, are conveniently expressed by simple relations over two (photo, camera) pairs. A binary constraint forbids the simultaneous presence of a pair
Some constraints involving limitations on instantaneous data flow cannot be expressed in the form of binary constraints as above. These remaining constraints may, however, be expressed by relations over three (photo, camera) pairs. A ternary constraint forbids the simultaneous presence of three pairs
Finally, we need to be sure that a schedule contains no more than one pair from
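Assuming a set-based encoding (illustrative, not the paper's data structures) in which each binary or ternary constraint is stored as a frozenset of forbidden (photo, camera) pairs, checking whether a candidate pair may join the current selection reduces to simple membership tests:

```python
from itertools import combinations

def violates(p, selected, binary, ternary):
    """Check whether adding the (photo, camera) pair `p` to the current
    selection violates a constraint. `binary` is a set of frozensets of
    two mutually forbidden pairs; `ternary` of three such pairs."""
    for q in selected:
        if frozenset((p, q)) in binary:
            return True
    for q, r in combinations(selected, 2):
        if frozenset((p, q, r)) in ternary:
            return True
    return False
```

In a CNS move evaluation, this test (or an incremental variant of it) decides which already-selected pairs the repair phase must drop.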
In contrast with the previous problems where all the constraints are considered to define the search space, a partially constrained search space C is considered here, which is composed of all binary vectors of there is only one for the above for the above
Thus, a neighbor of
During the search, the capacity constraint may be violated by the current solution
Each time a move is carried out, a single variable
Experiments are carried out on a set of 20 realistic instances provided by the CNES (French National Space Agency) and described in detail in [
The best known nonexact algorithm was a tabu search, TS-SAT, proposed by the CNES. The main differences with the CNS-SAT algorithm are the following: (1) TS-SAT uses a different (integer) formulation of the problem; (2) it manipulates only feasible solutions (the search space is thus
To solve an instance, CNS-SAT is allowed to run 9 million iterations on a PC (200 MHz, 32 MB of RAM), which is considered reasonable by the CNES. CNS-SAT was run 100 times on each instance with different random seeds, and the average value is reported for each instance. The first three columns of Table
Comparison between TS-SAT and CNS-SAT.
Instance | | | TS-SAT | TimeTS | CNS-SAT | TimeCNS |
---|---|---|---|---|---|---|
1401 | 488 | 914 | 174058 | 846 | 176055 | 120 |
1403 | 665 | 1317 | 174137 | 1324 | 176134 | 332 |
1405 | 855 | 1815 | 174174 | 1574 | 176175 | 1314 |
1021 | 1057 | 2355 | 174238 | 2197 | 176241 | 2422 |
1504 | 605 | 1253 | 124238 | 1011 | 124241 | 405 |
1506 | 940 | 2060 | 165244 | 1945 | 168224 | 1423 |
Average | | | 164348.17 | 1272.86 | 166178.33 | 859.57 |
In this paper, we propose and discuss a generic method for combinatorial optimization problems. Its consideration within various fields shows that CNS is very efficient, robust, quick, and relatively easy to implement. Note that other heuristic solution methods which were not discussed here could be considered within the CNS framework (e.g., [
CNS is especially well adapted when the optimization problem can be divided into a series of decision problems, as was the case for three of the presented applications: graph coloring can be tackled by searching for a feasible coloring with a fixed number of colors; the frequency assignment problem can be approached at a fixed relaxation level; and the car fleet management problem can be considered with a fixed number of cars.
CNS is a very flexible method, for at least four reasons. First, it can manage various ways of representing a solution. Second, it can consider various types of constraints: it is well adapted to constraints linking two or three variables together, because the repairing phase is usually straightforward in such situations, and if a specific constraint involves several variables, it can be relaxed (at least to save CPU time), as was the case for the satellite range scheduling problem, where the capacity constraint was only considered at specific iterations. Third, it is well adapted to problems where the unassigned variables actually correspond to an expensive assignment (e.g., a nonassigned variable corresponds to a subcontracted request for the car rental company). Fourth, other significant ingredients, such as intensification or diversification procedures, can easily be added within the framework of CNS to enhance its efficiency.
CNS can easily be combined with evolutionary heuristics, such as genetic or adaptive memory algorithms; this was already successfully performed for graph coloring [