A Local Search Algorithm for the Flow Shop Scheduling Problem with Release Dates

This paper discusses the flow shop scheduling problem of minimizing the makespan with release dates. By resequencing the jobs, a modified heuristic algorithm is obtained for handling large-sized problems. Moreover, based on several structural properties, a local search scheme is provided to improve the heuristic and obtain high-quality solutions for moderate-sized problems. A sequence-independent lower bound is presented to evaluate the performance of the algorithms. A series of simulation results demonstrates the effectiveness of the proposed algorithms.


Introduction
In a flow shop scheduling model, each job must be processed on a set of machines in identical order. The goal is to determine the job sequence that optimizes a certain predetermined objective function. At any given time, each machine can process at most one job, and each job can be handled by at most one machine. Meanwhile, no job may be preempted by another job. Flow shop scheduling problems arise widely in industrial production and mechanical manufacturing. For example, in a steel-making process, molten steel is cast into semifinished slabs by a conticaster; after being heated in the reheating furnace, the slabs are rolled into products in a rolling mill. This is a typical flow shop production model. As most of these problems are strongly NP-hard, the global optimum cannot be obtained in polynomial time unless P = NP. The study of flow shop scheduling algorithms is therefore very important for reducing running time and boosting productivity.
Since the first scheduling rule was presented by Johnson [1] for the two-machine flow shop problem with the objective of makespan (i.e., maximum completion time) minimization, much work has been conducted in this research area. A comprehensive survey of flow shop makespan problems up to 2010 can be found in Potts and Strusevich [2] or Bai and Ren [3]. More recent work is reviewed below.
A. Rudek and R. Rudek [4] proved ordinary NP-hardness for the two-machine flow shop makespan problem when job processing times are described by nondecreasing position-dependent functions (aging effect) on at least one machine, and showed strong NP-hardness when job processing times vary on both machines. Aydilek and Allahverdi [5] presented a polynomial time heuristic algorithm for the two-machine flow shop makespan problem with release dates. For minimizing the makespan in an m-machine flow shop with learning considerations, Chung and Tong [6] proposed a dominance theorem and a lower bound to accelerate a branch-and-bound algorithm for seeking the optimal solution. For the makespan criterion in the flow shop model, a high-performing constructive heuristic with an effective tie-breaking strategy was proposed by Ying and Lin [7] to improve solution quality. Similarly, Gupta et al. [8] proposed an alternative heuristic algorithm, compared against the benchmark Palmer, CDS, and NEH algorithms, for the n-job, m-machine flow shop scheduling problem with makespan minimization. For job-related criteria, Bai [9] presented the asymptotic optimality of shortest-processing-time-based algorithms for the flow shop problem of minimizing the total quadratic completion time with release dates. Bai and Zhang [10] extended these results to a general objective, the total p-power completion time (p ≥ 3).
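Johnson's two-machine rule [1], mentioned above, underlies several of the surveyed heuristics. A minimal sketch of the rule follows; the job data are hypothetical and serve only as an illustration:

```python
def johnson_sequence(jobs):
    """Johnson's rule for the two-machine flow shop makespan problem:
    jobs with p1 <= p2 go first, sorted by ascending p1; the remaining
    jobs go last, sorted by descending p2.
    `jobs` is a list of (name, p1, p2) tuples."""
    first = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    last = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return [j[0] for j in first + last]

# Hypothetical instance: (job, time on machine 1, time on machine 2)
jobs = [("J1", 3, 6), ("J2", 5, 2), ("J3", 1, 2), ("J4", 4, 4)]
print(johnson_sequence(jobs))  # ['J3', 'J1', 'J4', 'J2']
```

Jobs that are short on the first machine are pushed forward, and jobs that are short on the second machine are pushed backward, which keeps both machines busy.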
In this paper, the flow shop scheduling problem of minimizing the makespan with release dates is addressed. Contrary to the static setting, in which all jobs are simultaneously available, jobs arrive over time according to their release dates, which more closely matches practical scheduling environments. Lenstra et al. [11] proved that the two-machine flow shop makespan problem with release dates is strongly NP-hard. This implies that the optimal solution cannot be found in polynomial time (unless P = NP); a heuristic algorithm may be more effective for obtaining an approximate solution to large-sized problems. Therefore, a new modified GS algorithm (MGS), based on the algorithm of Gonzalez and Sahni [12], is presented for the flow shop makespan problem with release dates. Then an improved scheme is provided to boost the performance of the MGS algorithm. Moreover, a sequence-independent lower bound for the problem is presented. Computational experiments reveal the performance of the MGS algorithm, the improved scheme, and the lower bound on problems of different sizes.
The remainder of this paper is organized as follows. The problem is formulated in Section 2, and the MGS algorithm and the improved scheme are provided in Sections 3 and 4, respectively. The new lower bound and computational results are given in Section 5. The paper closes with conclusions in Section 6.

Problem Statement
In a flow shop problem, a set of n jobs has to be processed on m different machines in the same order. Job j, j = 1, 2, ..., n, is processed on machine M_i, i = 1, 2, ..., m, with a nonnegative processing time p(i, j), and has a release date r_j, which is the earliest time at which the job is permitted to start processing. Each machine can process at most one job, and each job can be handled by at most one machine, at any given time. The machines process the jobs in a first come, first served manner. Permutation schedules are considered in this paper, and the intermediate storage between successive machines is unlimited. The completion time of job j, j = 1, 2, ..., n, on machine M_i, i = 1, 2, ..., m, is denoted by C(i, j). The goal is to determine a job sequence that minimizes the makespan, that is, min max_{1≤j≤n} C(m, j).
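The completion times of a permutation schedule follow the standard flow shop recurrence, with the release date entering on the first machine. A minimal sketch (function and variable names are ours, not the paper's):

```python
def makespan(seq, r, p):
    """Compute C_max for a permutation `seq` in an m-machine flow shop
    with release dates. r[j] is the release date of job j; p[i][j] is
    the processing time of job j on machine i (0-indexed)."""
    m = len(p)
    # C[i] holds the completion time of the previously scheduled job on machine i
    C = [0] * m
    for j in seq:
        # a job cannot start on the first machine before it is released
        C[0] = max(C[0], r[j]) + p[0][j]
        for i in range(1, m):
            # start after the machine is free and the job leaves machine i-1
            C[i] = max(C[i], C[i - 1]) + p[i][j]
    return C[m - 1]

# Tiny two-machine illustration: job 1 is released at time 2
print(makespan([0, 1], [0, 2], [[3, 2], [1, 4]]))  # 9
```

The recurrence C(i, j) = max(C(i, j−1), C(i−1, j)) + p(i, j), with the release-date term on machine 1, is exactly the evaluation used for every schedule discussed below.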

The Modified GS Algorithm
Gonzalez and Sahni [12] presented the GS algorithm for the flow shop makespan problem. Based on its idea, a new heuristic named the modified GS (MGS) algorithm is presented to deal with the flow shop makespan problem with release dates. A formal description of the MGS algorithm follows.

The MGS Algorithm
Step 1. Divide the m machines into m − 1 groups.
Step 3. If all the jobs have been scheduled, go to Step 4; otherwise, wait until a job arrives and go to Step 2.
Step 4. Terminate the program and calculate the objective value C_max^k of each of the m − 1 schedules. Select the minimum one as the final solution, C_max = min_{1≤k≤m−1} C_max^k.
The flowchart of the algorithm is shown in Figure 1. An example is presented to show the execution of the MGS algorithm.
Example 1. A flow shop scheduling problem involves three machines, M_1, M_2, and M_3, and four jobs, J_1, J_2, J_3, and J_4, with release dates. The release dates and processing times of the jobs are listed below. The objective function of the problem is C_max. The final sequence of the MGS algorithm is {J_1, J_3, J_4, J_2}, and the objective value is C_max = 29. The scheduling process is shown in Figure 2.
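Since the dispatching details of Steps 2 and 3 are not reproduced above, only the overall GS pattern can be sketched: the original GS algorithm builds m − 1 candidate sequences, one per adjacent machine pair, by Johnson's rule, and keeps the best. A rough offline sketch in that spirit (this is a simplification, not the arrival-driven MGS itself):

```python
def gs_style_best_sequence(r, p):
    """GS-style sketch: apply Johnson's rule to each adjacent machine
    pair (M_k, M_{k+1}), evaluate the makespan of each resulting full
    schedule, and keep the best. r[j] is the release date of job j;
    p[i][j] is the processing time of job j on machine i."""
    m, n = len(p), len(p[0])

    def makespan(seq):
        C = [0] * m
        for j in seq:
            C[0] = max(C[0], r[j]) + p[0][j]
            for i in range(1, m):
                C[i] = max(C[i], C[i - 1]) + p[i][j]
        return C[-1]

    def johnson(pa, pb):
        # Johnson's rule on two processing-time vectors pa, pb
        first = sorted((j for j in range(n) if pa[j] <= pb[j]), key=lambda j: pa[j])
        last = sorted((j for j in range(n) if pa[j] > pb[j]), key=lambda j: -pb[j])
        return first + last

    # one candidate sequence per adjacent machine pair; pick the best
    candidates = [johnson(p[k], p[k + 1]) for k in range(m - 1)]
    return min(candidates, key=makespan)
```

The m − 1 candidate schedules here correspond to the m − 1 groups of Step 1 and the minimum selection of Step 4.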

The Improved Scheme
To further improve the quality of the solution for medium-scale problems, some properties of the two-machine flow shop problem are established; in each, when two adjacent jobs h and g satisfy the stated condition, the optimal sequence schedules job g before job h.
Property 3. For problem F2 | r_j | C_max, if two adjacent jobs h and g satisfy the stated condition, then in the optimal sequence job g is scheduled before job h.
With these properties, an improved scheme is provided to refine the original solution obtained by the MGS algorithm. In the formal description of the improved scheme, σ_{0,0}(i), i = 1, 2, ..., n, denotes the job in the ith position of the original sequence σ_{0,0}; c(i) denotes the number of comparisons in which job σ_{0,0}(i) is sequentially compared forward, from the ith job to a given job in a seed sequence; d denotes the number of comparisons from the current job to the end job; g denotes the number of groups; and Π_{k,i} denotes the sequence set generated by exchanging the ith job of σ_{0,0} with each different job in a seed sequence. The scheme can be summarized as follows.

Improved Scheme
Step 1. Generate the initial sequence σ_{0,0} with the MGS algorithm and calculate its objective value Z_{0,0}.
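The remaining steps of the scheme are not reproduced above, so only its general pattern can be illustrated: a local search that starts from the MGS seed and accepts job exchanges that reduce the makespan. A minimal sketch under that assumption (the group and comparison-counter bookkeeping of the full scheme is omitted):

```python
def local_search(seq, r, p):
    """Pairwise-exchange local search: try exchanging pairs of jobs,
    accept any exchange that strictly reduces the makespan, and stop
    when no exchange improves. Returns (sequence, makespan)."""
    def makespan(s):
        m = len(p)
        C = [0] * m
        for j in s:
            C[0] = max(C[0], r[j]) + p[0][j]
            for i in range(1, m):
                C[i] = max(C[i], C[i - 1]) + p[i][j]
        return C[-1]

    seq = list(seq)
    best = makespan(seq)
    improved = True
    while improved:
        improved = False
        for a in range(len(seq) - 1):
            for b in range(a + 1, len(seq)):
                seq[a], seq[b] = seq[b], seq[a]
                value = makespan(seq)
                if value < best:
                    best = value
                    improved = True
                else:
                    seq[a], seq[b] = seq[b], seq[a]  # undo the exchange
    return seq, best
```

For instance, seeding with a job that is released late at the front lets the search swap it backward and shrink the idle time on the first machine.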

Computational Results
In this section, a series of computational experiments is designed to reveal the performance of the proposed algorithm and the improved scheme on problems of different sizes. Ten different random tests were performed for each combination of the parameters, and the averages are reported. All the algorithms were coded in MATLAB R2012a and run on a PC with an Intel Core i7-2600 CPU (3.4 GHz × 4) and 4 GB RAM. The processing times of the jobs were randomly generated from a discrete uniform distribution on [1, 10] and from a discrete normal distribution with expectation 5.5 and variance 1.7².
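The two processing-time distributions can be generated as follows; a sketch with the parameters taken from the text (the rounding and clipping rules for the discrete normal are our assumptions, as the paper does not state them):

```python
import random

def uniform_times(m, n, lo=1, hi=10):
    """Processing times from a discrete uniform distribution on [lo, hi]."""
    return [[random.randint(lo, hi) for _ in range(n)] for _ in range(m)]

def normal_times(m, n, mu=5.5, sigma=1.7, lo=1, hi=10):
    """Processing times from a discretized normal(5.5, 1.7^2): each draw
    is rounded to the nearest integer and clipped to [lo, hi]
    (the clipping rule is an assumption)."""
    def draw():
        return min(hi, max(lo, round(random.gauss(mu, sigma))))
    return [[draw() for _ in range(n)] for _ in range(m)]
```

Both generators return an m × n matrix p with p[i][j] the time of job j on machine i, matching the notation of Section 2.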
To evaluate the performance of an algorithm for problem F | r_j | C_max, Bai et al. [13] presented a lower bound (LB1). However, the value of LB1 may sometimes exceed the optimal value because LB1 is sequence-dependent. Consider the following example.
Example 2. A two-machine flow shop scheduling problem involves two jobs, J_1 and J_2, with release dates. The release dates and processing times of the jobs are listed below. The objective of the problem is C_max. For the sequence {J_1, J_2}, the value of LB1 is calculated; the optimal schedule of the problem is {J_2, J_1}, with the associated optimal value. Obviously, LB1 > C_max(OPT).
To guarantee that the lower bound value does not exceed the optimal value, a new lower bound (LB2) is provided. The last two terms in LB2 guarantee that the lower bound is independent of the job sequence.

Theorem 3. Let the processing times p(i, j), i = 1, 2, ..., m, j = 1, 2, ..., n, be independent random variables having the same continuous distribution with bounded density defined on (0, 1]. Then, for every i, i = 1, 2, ..., m, the stated limit holds with probability one.

Proof. Without loss of generality, dividing both sides of (28) by n and taking the limit yields (29). Bai et al. [13] proved the limit (30). Combining (29) and (30) yields the result of Theorem 3.
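The formula for LB2 is not reproduced above, so the following is only a classical sequence-independent machine-load bound in the same spirit, not the paper's LB2: for each machine, no job can reach it before the smallest release date plus head work, the machine must process every job, and its last job still needs its tail work afterwards.

```python
def machine_load_lower_bound(r, p):
    """Classical sequence-independent lower bound for Fm | r_j | C_max.
    For machine i: earliest possible start (release date + work on
    machines before i, minimized over jobs) + total load on machine i
    + remaining work after machine i (minimized over jobs)."""
    m, n = len(p), len(p[0])
    best = 0
    for i in range(m):
        head = min(r[j] + sum(p[k][j] for k in range(i)) for j in range(n))
        load = sum(p[i][j] for j in range(n))
        tail = min(sum(p[k][j] for k in range(i + 1, m)) for j in range(n))
        best = max(best, head + load + tail)
    return best

# Two machines, two jobs; the bound never exceeds the optimal makespan
print(machine_load_lower_bound([0, 2], [[3, 2], [1, 4]]))  # 8
```

Each of the three terms is independent of the job order, which is the property the paper requires of LB2.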
The value of LB2 for Example 2 is calculated accordingly.

5.1. Tests for the MGS Algorithm. Several numerical tests are conducted to reveal the effectiveness of the MGS algorithm. Three, five, and ten machines and 50, 100, 200, 500, and 1000 jobs are tested, respectively. The release dates are drawn from a discrete uniform distribution on [1, κn], where n is the number of jobs and κ is a multiplier taking the values 1, 2, 5, and 8. The DSJF heuristic algorithm presented by Bai and Tang [14] is used as a reference for comparison. First, in Tables 1 and 2, we compare the performance of the MGS algorithm and the DSJF heuristic using the mean relative percentage (Z_MGS − Z_DSJF)/Z_DSJF × 100%, where Z_MGS is the objective value of the MGS algorithm and Z_DSJF is the objective value of the DSJF heuristic.
In Tables 1 and 2, the data show that the performance of the two algorithms depends on the multiplier κ. To further determine the dominance of the two algorithms, in Tables 3 and 4 we perform the following experiments with the mean relative percentage t/20 × 100%, where t denotes the number of times the DSJF heuristic dominates the MGS algorithm. The results in Tables 3 and 4 indicate that the DSJF heuristic completely dominates the MGS algorithm when κ = 1, and the opposite holds when κ = 8. Therefore, in these two cases, it is more practical to obtain the near-optimal solution directly with the appropriate algorithm. To demonstrate the asymptotic optimality of the MGS algorithm, we compare its objective value with the associated value of LB2. The mean relative percentage (Z_A1 − Z_LB2)/Z_LB2 × 100% is employed, where Z_A1 is the objective value of the MGS algorithm and Z_LB2 is the value of LB2.

Figure 1: The flowchart of the MGS algorithm.

Table 6: Asymptotic performance tests for MGS (normal distribution, %).