A VNS Metaheuristic with Stochastic Steps for Max 3-Cut and Max 3-Section

A heuristic algorithm based on variable neighborhood search (VNS) is proposed to solve the Max 3-cut and Max 3-section problems. By establishing a neighborhood structure for the Max 3-cut problem, we propose a local search algorithm and a variable neighborhood global search algorithm with two stochastic search steps to obtain an approximate global solution. We give numerical results and comparisons with the well-known 0.836-approximate algorithm. The numerical results show that the proposed heuristic efficiently obtains high-quality solutions and has better numerical performance than the 0.836-approximate algorithm on the NP-hard Max 3-cut and Max 3-section problems.


Introduction
Given a graph G = (V, E) with node set V and edge set E, the Max 3-cut problem is to find a partition S_0, S_1, S_2 of the set V, with S_0 ∪ S_1 ∪ S_2 = V and S_i ∩ S_j = ∅ (i ≠ j), such that the sum of the weights of the edges connecting different parts is maximized. Like the Max cut problem, the Max 3-cut problem has long been known to be NP-complete [1], even for unweighted graphs [2], and it also has applications in circuit layout design, statistical physics, and so on [3]. However, due to the complexity of this problem, research on it has progressed much more slowly than on the Max cut problem. Based on the semidefinite programming relaxation proposed by Goemans and Williamson [4], Frieze and Jerrum [5] obtained a 0.800217-approximation algorithm for the Max 3-cut problem. Recently, Goemans and Williamson [6] and Zhang and Huang [7] improved Frieze and Jerrum's 0.800217 approximation ratio to 0.836 using a complex semidefinite programming relaxation of the Max 3-cut problem.
For the purpose of our analysis, we first introduce some notation. We denote the complex conjugate of y = a + ib by ȳ = a − ib, where i = √−1 is the imaginary unit, and the real and imaginary parts of a complex number by Re(·) and Im(·), respectively. For an n-dimensional complex vector y ∈ C^n (written in bold) and an n × n complex matrix Y ∈ C^{n×n}, we write y* and Y* to denote their conjugate transposes; that is, y* = ȳ^T and Y* = Ȳ^T. The set of n × n real symmetric positive semidefinite matrices and the set of n × n complex Hermitian positive semidefinite matrices are denoted by S_+^n and H_+^n, respectively. We sometimes write A ⪰ 0 to indicate A ∈ S_+^n or A ∈ H_+^n. For any two complex vectors u, v ∈ C^n, we write ⟨u, v⟩ = u • v = u*v for their inner product. For any two complex matrices A, B ∈ H^n, we write ⟨A, B⟩ = A • B = Tr(B*A) = Σ_{i,j} b̄_{ij} a_{ij} for their inner product, where A = (a_{ij}) and B = (b_{ij}). |·| denotes the modulus of a complex number, the 2-norm of a complex vector, or the Frobenius norm of a complex matrix.
By relaxing each complex variable y_i into an n-dimensional complex vector y_i, we get a complex semidefinite programming (CSDP) relaxation of (M3C), where e_i denotes the vector with zeros everywhere except for a unit in the ith component. To get an approximate solution of (M3C), Goemans and Williamson [6] do not solve the CSDP directly, but instead solve an equivalent real SDP (RSDP). Although some software packages, such as SeDuMi [8] and the earlier versions of SDPT3-4.0 [9], can deal with SDPs with complex data, this does not reduce the dimension of the problem. Here Q = (1/3)(diag(We) − W) is the Laplacian matrix of the given graph, and O is the n × n all-zeros matrix.
In (RSDP), the first, third, and fourth classes of equality constraints ensure that X_ii = 1, i = 1, 2, ..., n, and that X has the stated block form. The final two classes of equality constraints ensure that S_ii = 0, i = 1, ..., n, and that S is a skew-symmetric matrix.

If X is an optimal solution of (RSDP), then the complex matrix Y = R + iS is an optimal solution of (CSDP). One can then randomly generate a complex vector ξ such that ξ ∼ N(0, Y) and round it, where Arg(·) ∈ [0, 2π) denotes the principal value of the argument of a complex number. Goemans and Williamson [6] verified this (see also Zhang and Huang [7]). The algorithm proposed by Goemans and Williamson [6] attains a very good approximation ratio, and (RSDP) can be solved by an interior point algorithm, but the 0.836-approximate algorithm is not practical for numerical study of the Max 3-cut problem. From (RSDP), one can see that for a graph with n nodes, (RSDP) has 2n + 5n(n − 1)/2 constraints and 3n(n − 1)/2 slack variables arising from the inequality constraints. That is to say, (RSDP) has a 2n-dimensional unknown symmetric positive semidefinite matrix variable, a 3n(n − 1)/2-dimensional unknown vector variable, and 2n + 5n(n − 1)/2 constraints, and many of its constraint matrices, although sparse, have no explicit block diagonal structure. For instance, when n = 100, (RSDP) becomes a very-high-dimensional semidefinite programming problem with 14850 slack variables and 24950 constraints. Moreover, graphs with 50 to 100 nodes are only medium-scale instances of the Max 3-cut problem. Hence, solving such an RSDP relaxation of (M3C) with any existing SDP software is very time consuming. This means that the 0.836-approximate algorithm is not suitable for a computational study of the Max 3-cut problem. This limitation of solving (M3C) via the CSDP or RSDP relaxation motivates us to find a new efficient and fast algorithm for practical purposes.
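The rounding step just described can be sketched in code. The following is a minimal sketch, not the authors' implementation; the function name round_csdp_solution is ours. It assumes Y is the Hermitian positive semidefinite optimal matrix of the CSDP, samples ξ ∼ N(0, Y) via an eigenvalue factorization, and assigns vertex i to the part indexed by which third of [0, 2π) contains Arg(ξ_i).

```python
import numpy as np

def round_csdp_solution(Y, rng=None):
    """Sketch of the random rounding described above (name is ours).

    Y is assumed to be the Hermitian positive semidefinite optimal
    matrix of the CSDP relaxation.  We sample xi ~ N(0, Y) and assign
    vertex i to part 0, 1, or 2 according to which third of [0, 2*pi)
    the principal angle Arg(xi_i) falls into.
    """
    rng = np.random.default_rng(rng)
    n = Y.shape[0]
    # Factor Y = L L^*; eigh tolerates a merely semidefinite Y.
    w, V = np.linalg.eigh(Y)
    L = V * np.sqrt(np.clip(w, 0.0, None))
    # Standard complex Gaussian with identity covariance.
    g = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    xi = L @ g
    angles = np.mod(np.angle(xi), 2 * np.pi)   # Arg(.) taken in [0, 2*pi)
    parts = (angles // (2 * np.pi / 3)).astype(int)
    return np.minimum(parts, 2)                # guard against rounding to 3
```

With Y = I (no correlation between vertices) this reduces to assigning each vertex a uniformly random part.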
In the current paper, we first establish the definition of a K-neighborhood structure for the Max 3-cut problem and design a local search algorithm to find a local maximizer. We then propose a variable neighborhood search (VNS) metaheuristic with stochastic steps, in the spirit of the VNS originally introduced by Mladenović and Hansen [10], by which we can efficiently find a high-quality approximate global solution of the Max 3-cut problem. Further, by combining it with a greedy algorithm, we extend the proposed algorithm to the Max 3-section problem. To the best of our knowledge, this is the first computational study of the Max 3-cut problem. In order to test the performance of the proposed algorithm, we compare its numerical results with Goemans and Williamson's 0.836-approximate algorithm.
This paper is organized as follows. In Section 2, we give some definitions and lemmas. In Section 3, we present the VNS metaheuristic for solving the Max 3-cut problem. The VNS is extended to the Max 3-section problem in Section 4. In Section 5, we give some numerical results and comparisons.

Preliminaries
In this section, we establish some definitions and record some facts for later use. For the third roots of unity, 1, ω, ω², we have the fact (2.1). Denote S = {1, ω, ω²}^n. Then, based on (2.1), for any y ∈ S we may define a K-neighborhood of y as follows.
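The displayed fact (2.1) does not survive in our copy; the standard identity behind the Max 3-cut formulation is that, for cube roots of unity a and b, the quantity (2/3)(1 − Re(a·b̄)) equals 0 when a = b and 1 when a ≠ b, so it indicates whether an edge is cut. A quick numerical check:

```python
import cmath

# The three cube roots of unity.
omega = cmath.exp(2j * cmath.pi / 3)
roots = [1, omega, omega ** 2]

# Check: (2/3) * (1 - Re(a * conj(b))) is 0 when a == b and 1 otherwise,
# so summing it over weighted edges counts the cut weight.
for a in roots:
    for b in roots:
        val = (2.0 / 3.0) * (1 - (a * b.conjugate()).real)
        expected = 0.0 if abs(a - b) < 1e-9 else 1.0
        assert abs(val - expected) < 1e-9
print("identity holds")
```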
Definition 2.1. For any y ∈ S and any positive integer K (1 ≤ K ≤ n), one defines the K-neighborhood of y, denoted N_K(y), as the set of all z ∈ S that differ from y in at most K components. In particular, if K = 1, we write the 1-neighborhood N_1(y) of y simply as N(y).
The boundary ∂N_K(y) of the K-neighborhood N_K(y) is defined as the set of those z ∈ S that differ from y in exactly K components. By a straightforward computation, one can count the number of elements of N_K(y). Definition 2.3. For any u ∈ {0, 1, 2}, define two maps τ_1, τ_2 from {1, ω, ω²} to itself by τ_1(ω^u) = ω^{u+1} and τ_2(ω^u) = ω^{u+2}, with exponents taken modulo 3. Clearly, for any u ∈ {0, 1, 2}, τ_i(ω^u) ≠ ω^u, i = 1, 2, and τ_1(ω^u) ≠ τ_2(ω^u). By Definition 2.3, for any z ∈ N(y) with z ≠ y there exists a unique component z_k of z such that z_k ≠ y_k and either z_k = τ_1(y_k) or z_k = τ_2(y_k), while all other components of z and y agree. For simplicity, for any z ∈ N(y) with z_k ≠ y_k and z_i = y_i (i = 1, ..., n, i ≠ k), we write z = τ_1^k(y) or z = τ_2^k(y) according as z_k = τ_1(y_k) or z_k = τ_2(y_k). By Definitions 2.1 and 2.3, for any y ∈ S, we can construct its 1-neighborhood points using the maps of Definition 2.3; that is, we have the following result.

Local Search Algorithm
Let y⁰ = (y⁰_1, ..., y⁰_n)^T ∈ S be a feasible solution of problem (M3C). If y⁰ is not a local maximizer of f, then we may find ȳ ∈ N(y⁰) such that f(ȳ) = max{f(y) : y ∈ N(y⁰)}. Clearly, f(ȳ) ≥ f(y⁰). If ȳ is still not a local maximizer of f, we replace y⁰ with ȳ and repeat the process until a point ȳ satisfying f(ȳ) = max{f(y) : y ∈ N(ȳ)} is found, which indicates that ȳ is a local maximizer of f.
For any positive integer k, let δ_k denote the change in the objective value caused by moving from y⁰ to a 1-neighbor that differs in the kth component. Then we have the following result, whose proof is clear.

Based on Lemma 3.1, if we know the value f(y⁰), then we can obtain the value f(y^k) at the next iterate y^k by computing δ_k via (3.3), instead of evaluating f(y^k) directly, which sharply reduces the computational cost. By Definition 2.1, for fixed k there exist two points y^k ∈ N(y⁰) satisfying (3.1). For LSM3C, one has the following.
(1) Input any initial feasible solution y⁰ of problem (M3C).
(3) Find δ_{i*}^{k*}, the smallest of the values δ_i^k over all components k and i ∈ {1, 2}.
(4) If δ_{i*}^{k*} ≥ 0, then set ȳ = y⁰, return ȳ, and stop. Otherwise, go to the next step.
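A compact sketch of the local search follows. This is our own reconstruction of LSM3C, with hypothetical names cut_value and lsm3c: labels u[i] ∈ {0, 1, 2} encode y_i = ω^{u_i}, each candidate move applies τ_1 or τ_2 to one component, and its gain is computed from the single row of W incident to that vertex, in the spirit of the δ_k update of (3.3). W is assumed symmetric with a zero diagonal.

```python
import numpy as np

def cut_value(W, u):
    """f(y): total weight of edges whose endpoints have different labels."""
    return W[u[:, None] != u[None, :]].sum() / 2.0

def lsm3c(W, u):
    """Best-improvement local search over 1-neighborhoods (a sketch).

    u[i] in {0, 1, 2} encodes y_i = omega**u[i]; the moves tau_1, tau_2
    shift one label by 1 or 2 (mod 3).  Only edges incident to the
    changed vertex matter, so each gain is read off one row of W, in
    the spirit of the delta_k update.  W is assumed symmetric with a
    zero diagonal.
    """
    u = u.copy()
    improved = True
    while improved:
        improved = False
        best_gain, best_move = 0.0, None
        for k in range(len(u)):
            cur = W[k][u != u[k]].sum()          # cut weight at vertex k
            for s in (1, 2):
                lab = (u[k] + s) % 3
                gain = W[k][u != lab].sum() - cur
                if gain > best_gain:
                    best_gain, best_move = gain, (k, lab)
        if best_move is not None:
            k, lab = best_move
            u[k] = lab
            improved = True
    return u
```

Since f is bounded above and strictly increases at every accepted move, the loop terminates at a 1-neighborhood local maximizer.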

Variable Neighborhood Stochastic Search
Let ȳ be a local maximizer obtained by LSM3C and K_max (1 < K_max ≤ n) a fixed positive integer. We now describe the variable neighborhood search (VNS) with stochastic steps, by which we can find an approximate global maximizer of problem (M3C). The proposed VNS algorithm has three phases. First, for a given positive integer K ≤ K_max, a K-neighborhood point y′ is randomly selected; that is, y′ ∈ N_K(ȳ). Next, a solution y″ is obtained by applying algorithm LSM3C to y′. Finally, the current solution jumps from ȳ to y″ if the latter improves the former; otherwise, the order K of the neighborhood is increased by one when K < K_max, and the above steps are repeated until a stopping condition is met. This VNS, also called k-max in [11] and denoted VNS-k below, can be stated as follows.
For VNS-k, one has the following.
(1) Arbitrarily choose a point y⁰ ∈ S, run LSM3C starting from y⁰, and denote the obtained local maximizer by ȳ. Set K = 1.
(2) Randomly take a point y′ ∈ ∂N_{I(K)}(ȳ), run LSM3C again from y′, and denote the obtained new local maximizer by y″.
(3) If f(y″) > f(ȳ), then set ȳ = y″ and K = 1; go to Step 2.
(4) If K < K_max (≤ n), set K = K + 1 and go to Step 2. Otherwise, return ȳ as an approximate global solution of problem (M3C) and stop.
The subscript I(K) in Step 2 is a function of K and is also a positive integer not greater than n. I(K) embodies the key trick of switching from the current neighborhood of the local maximizer ȳ to another neighborhood of ȳ. For a given K_max, let m = ⌊n/K_max⌋ and K₀ = n − mK_max, where ⌊a⌋ denotes the integer part of a. We divide the n neighborhoods N(ȳ), N_2(ȳ), ..., N_K(ȳ), ..., N_n(ȳ) of ȳ into K_max neighborhood blocks N_{I(1)}(ȳ), ..., N_{I(K_max)}(ȳ), such that (3.5) holds for K = 1, 2, ..., K_max − K₀. In order to obtain the K_max neighborhood blocks N_{I(K)}(ȳ), K = 1, ..., K_max, we divide the set {1, 2, ..., n} into K_max disjoint subsets, where each of the first K_max − K₀ subsets has m integers and each of the last K₀ subsets has m + 1 integers. For any integer K ≤ K_max, let the block index be given by (3.8). Then we can randomly choose a point y′ ∈ ∂N_{I(K)}(ȳ), where c ∈ (0, 1) is a random number from the uniform distribution U(0, 1), such that N_{I(K)}(ȳ) satisfies (3.5) or (3.6). VNS-k stops when the maximum neighborhood block is reached. Additionally, we consider another termination criterion for VNS based on a maximum CPU time, denoted VNS-t. VNS-t can obtain a better solution than VNS-k, since it effectively runs VNS-k several times within the maximum allowed time t_max, but it generally spends more computational time. VNS-t can be stated as follows.
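The block construction can be made concrete. Since the displayed formulas (3.5)-(3.8) are garbled in our copy, the indexing below is an assumption consistent with the stated block sizes m and m + 1; the function name block_radius is ours.

```python
import random

def block_radius(n, k, k_max, rng=None):
    """Draw a shake radius I(K) from the k-th neighborhood block (a sketch).

    With m = n // k_max and K0 = n - m * k_max, the first k_max - K0
    blocks contain m consecutive radii each and the last K0 blocks
    contain m + 1 each; a uniform draw then picks one radius in block k.
    """
    rng = rng or random.Random()
    m = n // k_max
    k0 = n - m * k_max
    small = k_max - k0                       # number of blocks of size m
    if k <= small:
        start, size = (k - 1) * m + 1, m
    else:
        start = small * m + (k - 1 - small) * (m + 1) + 1
        size = m + 1
    return start + rng.randrange(size)
```

For example, with n = 10 and k_max = 3 the radii split into blocks {1, 2, 3}, {4, 5, 6}, {7, 8, 9, 10}, so the blocks cover 1, ..., n exactly once.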
For VNS-t, one has the following.
(1) Set t_CPU = 0, run VNS-k from an arbitrary initial point y⁰ ∈ S, and let ȳ denote the local optimal solution obtained.

We mention that our scheme differs from the classical variable neighborhood search metaheuristic originally proposed by Mladenović and Hansen [10]. In order to obtain a global optimal solution or a high-quality approximate solution of problem (M3C), we use two stochastic steps in VNS. First, for a fixed K, a K-neighbor of ȳ is chosen randomly. Second, by the definition of I(K), when we change the neighborhood of ȳ from N_{I(K−1)} to N_{I(K)}, N_{I(K)} may be any of the neighborhoods N_{(K−1)m+j}(ȳ), j = 1, 2, ..., m, as decided by the random number c. In VNS, the positive integer K_max determines the largest neighborhood block of ȳ that is searched, and hence directly determines the CPU time of VNS. Thanks to the second stochastic step, we may choose a K_max that is relatively small compared with n, which decreases the computational time.

A Greedy Algorithm for Max 3-Section
When the number of nodes n is a multiple of three and the condition |S_0| = |S_1| = |S_2| = n/3 is required, the Max 3-cut problem becomes the Max 3-section problem. Noticing that 1 + ω + ω² = 0, the Max 3-section problem can be formulated as the programming problem (M3S), with a CSDP relaxation in which e is the column vector of all ones. Andersson [12] extended Frieze and Jerrum's random rounding method to (M3S) and obtained a (2/3 + O(1/n³))-approximate algorithm, which is the current best approximation ratio for (M3S); see also the recent research of Gaur et al. [13]. The author of the current paper considered a special Max 3-section problem and obtained a 0.6733-approximate algorithm; see Ling [14].
Clearly, the feasible region of problem (M3S) is a subset of S, and the optimal value of problem (M3S) is not greater than that of problem (M3C). Assume that we have obtained a global optimal solution or a high-quality approximate solution ȳ of problem (M3C). Clearly, ȳ may not satisfy the condition Σ_{i=1}^n ȳ_i = 0. But we may adjust ȳ into a new feasible solution y^s by a greedy algorithm, such that y^s satisfies Σ_{i=1}^n y^s_i = 0. This is the motivation for the greedy algorithm we propose for the Max 3-section problem.
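The adjustment idea can be illustrated with a single generic greedy rule. This is a simplification of the case-by-case SAGA procedures developed below, not the paper's exact method; the function name rebalance is ours. While some part exceeds n/3, it moves the vertex whose relabeling into the smallest part loses the least cut weight.

```python
import numpy as np

def rebalance(W, u):
    """Greedy size adjustment toward |S_0| = |S_1| = |S_2| = n/3 (a sketch).

    While some part is oversized, move the vertex whose relabeling into
    the smallest part loses the least cut weight.  W is assumed
    symmetric with a zero diagonal and len(u) a multiple of three.
    """
    u = u.copy()
    target = len(u) // 3
    sizes = np.bincount(u, minlength=3)
    while sizes.max() > target:
        big, small = int(sizes.argmax()), int(sizes.argmin())
        best_cost, best_k = None, None
        for k in np.flatnonzero(u == big):
            # Cut weight lost if vertex k moves from part big to part small.
            cost = W[k][u != big].sum() - W[k][u != small].sum()
            if best_cost is None or cost < best_cost:
                best_cost, best_k = cost, int(k)
        u[best_k] = small
        sizes = np.bincount(u, minlength=3)
    return u
```

Each pass strictly reduces the total imbalance Σ_k | |S_k| − n/3 |, so the loop terminates with a balanced 3-section.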
For the sake of our analysis, and without loss of generality, we assume that the local maximizer ȳ satisfies a fixed ordering of the part sizes. Multiplying every component of ȳ by a fixed unit w_k (w_k ≠ 0, k = 1, 2) yields a new solution y^N = (y^N_1, ..., y^N_n) that does not change the objective value, since f(y^N) = f(ȳ); moreover, the new partition only permutes the roles of S_0, S_1, S_2. The size-adjusting greedy algorithms for Cases 3 and 4 are similar to those for Cases 1 and 2. Hence, we mainly consider Cases 1 and 2 when adjusting the partition of V from S = {S_0, S_1, S_2} to S̃ = {S̃_0, S̃_1, S̃_2} such that |S̃_k| = n/3, k = 0, 1, 2. Denote the quantities needed below as in (4.3).

Then it follows by a simple computation that the required relations hold. In what follows, we describe the size-adjusting greedy algorithms (SAGAs) for Cases 1 and 2, denoting the greedy algorithms for the two cases by SAGA1 and SAGA2, respectively.
For SAGA1, one has the following.

and recompute the move costs, where q₁ = n/3 − |S_1|; update the partition accordingly. Then recompute once more, where q₂ = n/3 − |S_2|, and update again. (4) Return the current partition S̃ = {S̃_0, S̃_1, S̃_2}; stop.

Numerical Results
This section reports experimental results obtained for instances of the Max 3-cut and Max 3-section problems using the proposed VNS metaheuristic. We also show a quantitative comparison with the 0.836-approximate algorithm. The computational experiments were performed on an Intel Pentium 4 processor at 2.0 GHz with 512 MB of RAM, and all algorithms were coded in Matlab. Because the RSDP relaxation of (M3C) includes many slack variables, many constraints, and matrix variables without a block diagonal structure, in our numerical comparisons we choose SDPT3-4.0 [9], one of the best-known solvers for semidefinite programming, to solve the RSDP relaxation of (M3C).
All our test problems are generated randomly in the following way. Let p ∈ (0, 1) be a constant and r ∈ (0, 1) a random number. If r ≤ p, then there is an edge between nodes i and j with weight w_ij, a random integer between 1 and 10; otherwise w_ij = 0, that is, there is no edge between nodes i and j. Because of the memory limits of SDPT3, when n > 200, (RSDP) becomes a huge semidefinite programming problem with no fewer than 59700 slack variables and 99900 constraints and runs out of memory in SDPT3. Hence, in the numerical experiments, we consider 30 instances with p = 0.1, 0.3, 0.6 and n varying from 20 to 200.
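The generation scheme reads directly as code; the following is a sketch, with parameter names of our own choosing:

```python
import random

def random_instance(n, p, lo=1, hi=10, seed=None):
    """Random weighted graph as described above (a sketch; names ours):
    each pair (i, j), i < j, gets an edge with probability p and an
    integer weight drawn uniformly from {lo, ..., hi}; otherwise 0."""
    rng = random.Random(seed)
    W = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() <= p:
                W[i][j] = W[j][i] = rng.randint(lo, hi)
    return W
```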
Firstly, we check the influence of K_max on the quality of the solution obtained by VNS-k. For a given graph, we take K_max = 3, 5, 10, 15, 30; Table 1 presents the results. In the first column of this table and the following tables, Wn.p means a graph randomly generated with n nodes and density p; for instance, W30.6 denotes a graph generated randomly with n = 30 and p = 0.6. We find from Table 1 that the influence of K_max on the objective value (denoted Obj in Table 1) is slight when K_max > 5, but the CPU time increases sharply as K_max increases. This result is actually not surprising. Indeed, because I(K) > K, we choose

Conclusions
A variable neighborhood stochastic metaheuristic was proposed in this paper to solve the Max 3-cut and Max 3-section problems. Our algorithms can solve Max 3-cut and Max 3-section problems of various sizes and densities, whereas the 0.836-approximate algorithm, despite its strong theoretical guarantee, is far less practical numerically.

[Displaced from Section 3:] If (3.1) is satisfied, then either y^k = τ_1^k(y⁰) or y^k = τ_2^k(y⁰). For our convenience, we denote δ_k by δ_k¹ when y^k = τ_1^k(y⁰) and by δ_k² when y^k = τ_2^k(y⁰). In what follows, we describe the local search algorithm for the Max 3-cut problem, denoted LSM3C; by this algorithm, we can get a local maximizer of the function f(y) over S.

(2) If K = K_max (≤ n), go to Step 3.
(3) If t_CPU < t_max, then set K = 1 and go to Step 2 of VNS-k. Otherwise, return ȳ as an approximate global solution of problem (M3C) and stop.
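VNS-t is thus a timed restart loop around VNS-k. A minimal sketch follows, in which run_vns_k is a hypothetical stand-in for one full VNS-k run returning its best value and solution:

```python
import time

def vns_t(run_vns_k, t_max):
    """Restart VNS-k until the time budget t_max (seconds) is spent,
    keeping the best value found (a sketch).  run_vns_k is assumed to
    return a (value, solution) pair for one complete VNS-k run."""
    start = time.monotonic()
    best_val, best_sol = float("-inf"), None
    while time.monotonic() - start < t_max:  # budget checked per restart,
        val, sol = run_vns_k()               # so real time may exceed t_max
        if val > best_val:
            best_val, best_sol = val, sol
    return best_val, best_sol
```

The budget is only checked between restarts, which matches the observation in Section 5 that the real CPU time of VNS-t can exceed t_max.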

Table 1: The objective value obtained by VNS for M3C with different K_max.

the random point in ∂N_{I(K)}(ȳ) rather than in ∂N_K(ȳ). This avoids choosing too large a K_max, which would cost more CPU time. Hence, in the subsequent numerical comparisons, we fix K_max = 5 for all test problems. Secondly, we compare the VNS (VNS-k, VNS-t) metaheuristic with the 0.836-approximate algorithm on all test problems. To avoid the effect of initial points, for each test problem, after (RSDP) is solved, we run the rounding procedure of the 0.836-approximate algorithm and the VNS metaheuristic ten times each. Table 2 gives the numerical comparisons. In the numerical presentation of

Table 2, Obj_rsdp is the optimal value of problem (RSDP); that is, it is an upper bound for (M3C). Obj_GM is the largest value obtained by the 0.836-approximate algorithm in the ten runs, and Obj_vns is the largest value obtained by VNS for (M3C) in the ten runs. m and s.v. are the numbers of constraints and slack variables (s.v.), respectively. t_GM and t_vns-k are the average times (in seconds) of the two algorithms over the ten runs. For the maximum CPU time of VNS-t, we take t_max = 2t_vns-k, though the real CPU time of VNS-t may exceed t_max. Additionally, to measure solution quality, we take ρ = (Obj_vns − Obj_rsdp)/Obj_rsdp. Clearly, ρ reflects how close the solution obtained by VNS is to the optimal value of (RSDP). One can see from Table 2 that (1) the VNS metaheuristic not only obtains a better solution than the 0.836-approximate algorithm for all test problems, but its elapsed CPU time is also much less than that of the 0.836-approximate algorithm on all test problems; and (2) the solution quality can be improved by VNS-t for most test problems when the termination criterion of VNS is based on maximum CPU time, though VNS-t spends more computational time than VNS-k. The improvement is reflected by ∇ρ = ρ_t − ρ_k in the final column of Table 2. On average, VNS-t improves by 0.91 percentage points. Finally, we consider solving (M3S) by combining VNS-k with the greedy size-adjusting algorithm (SAGA) stated in Section 4. Let ȳ be an approximate solution of (M3C) obtained by VNS; we then obtain an approximate solution of (M3S) from SAGA. The numerical results are reported in Table 3, in which Obj_vns+saga stands for the largest value obtained by VNS-k plus SAGA for (M3S). Although our size-adjusting algorithm may decrease the objective value obtained by VNS, the changes in objective value are very slight, as seen from

Table 2: The numerical comparisons of the 0.836-approximate algorithm with the VNS metaheuristic.

Table 3. In particular, the objective values of some problems do not decrease but instead increase, such as for W150.3. We do not compare the obtained results with Andersson's 2/3-approximate algorithm, because we find that all approximate solutions of (M3S) obtained by VNS plus SAGA are still better than those of the 0.836-approximate algorithm, with the only exceptions of W30.1 and W30.3.