
Optimization problems with second-order cone constraints (SOCs) can be solved efficiently by interior point methods. For some of these methods to get started, or to converge faster, it is important to have an initial feasible or near-feasible point. In this paper, we study and apply Chinneck's

We consider a system of second-order cone inequalities as follows:

Second-order cones (SOCs) are important to study because many optimization problems have constraints that can be written in this form. For instance, SOCs can easily represent problems involving norms, hyperbolic constraints, and robust linear programming. There are also many real-world applications in areas such as antenna array weight design, grasping force optimization, and portfolio optimization (see [
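The standard form of the SOC constraints is elided above; the sketches in this paper's discussion assume the common form ||Ax + b|| <= c^T x + d. A minimal check of whether a point satisfies such a constraint, and by how much it is violated, might look like:

```python
import math

def soc_violation(A, b, c, d, x):
    """Violation of the SOC constraint ||A x + b|| <= c.x + d.

    Returns 0.0 when x satisfies the constraint, and the positive
    amount ||A x + b|| - (c.x + d) otherwise.  A is a list of rows.
    """
    Ax_b = [sum(aij * xj for aij, xj in zip(row, x)) + bi
            for row, bi in zip(A, b)]
    lhs = math.sqrt(sum(v * v for v in Ax_b))
    rhs = sum(ci * xi for ci, xi in zip(c, x)) + d
    return max(0.0, lhs - rhs)

# The unit second-order ("ice cream") cone in R^3: ||(x1, x2)|| <= x3
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
b = [0.0, 0.0]
c = [0.0, 0.0, 1.0]
d = 0.0

print(soc_violation(A, b, c, d, [0.3, 0.4, 1.0]))  # feasible -> 0.0
print(soc_violation(A, b, c, d, [3.0, 4.0, 1.0]))  # infeasible -> 4.0
```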

In this paper, we will describe two of Chinneck's constraint consensus algorithms and apply them to SOCs to find near-feasible points. These are the

We also study how our work for SOCs applies to the special case of convex quadratic constraints (CQCs). More information on the importance of CQCs in the field of optimization can be found in [

In this section, we study and apply Chinneck's

The first constraint consensus method, hereafter called

INPUTS: set of

a movement tolerance

Calculate feasibility vector

exit successfully

exit unsuccessfully

The feasibility vectors are then combined to create a single
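The algorithm listing above is fragmentary, so as a hedged sketch of the basic idea: with constraints written as g_i(x) >= 0, each violated constraint proposes the feasibility vector (v_i / ||grad g_i||^2) grad g_i (the Newton-like step to the linearized constraint boundary), and the consensus vector averages the proposals, per component, over the constraints that involve that component:

```python
def consensus_step(grads, violations):
    """One basic constraint-consensus update (a sketch, not the
    paper's exact listing).

    grads[i]      -- gradient of violated constraint g_i at the current point
    violations[i] -- positive violation v_i = -g_i(x) of that constraint
    """
    n = len(grads[0])          # assumes at least one violated constraint
    sums = [0.0] * n
    counts = [0] * n
    for g, v in zip(grads, violations):
        norm_sq = sum(gj * gj for gj in g)
        if norm_sq == 0.0:     # gradient vanished: skip this constraint
            continue
        scale = v / norm_sq    # feasibility vector fv = scale * g
        for j, gj in enumerate(g):
            if gj != 0.0:      # only constraints involving variable j vote
                sums[j] += scale * gj
                counts[j] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Two violated constraints x1 >= 0 and x2 >= 0 at x = (-2, -3):
t = consensus_step([[1.0, 0.0], [0.0, 1.0]], [2.0, 3.0])
print(t)  # [2.0, 3.0]: each component averages the moves that involve it
```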

Ibrahim et al. gave several modifications of the

INPUTS: set of

a movement tolerance

Calculate feasibility vector

exit successfully

exit unsuccessfully

For a more specific example, we consider a system of 3 SOCs in

The feasible region

A system of

(1)

(2)

(3)

where

The contours and graph of

The following theorem is well known. An elementary proof can be given using the triangle inequality as shown. It will be used to discuss the convergence of the

For convenience, we will consider two separate functions,

By the triangle inequality, we have that
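With the SOC written as g(x) = c^T x + d - ||Ax + b|| (the form these sketches assume), the triangle-inequality step can be spelled out: h(x) = ||Ax + b|| is convex, since for 0 <= lambda <= 1,

```latex
\begin{aligned}
h(\lambda x + (1-\lambda)y)
  &= \bigl\|\lambda(Ax+b) + (1-\lambda)(Ay+b)\bigr\| \\
  &\le \lambda\|Ax+b\| + (1-\lambda)\|Ay+b\|
   = \lambda\, h(x) + (1-\lambda)\, h(y),
\end{aligned}
```

and therefore g, an affine function minus the convex function h, is concave.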

Projection algorithms such as the

The main task in adapting the consensus methods to SOCs is computing the gradient

When calculating the gradient, two potential problems arise. First, the gradient may fail to exist. When
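For the concave form g(x) = c^T x + d - ||Ax + b||, the gradient where it exists is grad g(x) = c - A^T (Ax + b) / ||Ax + b||, which is undefined exactly where Ax + b = 0 (the norm is not differentiable at the origin). A sketch that guards against this degenerate case (note that d does not enter the gradient):

```python
import math

def soc_grad(A, b, c, x, eps=1e-12):
    """Gradient of g(x) = c.x + d - ||A x + b|| where it exists.

    Returns None when ||A x + b|| is (numerically) zero, so the
    caller can handle the nondifferentiable case explicitly.
    """
    r = [sum(aij * xj for aij, xj in zip(row, x)) + bi
         for row, bi in zip(A, b)]
    nrm = math.sqrt(sum(v * v for v in r))
    if nrm < eps:
        return None  # Ax + b = 0: the gradient does not exist here
    n = len(x)
    At_r = [sum(A[i][j] * r[i] for i in range(len(r))) for j in range(n)]
    return [cj - aj / nrm for cj, aj in zip(c, At_r)]

# Unit cone ||(x1, x2)|| <= x3 at the boundary point (3, 4, 0):
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
print(soc_grad(A, [0.0, 0.0], [0.0, 0.0, 1.0], [3.0, 4.0, 0.0]))  # [-0.6, -0.8, 1.0]
print(soc_grad(A, [0.0, 0.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]))  # None (Ax + b = 0)
```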

The gradient for cone (1) of Example

We know from Theorem

Suppose

If the gradient is zero,

As a consequence of this result, for strictly concave constraints, the gradient will only be zero inside of the feasible region, and so the feasibility vector exists at all iterates of our algorithms.

Let

The following results show that in each iteration of the algorithm, we move closer to the boundary of the feasible region.

Suppose the line

For concave

The gradient vector field of

Gradient field of cone (2) of Example

In this section, we study how the discussions in the previous section apply to the special case of convex quadratic constraints (CQCs). This study will be limited to results that are different and relevant to CQCs.

We consider a system of CQCs:

For ease of presentation, consider the single CQC

A convex quadratic constraint (CQC) with

For Example

The solution for the system (

Since the Hessian
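Assuming the CQC standard form q(x) = x^T Q x + b^T x + c <= 0 (the exact form used here is elided), the gradient is 2Qx + b and the Hessian is the constant matrix 2Q, so q is convex whenever Q is positive semidefinite. A minimal sketch:

```python
def cqc_grad_and_violation(Q, b, c, x):
    """For the CQC q(x) = x^T Q x + b^T x + c <= 0 (an assumed
    standard form): returns (violation, gradient), where
    grad q(x) = 2 Q x + b and the Hessian is the constant 2 Q."""
    n = len(x)
    Qx = [sum(Q[i][j] * x[j] for j in range(n)) for i in range(n)]
    q = (sum(x[i] * Qx[i] for i in range(n))
         + sum(b[i] * x[i] for i in range(n)) + c)
    grad = [2.0 * Qx[i] + b[i] for i in range(n)]
    return max(0.0, q), grad

# Unit disk x1^2 + x2^2 - 1 <= 0 at the infeasible point (2, 0):
v, g = cqc_grad_and_violation([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0], -1.0, [2.0, 0.0])
print(v, g)  # 3.0 [4.0, 0.0]
```

Because the gradient 2Qx + b is affine in x, the CQC case avoids the nondifferentiability issue that arises for SOCs where Ax + b = 0.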

In this section, we propose a modification to the

The

INPUTS: set of

Compute the binary word

count = 0

Compute the binary word

count = count+1

The Backtracking Line Search Technique uses the concept of a

Define

If

We start by computing the consensus vector
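The listing above is fragmentary and the exact acceptance rule is elided; one plausible sketch of a backtracking line search along the consensus vector, which halves the step until the trial point satisfies strictly more constraints than the current point does, is:

```python
def backtrack(x, t, count_satisfied, max_halvings=10):
    """Backtracking line search along the consensus vector t
    (a hedged sketch, not the paper's exact rule).

    count_satisfied -- callable returning the number of constraints
                       satisfied at a given point (the "binary word"
                       reduced to its count of 1-bits)
    """
    base = count_satisfied(x)
    step = 1.0
    for _ in range(max_halvings):
        trial = [xi + step * ti for xi, ti in zip(x, t)]
        if count_satisfied(trial) > base:   # strictly more satisfied
            return trial
        step *= 0.5                          # halve and retry
    return x  # no improving step found; keep the current point

# Constraints x >= 0 and x <= 1; the full step from -0.5 overshoots,
# so the step is halved until both constraints are satisfied:
def cs(p):
    return (p[0] >= 0.0) + (p[0] <= 1.0)

print(backtrack([-0.5], [4.0], cs))  # [0.5]
```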

In Figure

Comparison of

In this section, we use numerical experiments to test the

For each SOC test problem, we first generate integer values
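The generation details are elided above. A hypothetical generator (not necessarily the paper's scheme) that produces a random SOC constraint guaranteed to have the origin as a strictly feasible point could look like:

```python
import random

def random_soc(n, m, rng):
    """Generate one random SOC constraint ||A x + b|| <= c.x + d in R^n
    with m rows in A, constructed so the origin is strictly feasible.
    This is an illustrative stand-in for the elided generation scheme."""
    A = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(m)]
    b = [rng.uniform(-1, 1) for _ in range(m)]
    c = [rng.uniform(-1, 1) for _ in range(n)]
    # Choose d large enough that ||A*0 + b|| = ||b|| < c.0 + d:
    d = sum(bi * bi for bi in b) ** 0.5 + rng.uniform(0.1, 1.0)
    return A, b, c, d

rng = random.Random(0)
A, b, c, d = random_soc(5, 3, rng)
norm_b = sum(bi * bi for bi in b) ** 0.5
print(norm_b < d)  # True: the origin strictly satisfies ||b|| < d
```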

Test problems.

Problem | SOCs | | | CQCs | | |
---|---|---|---|---|---|---
1 | 10 | 9 | 7 | 9 | 7 | 7
2 | 6 | 2 | 48 | 3 | 9 | 15
3 | 3 | 8 | 15 | 7 | 3 | 25
4 | 8 | 1 | 37 | 4 | 9 | 28
5 | 7 | 3 | 27 | 10 | 4 | 8
6 | 3 | 7 | 50 | 2 | 10 | 41
7 | 3 | 8 | 5 | 5 | 8 | 47
8 | 9 | 7 | 7 | 8 | 4 | 34
9 | 7 | 5 | 20 | 6 | 6 | 17
10 | 10 | 7 | 37 | 2 | 8 | 44
11 | 9 | 2 | 36 | 8 | 2 | 19
12 | 10 | 4 | 42 | 2 | 4 | 16
13 | 5 | 3 | 36 | 2 | 6 | 3
14 | 3 | 10 | 6 | 10 | 9 | 46
15 | 9 | 8 | 6 | 5 | 5 | 10
16 | 8 | 7 | 1 | 5 | 7 | 2
17 | 6 | 6 | 5 | 7 | 9 | 25
18 | 10 | 9 | 43 | 7 | 10 | 48
19 | 8 | 9 | 30 | 8 | 2 | 41
20 | 10 | 5 | 42 | 10 | 3 | 41
21 | 7 | 9 | 37 | 7 | 1 | 32
22 | 6 | 2 | 37 | 3 | 3 | 48
23 | 9 | 2 | 47 | 2 | 9 | 24
24 | 4 | 9 | 10 | 2 | 4 | 1
25 | 10 | 7 | 26 | 4 | 2 | 25

For each test problem and method, we chose a random infeasible starting point. The algorithms were implemented in © 7.9.0 and run on a Dell OptiPlex GX280.

Tables

Success rates and average times on SOCs.

Algorithm | Success rate | Average time (sec)
---|---|---
 | 0 | 0.131
 | 64 | 0.075
 | 4 | 0.055
 | 84 | 0.033

SOCs: comparison of

Problem | Iter | Time (sec) | Interior feas Pt? | Iter | Time (sec) | Interior feas Pt?
---|---|---|---|---|---|---
1 | 34 | 0.055 | N | 4 | 0.023 | Y
2 | 83 | 0.194 | N | 33 | 0.162 | N
3 | 121 | 0.094 | N | 51 | 0.086 | Y
4 | 53 | 0.117 | N | 14 | 0.059 | Y
5 | 98 | 0.135 | N | 30 | 0.085 | N
6 | 500 | 1.195 | N | 500 | 2.506 | N
7 | 40 | 0.013 | N | 4 | 0.003 | Y
8 | 81 | 0.034 | N | 24 | 0.021 | N
9 | 78 | 0.082 | N | 32 | 0.071 | N
10 | 42 | 0.080 | N | 15 | 0.059 | Y
11 | 140 | 0.252 | N | 61 | 0.228 | N
12 | 127 | 0.269 | N | 60 | 0.258 | N
13 | 500 | 0.870 | N | 500 | 1.842 | N
14 | 7 | 0.003 | N | 3 | 0.003 | Y
15 | 23 | 0.009 | N | 4 | 0.003 | Y
16 | 5 | 0.001 | N | 3 | 0.001 | Y
17 | 12 | 0.004 | N | 3 | 0.002 | Y
18 | 85 | 0.189 | N | 5 | 0.025 | Y
19 | 95 | 0.148 | N | 10 | 0.033 | Y
20 | 66 | 0.139 | N | 22 | 0.096 | Y
21 | 43 | 0.083 | N | 32 | 0.126 | Y
22 | 339 | 0.610 | N | 10 | 0.040 | Y
23 | 197 | 0.448 | N | 69 | 0.327 | N
24 | 10 | 0.006 | N | 4 | 0.005 | Y
25 | 27 | 0.037 | N | 7 | 0.020 | Y

SOCs: comparison of

Problem | Iter | Time (sec) | Interior feas Pt? | Iter | Time (sec) | Interior feas Pt?
---|---|---|---|---|---|---
1 | 16 | 0.014 | N | 5 | 0.014 | Y
2 | 145 | 0.332 | N | 16 | 0.100 | Y
3 | 101 | 0.077 | N | 30 | 0.061 | Y
4 | 14 | 0.030 | N | 5 | 0.025 | Y
5 | 25 | 0.035 | N | 6 | 0.021 | Y
6 | 500 | 1.196 | N | 500 | 3.313 | N
7 | 6 | 0.002 | N | 7 | 0.005 | Y
8 | 52 | 0.022 | N | 8 | 0.008 | Y
9 | 13 | 0.014 | N | 18 | 0.040 | Y
10 | 20 | 0.038 | N | 8 | 0.038 | Y
11 | 35 | 0.063 | N | 8 | 0.036 | Y
12 | 87 | 0.179 | N | 17 | 0.091 | Y
13 | 500 | 0.866 | N | 500 | 2.408 | N
14 | 3 | 0.001 | N | 3 | 0.003 | Y
15 | 5 | 0.002 | N | 4 | 0.003 | Y
16 | 5 | 0.001 | N | 3 | 0.001 | Y
17 | 14 | 0.004 | N | 4 | 0.003 | Y
18 | 37 | 0.082 | N | 9 | 0.055 | N
19 | 35 | 0.054 | N | 9 | 0.039 | N
20 | 87 | 0.181 | N | 12 | 0.065 | Y
21 | 8 | 0.016 | N | 7 | 0.036 | Y
22 | 5 | 0.010 | N | 5 | 0.023 | Y
23 | 31 | 0.072 | N | 10 | 0.057 | Y
24 | 4 | 0.003 | Y | 4 | 0.007 | Y
25 | 20 | 0.027 | N | 7 | 0.024 | Y

As expected, on average the

On average the

Tables

Success rates and average times on CQCs.

Algorithm | Success rate | Average time (sec)
---|---|---
 | 0 | 0.110
 | 44 | 0.061
 | 0 | 0.041
 | 36 | 0.114

CQCs: comparison of the

Problem | Iter | Time (sec) | Interior feas Pt? | Iter | Time (sec) | Interior feas Pt?
---|---|---|---|---|---|---
1 | 55 | 0.050 | N | 19 | 0.028 | N
2 | 16 | 0.022 | N | 7 | 0.016 | Y
3 | 51 | 0.129 | N | 16 | 0.060 | Y
4 | 18 | 0.049 | N | 6 | 0.028 | N
5 | 158 | 0.127 | N | 53 | 0.070 | N
6 | 16 | 0.060 | N | 9 | 0.056 | Y
7 | 25 | 0.109 | N | 6 | 0.042 | N
8 | 48 | 0.151 | N | 14 | 0.072 | N
9 | 31 | 0.051 | N | 11 | 0.032 | Y
10 | 15 | 0.061 | N | 8 | 0.052 | Y
11 | 130 | 0.236 | N | 42 | 0.121 | N
12 | 19 | 0.028 | N | 6 | 0.015 | Y
13 | 14 | 0.005 | N | 8 | 0.005 | Y
14 | 32 | 0.143 | N | 8 | 0.057 | N
15 | 24 | 0.024 | N | 7 | 0.011 | N
16 | 21 | 0.005 | N | 11 | 0.005 | Y
17 | 28 | 0.068 | N | 7 | 0.027 | N
18 | 26 | 0.119 | N | 6 | 0.043 | N
19 | 86 | 0.325 | N | 29 | 0.181 | N
20 | 77 | 0.291 | N | 30 | 0.183 | N
21 | 135 | 0.451 | N | 50 | 0.286 | N
22 | 24 | 0.115 | N | 8 | 0.059 | N
23 | 17 | 0.042 | N | 8 | 0.031 | Y
24 | 15 | 0.002 | N | 14 | 0.003 | Y
25 | 39 | 0.101 | N | 12 | 0.050 | Y

CQCs: comparison of

Problem | Iter | Time (sec) | Interior feas Pt? | Iter | Time (sec) | Interior feas Pt?
---|---|---|---|---|---|---
1 | 25 | 0.017 | N | 96 | 0.110 | N
2 | 11 | 0.015 | N | 253 | 0.601 | N
3 | 21 | 0.049 | N | 58 | 0.231 | Y
4 | 13 | 0.035 | N | 500 | 2.071 | N
5 | 36 | 0.028 | N | 45 | 0.060 | Y
6 | 11 | 0.042 | N | 27 | 0.170 | N
7 | 14 | 0.060 | N | 500 | 3.467 | N
8 | 18 | 0.057 | N | 500 | 2.537 | N
9 | 16 | 0.026 | N | 500 | 1.317 | N
10 | 11 | 0.044 | N | 27 | 0.181 | N
11 | 31 | 0.054 | N | 31 | 0.095 | Y
12 | 13 | 0.019 | N | 25 | 0.061 | N
13 | 12 | 0.004 | N | 23 | 0.012 | Y
14 | 16 | 0.069 | N | 500 | 3.521 | N
15 | 11 | 0.011 | N | 21 | 0.034 | Y
16 | 21 | 0.005 | N | 24 | 0.009 | Y
17 | 15 | 0.035 | N | 500 | 1.895 | N
18 | 14 | 0.062 | N | 500 | 3.572 | N
19 | 23 | 0.084 | N | 500 | 3.083 | N
20 | 27 | 0.100 | N | 500 | 3.049 | N
21 | 28 | 0.089 | N | 29 | 0.199 | Y
22 | 12 | 0.057 | N | 500 | 3.711 | N
23 | 12 | 0.030 | N | 27 | 0.103 | Y
24 | 15 | 0.002 | N | 14 | 0.004 | Y
25 | 13 | 0.032 | N | 500 | 2.001 | N

As can be seen from the tables, applying the backtracking technique to the

We study Chinneck's

Given a set of SOCs, we adapt the

Before applying backtracking, the method that reached feasibility in the least time and with the fewest iterations was consistently

Overall and considering both SOCs and CQCs, we find the backtracking line search to be most successful in reducing time and iterations needed to reach the feasible region when applied to the

The work of A. Weigandt and K. Tuthill was supported by the Department of Mathematics and Statistics at Northern Arizona University under the REU program of the National Science Foundation in Summer 2010.