Mathematical Problems in Engineering, Hindawi Publishing Corporation
Volume 2014, Article ID 563860, DOI: 10.1155/2014/563860

Research Article

Global Minimization of Nonsmooth Constrained Global Optimization with Filled Function

Wei-xiang Wang¹, You-lin Shang², and Ying Zhang³

¹ Department of Mathematics, Shanghai Second Polytechnic University, Shanghai 201209, China
² Department of Mathematics, Henan University of Science and Technology, Luoyang 471003, China
³ Department of Mathematics, Zhejiang Normal University, Jinhua 321004, China

Received 17 May 2014; Accepted 29 August 2014; Published 25 September 2014

Academic Editor: Sergio Preidikman

Copyright © 2014 Wei-xiang Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A novel filled function is constructed to locate a global optimizer or an approximate global optimizer of smooth or nonsmooth constrained global minimization problems. The constructed filled function contains only one parameter which can be easily adjusted during the minimization. The theoretical properties of the filled function are discussed and a corresponding solution algorithm is proposed. The solution algorithm comprises two phases: local minimization and filling. The first phase minimizes the original problem and obtains one of its local optimizers, while the second phase minimizes the constructed filled function and identifies a better initial point for the first phase. Some preliminary numerical results are also reported.

1. Introduction

Science and engineering give rise to an ever-increasing demand for locating global optimizers, and global optimization has therefore become one of the most attractive research areas in optimization. However, the existence of multiple local minimizers that differ from the global solution confronts us with two difficult issues: how to escape from a local minimizer to a better one, and how to verify that the current minimizer is a global one. These two issues make most global optimization problems unsolvable directly by classical local optimization algorithms. Up to now, various new theories and algorithms for global optimization have been proposed. In general, global optimization methods can be divided into two categories: stochastic methods and deterministic methods. Stochastic methods are usually probability-based approaches, such as the genetic algorithm and the simulated annealing method. These stochastic methods have their advantages, but their shortcomings are also obvious, such as being easily trapped at a local optimizer. Deterministic methods such as the filled function method, the tunneling method, and the stretching function method can, however, often skip from the current local minimizer to a better one.

The filled function approach, initially proposed for smooth optimization by Ge and Qin and later improved, is one of the effective global optimization approaches. It furnishes an efficient way to use any local optimization procedure to solve global optimization problems. The essence of the filled function method is to construct a filled function and then minimize it to obtain a better initial point for the minimization of the original problem. The existing filled function methods are usually suitable only for unconstrained optimization problems. Moreover, they require the objective function to be continuously differentiable and the number of local minimizers to be finite. In practice, however, optimization problems may be nonsmooth, often have many complicated constraints, and may possess infinitely many local minimizers. To deal with such situations, in this paper we extend the filled function approach to nonsmooth constrained global optimization and propose a new filled function method. The proposed method combines the filled function method for unconstrained global optimization with the exterior penalty function method for constrained optimization.

The paper is organized as follows. In Section 2, a new filled function is proposed and its properties are discussed. In Section 3, a corresponding filled function algorithm is designed and numerical experiments are performed. Finally, in Section 4, some concluding remarks are given.

In the rest of this paper, the generalized gradient of a nonsmooth function f(x) at a point x ∈ X is denoted by ∂f(x), and the generalized directional derivative of f(x) in the direction d at the point x is denoted by f⁰(x; d). The interior, the boundary, and the closure of a set S are denoted by int S, ∂S, and cl S, respectively.

2. A New One-Parameter Filled Function and Its Properties

2.1. Problem Formulation

Consider the following nonsmooth constrained global optimization problem (P):

(1) min_{x ∈ S} f(x),

where S = {x ∈ X : g_i(x) ≤ 0, i ∈ I}, X ⊂ Rⁿ is a box set, f(x) and g_i(x), i ∈ I, are Lipschitz continuous with constants L_f and L_{g_i}, i ∈ I, respectively, and I = {1, …, m} is an index set. For simplicity, the set of local minimizers of problem (P) is denoted by L(P) and the set of global minimizers by G(P).

To proceed, we assume that the number of local minimizers of problem (P) may be infinite, but the number of distinct function values at these minimizers is finite.

Definition 1.

A function P(x,x*) is called a filled function of f(x) at a local minimizer x* if it satisfies the following:

x* is a strictly local maximizer of P(x,x*) on X;

for any x ∈ S₁ with x ≠ x*, or any x ∈ X∖S, one has 0 ∉ ∂P(x, x*), where S₁ = {x ∈ S : f(x) ≥ f(x*)};

if S₂ = {x ∈ S : f(x) < f(x*)} is not empty, then there exists a point x₂ ∈ S₂ such that x₂ is a local minimizer of P(x, x*).

Definition 1 guarantees that, when any local search procedure for unconstrained optimization is used to minimize the constructed filled function, the sequence of iterative points will not stop at any point at which the objective function value is larger than f(x*). If x* is not a global minimizer, then a point x̄ with f(x̄) < f(x*) can be identified in the process of minimizing P(x, x*). A better local minimizer of f(x) is then obtained by using x̄ as an initial point. By repeating these two steps, we can finally obtain a global minimizer or an approximate global minimizer of the original problem.

2.2. A New Filled Function and Its Properties

We propose in this section the following one-parameter filled function:

(2) P(x, x*, q) = -(1/q)[f(x) - f(x*) + Σ_{i=1}^m max(0, g_i(x))]² - arctan(1 + ‖x - x*‖²) + q[min(0, max(f(x) - f(x*), g_i(x), i ∈ I))]³,

where q > 0 is a parameter and x* is the current local minimizer of f(x).
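For concreteness, the one-parameter filled function can be evaluated directly from the objective f, the constraint functions g_i, and the current minimizer x*. The sketch below follows our arctan reading of the middle term of formula (2); the function and argument names are our own, not from the paper.

```python
import numpy as np

def filled_function(x, x_star, q, f, gs):
    """Evaluate the one-parameter filled function P(x, x*, q) of formula (2).

    f  : objective function, callable on an ndarray
    gs : list of constraint functions g_i, with g_i(x) <= 0 defining S
    """
    x, x_star = np.asarray(x, float), np.asarray(x_star, float)
    df = f(x) - f(x_star)
    t = df + sum(max(0.0, g(x)) for g in gs)       # exterior-penalty bracket
    m = max([df] + [g(x) for g in gs])             # max(f(x)-f(x*), g_i(x), i in I)
    dist2 = float(np.dot(x - x_star, x - x_star))  # ||x - x*||^2
    return -t * t / q - np.arctan(1.0 + dist2) + q * min(0.0, m) ** 3

# tiny one-dimensional illustration (hypothetical data, not from the paper)
f_demo = lambda x: x[0] ** 2
gs_demo = [lambda x: x[0] - 10.0]
p_at_min = filled_function([0.0], [0.0], 1.0, f_demo, gs_demo)
p_away = filled_function([1.0], [0.0], 1.0, f_demo, gs_demo)
```

At x = x* all three terms collapse to -arctan(1), the strict local maximum value used in the proof of Theorem 2, and the value decreases as x moves away from x* within S₁.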

Theorems 2, 3, and 4 show that, when the parameter q > 0 is suitably large, the function P(x, x*, q) is a filled function.

Theorem 2.

Assume that x* ∈ L(P). Then x* is a strictly local maximizer of P(x, x*, q).

Proof.

Since x* ∈ L(P), there is a neighborhood O(x*, δ) of x*, with δ > 0, such that f(x) ≥ f(x*) and g_i(x) ≤ 0, i ∈ I, for all x ∈ O(x*, δ) ∩ S. We consider the following two cases.

Case 1. For all x ∈ O(x*, δ) ∩ S with x ≠ x*, we have min(0, max(f(x) - f(x*), g_i(x), i ∈ I)) = 0, and so

(3) P(x, x*, q) = -(1/q)(f(x) - f(x*))² - arctan(1 + ‖x - x*‖²) < -arctan(1) = P(x*, x*, q).

Case 2. For all x ∈ O(x*, δ) ∩ (X∖S), there exists at least one index i₀ ∈ I such that g_{i₀}(x) > 0. It follows that

(4) min(0, max(f(x) - f(x*), g_i(x), i ∈ I)) = 0.

Thus

(5) P(x, x*, q) = -(1/q)[f(x) - f(x*) + Σ_{i=1}^m max(0, g_i(x))]² - arctan(1 + ‖x - x*‖²) < -arctan(1) = P(x*, x*, q).

The above discussion indicates that x* is a strictly local maximizer of P(x,x*,q).

Theorem 3.

Assume that x* ∈ L(P). Then, for any x ∈ S₁ with x ≠ x*, or any x ∈ X∖S, it holds that 0 ∉ ∂P(x, x*, q) when q > 0 is sufficiently large.

Proof.

For any x ∈ S₁ with x ≠ x*, or any x ∈ X∖S, we have min(0, max(f(x) - f(x*), g_i(x), i ∈ I)) = 0, and so

(6) P(x, x*, q) = -(1/q)[f(x) - f(x*) + Σ_{i=1}^m max(0, g_i(x))]² - arctan(1 + ‖x - x*‖²).

Thus

(7) ∂P(x, x*, q) ⊆ -(2/q)[f(x) - f(x*) + Σ_{i=1}^m max(0, g_i(x))](∂f(x) + Σ_{i=1}^m λ_i ∂g_i(x)) - 2(x - x*)/(1 + (1 + ‖x - x*‖²)²),

where 0 ≤ λ_i ≤ 1, i ∈ I. Therefore, for every ξ ∈ ∂P(x, x*, q), when q > 0 is sufficiently large it holds that

(8) ⟨ξ, (x - x*)/‖x - x*‖⟩ ≤ (2/q)[L_D + Σ_{i=1}^m max_{x ∈ X} g_i(x)][L_f + Σ_{i=1}^m L_{g_i}] - 2‖x - x*‖/(1 + (1 + ‖x - x*‖²)²) < 0,

where L_D denotes an upper bound of |f(x) - f(x*)| on X. This implies that 0 ∉ ∂P(x, x*, q).

Theorem 4.

Assume that x* ∈ L(P) but x* ∉ G(P), and that cl(int S) = cl S. If q > 0 is sufficiently large, then there exists a point x₀ ∈ S such that x₀ is a local minimizer of P(x, x*, q).

Proof.

By the assumptions, there exists a point x₂ ∈ int S such that f(x₂) < f(x*) and g_i(x₂) < 0, i ∈ I. Indeed, since x* is not a global minimizer, some feasible point has a smaller objective value, and the condition cl(int S) = cl S together with the continuity of f allows this point to be taken in the interior of S.

Then, for any x ∈ ∂S,

(9) P(x, x*, q) = -(1/q)[f(x) - f(x*) + Σ_{i=1}^m max(0, g_i(x))]² - arctan(1 + ‖x - x*‖²);
P(x₂, x*, q) = -(1/q)[f(x₂) - f(x*)]² - arctan(1 + ‖x₂ - x*‖²) + q[max(f(x₂) - f(x*), g_i(x₂), i ∈ I)]³.

Since, as q → +∞,

(10) P(x, x*, q) → -arctan(1 + ‖x - x*‖²) > -π/2, P(x₂, x*, q) → -∞,

when q > 0 is suitably big we have

(11) P(x₂, x*, q) < P(x, x*, q), for all x ∈ ∂S.

Assume that the function P(x, x*, q) reaches its global minimizer over S at x₀. By (11), the minimum cannot be attained on ∂S, and since S∖∂S is an open set, we have x₀ ∈ S∖∂S, and it holds that

(12) min_{x ∈ S} P(x, x*, q) = min_{x ∈ S∖∂S} P(x, x*, q) = P(x₀, x*, q) ≤ P(x₂, x*, q).

In the following, we prove that

(13) f(x₀) < f(x*), g_i(x₀) < 0, i ∈ I,

which leads to the result.

The proof is by contradiction. Suppose that

(14) f(x₀) ≥ f(x*), g_i(x₀) < 0, i ∈ I.

Then, as q → +∞,

(15) P(x₀, x*, q) → -arctan(1 + ‖x₀ - x*‖²) > -∞,

which implies that, when q > 0 is suitably big,

(16) P(x₀, x*, q) > P(x₂, x*, q).

This contradicts (12).

3. Algorithm and Numerical Examples

Based on the properties of the proposed filled function, we now give a corresponding filled function algorithm as follows.

3.1. Filled Function Algorithm FFAM

Initialization Step

Set a disturbance constant α=0.1.

Select an upper bound of q, denoted by q_u, and set q_u := 10⁸.

Select directions e_k, k = 1, 2, …, l, with integer l = 2n, where n is the number of variables.

Set k=1.

Main Step

Start from an initial point x; minimize the problem (P) by implementing a nonsmooth local search procedure and obtain the first local minimizer x1* of f(x).

Let q=1.

Construct the filled function at x₁*:

(17) P(x, x₁*, q) = -(1/q)[f(x) - f(x₁*) + Σ_{i=1}^m max(0, g_i(x))]² - arctan(1 + ‖x - x₁*‖²) + q[min(0, max(f(x) - f(x₁*), g_i(x), i ∈ I))]³.

If k > l, then go to (7). Otherwise, set x = x₁* + αe_k as an initial point, minimize P(x, x₁*, q) by implementing a nonsmooth local search algorithm, and obtain a local minimizer x_k.

If x_k ∉ S, then set k = k + 1 and go to (4). Otherwise, go to (6).

If x_k satisfies f(x_k) < f(x₁*), then set x = x_k and k = 1. Starting from x as a new initial point, minimize problem (P) by using a local search algorithm and obtain another local minimizer x₂* of f(x) with f(x₂*) < f(x₁*). Set x₁* = x₂* and go to (2). Otherwise, go to (7).

Increase q by setting q=10q.

If q ≤ q_u, then set k = 1 and go to (3). Otherwise, the algorithm stops, and x₁* is taken as a global minimizer of problem (P).

At the end of this section, we make a few remarks below.

(1) Algorithm FFAM comprises two stages: local minimization and filling. In stage 1, a local minimizer x₁* of f(x) is identified. In stage 2, the filled function P(x, x₁*, q) is constructed and then minimized. Stage 2 terminates when a point x_k ∈ S₂ is located. Algorithm FFAM then reenters stage 1, with x_k as an initial point, to find a new minimizer x₂* of f(x) (if such a minimizer exists), and so on. This process is repeated until the specified stopping criteria are met, and the last local minimizer found is regarded as a global minimizer.

(2) The motivation and mechanism behind the algorithm FFAM are given below.

In Step (3) of the Initialization Step, the directions e_k, k = 1, 2, …, l, can be chosen as the positive and negative unit coordinate vectors. For example, when n = 2, the directions can be chosen as (1, 0), (0, 1), (−1, 0), and (0, −1).
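These l = 2n coordinate directions can be generated mechanically; the trivial helper below uses names of our own choosing.

```python
import numpy as np

def coordinate_directions(n):
    """Return the 2n positive and negative unit coordinate vectors."""
    eye = np.eye(n)
    return [s * eye[i] for i in range(n) for s in (1.0, -1.0)]

# for n = 2 this yields (1,0), (-1,0), (0,1), (0,-1)
dirs = coordinate_directions(2)
```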

In Step (1) and Step (6) of the Main Step, a local optimizer of problem (P) can be obtained by using any nonsmooth constrained local optimization procedure, such as bundle methods or Powell's method. In Step (4) of the Main Step, the proposed filled function can be minimized by using, for example, the hybrid Hooke and Jeeves-Direct method for nonsmooth optimization or the mesh adaptive direct search algorithms for constrained optimization.

(3) The proposed filled function algorithm can also be applied to smooth constrained optimization. In that case, any smooth local minimization procedure, such as the conjugate gradient method or a quasi-Newton method, can be used in the minimization phase.

3.2. Numerical Examples

We perform numerical tests on three examples. All tests are programmed in Fortran 95. In the nonsmooth case, we search for local minimizers by using the hybrid Hooke and Jeeves-Direct method for nonsmooth optimization and the mesh adaptive direct search algorithms for constrained optimization. In the smooth case, we use the penalty function method and the conjugate gradient method to obtain local minimizers.

The following are the three examples and their numerical results. The symbols used in the tables are explained below:

k: the iteration number in finding the kth local minimizer,

x_k: the kth new initial point used in finding the kth local minimizer,

x_k*: the kth local minimizer,

f(x_k*): the function value at the kth local minimizer.

Problem 1.

Consider

(18) min f(x) = -x₁² + x₂² + x₃² - x₁
s.t. x₁² + x₂² + x₃² - 4 ≤ 0,
min{x₂ - x₃, x₃} ≤ 0.

Algorithm FFAM succeeds in finding an approximate global minimizer x* = (1.9889, −0.0001, −0.0111)ᵀ with f(x*) = −5.9477. The computational results are given in Table 1.

Computational results with initial point (−1, −1, −1).

k   x_k                          x_k*                          f(x_k*)
1   (−1, −1, −1)                 (−1.9802, 0.0132, 0.0006)     −1.9410
2   (1.1931, 0.6332, −1.1932)    (1.9889, −0.0001, −0.0111)    −5.9477
Problem 2.

Consider

(19) min f(x) = -20 exp(-0.2 √((|x₁| + |x₂|)/2)) - exp((cos(2πx₁) + cos(2πx₂))/2) + 20
s.t. x₁² + x₂² ≤ 300,
2x₁ + x₂ ≤ 4,
-30 ≤ x_i ≤ 30, i = 1, 2.

Algorithm FFAM succeeds in finding a global minimizer x* = (0, 0)ᵀ with f(x*) = −2.7183. The computational results are listed in Table 2.

Computational results with initial point (−1, −1).

k   x_k                  x_k*                   f(x_k*)
1   (−1, −1)             (−15.0000, 0.0000)     5.7164
2   (−1.0585, 0.5165)    (0.0001, −0.2094)      −0.3690
3   (0.0007, −0.0435)    (0.0000, 0.0000)       −2.7183
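Under our reading of formula (19) (an Ackley-type nonsmooth objective), the tabulated values are reproduced exactly: at the global minimizer (0, 0) the two exponentials give -20 - e + 20 = -e ≈ -2.7183, and at the first local minimizer (-15, 0) the value 5.7164 is recovered.

```python
import math

def f2(x1, x2):
    """Objective of Problem 2 under our reading of formula (19)."""
    return (-20.0 * math.exp(-0.2 * math.sqrt((abs(x1) + abs(x2)) / 2.0))
            - math.exp((math.cos(2.0 * math.pi * x1)
                        + math.cos(2.0 * math.pi * x2)) / 2.0)
            + 20.0)
```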
Problem 3.

Consider

(20) min f(x) = -x₁ - x₂
s.t. x₂ ≤ 2x₁⁴ - 8x₁³ + 8x₁² + 2,
x₂ ≤ 4x₁⁴ - 32x₁³ + 88x₁² - 96x₁ + 36,
0 ≤ x₁ ≤ 3, 0 ≤ x₂ ≤ 4.

Algorithm FFAM succeeds in finding a global minimizer x* = (2.3295, 3.1780)ᵀ with f(x*) = −5.5081. This problem is taken from the collection of test problems of Floudas and Pardalos. We give this example to illustrate that algorithm FFAM is also suitable for smooth constrained global optimization. The computational results are given in Table 3.

Computational results with initial point (0, 0).

k   x_k                 x_k*                 f(x_k*)
1   (0, 0)              (0.6116, 3.4423)     −4.0541
2   (2.1653, 2.2546)    (2.3295, 3.1780)     −5.5081
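At the reported minimizer of Problem 3 both quartic constraints are essentially active: at x₁ = 2.3295 the two bounding curves both evaluate to about 3.178, matching x₂ = 3.1780. A quick check under our reading of formula (20) (the values in the comment are our own computation):

```python
def c1(x1):
    """First quartic bound of Problem 3: x2 <= 2*x1^4 - 8*x1^3 + 8*x1^2 + 2."""
    return 2.0 * x1**4 - 8.0 * x1**3 + 8.0 * x1**2 + 2.0

def c2(x1):
    """Second quartic bound: x2 <= 4*x1^4 - 32*x1^3 + 88*x1^2 - 96*x1 + 36."""
    return 4.0 * x1**4 - 32.0 * x1**3 + 88.0 * x1**2 - 96.0 * x1 + 36.0

# c1(2.3295) and c2(2.3295) are both about 3.178, so x2 = 3.1780
# lies (just) on both curves: the minimizer sits at their intersection
```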
4. Conclusion

In this paper, we present a new filled function for both nonsmooth and smooth constrained global optimization and investigate its properties. The filled function contains only one parameter, which can be readily adjusted in the process of minimization. We also design a corresponding filled function algorithm. Moreover, in order to demonstrate the performance of the proposed filled function method, we carry out three numerical tests. The preliminary computational results show that the proposed filled function approach is promising.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the NNSF of China (nos. 11471102 and 11001248), the SEDF under Grant no. 12YZ178, the Key Discipline “Applied Mathematics” of SSPU under no. A30XK1322100, and NNSF of Zhejiang (no. LY13A010006).

References

1. P. M. Pardalos and H. E. Romeijn, Eds., Handbook of Global Optimization, Volume 2: Heuristic Approaches, Kluwer Academic Publishers, Dordrecht, The Netherlands, 2002.
2. P. M. Pardalos and H. E. Romeijn, Eds., Handbook of Global Optimization, vol. 62 of Nonconvex Optimization and Its Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 2002.
3. R. Horst and P. M. Pardalos, Eds., Handbook of Global Optimization, vol. 2 of Nonconvex Optimization and Its Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1995.
4. R. P. Ge and Y. F. Qin, "A class of filled functions for finding global minimizers of a function of several variables," Journal of Optimization Theory and Applications, vol. 54, no. 2, pp. 241-252, 1987.
5. W. Wang, Y. Yang, and L. Zhang, "Unification of filled function and tunnelling function in global optimization," Acta Mathematicae Applicatae Sinica, English Series, vol. 23, no. 1, pp. 59-66, 2007.
6. Z. Xu, H. Huang, P. M. Pardalos, and C. Xu, "Filled functions for unconstrained global optimization," Journal of Global Optimization, vol. 20, no. 1, pp. 49-65, 2001.
7. Y.-J. Yang and Y.-L. Shang, "A new filled function method for unconstrained global optimization," Applied Mathematics and Computation, vol. 173, no. 1, pp. 501-512, 2006.
8. A. V. Levy and A. Montalvo, "The tunneling algorithm for the global minimization of functions," SIAM Journal on Scientific and Statistical Computing, vol. 6, no. 1, pp. 15-29, 1985.
9. Y. Wang and J. Zhang, "A new constructing auxiliary function method for global optimization," Mathematical and Computer Modelling, vol. 47, no. 11-12, pp. 1396-1410, 2008.
10. C. J. Price, B. L. Robertson, and M. Reale, "A hybrid Hooke and Jeeves-Direct method for non-smooth optimization," Advanced Modeling and Optimization, vol. 11, no. 1, pp. 43-61, 2009.
11. C. Audet and J. E. Dennis Jr., "Mesh adaptive direct search algorithms for constrained optimization," SIAM Journal on Optimization, vol. 17, no. 1, pp. 188-217, 2006.
12. C. A. Floudas and P. M. Pardalos, A Collection of Test Problems for Constrained Global Optimization Algorithms, Springer, Berlin, Germany, 1990.