Model Building and Optimization Analysis of MDF Continuous Hot-Pressing Process by Neural Network

We propose a one-layer neural network for solving a class of constrained optimization problems, which is brought forward from the MDF continuous hot-pressing process. The objective function of the optimization problem is the sum of a nonsmooth convex function and a smooth pseudoconvex (possibly nonconvex) function, and the feasible set consists of two parts: a closed convex subset of R^n and a set defined by a class of smooth convex functions. By the theories of smoothing techniques, projection, penalty functions, and regularization, the proposed network is modeled by a differential equation, which can be implemented easily. Without any further condition, we prove the global existence of the solutions of the proposed neural network for any initial point in the closed convex subset. We show that any accumulation point of the solutions of the proposed neural network is not only a feasible point but also an optimal solution of the considered optimization problem, even though the objective function is not convex. Numerical experiments on the MDF hot-pressing process, including model building and parameter optimization, are carried out on a real data set and indicate the good performance of the proposed neural network in applications.


Introduction
Medium density fibreboard (MDF) finds many applications in the wood industry because of its favorable properties, such as surface characteristics, dimensional stability, and excellent machinability [1,2]. Many physical processes are involved in the MDF hot-pressing process, and the complexity of this operation arises from the fact that they are coupled. Hot-pressing is one of the key procedures in the production of MDF and influences the utilization ratio of energy and resources. With the decreasing timber resources and the increasing demand for MDF, it is of great importance to analyze the experimental data effectively and reasonably, find the main factors among the many indexes of MDF, and establish relation models between the properties of the slab, the parameters of the hot-pressing process, and the main indexes of MDF. These relation models not only help the staff give a reasonable prediction and a reliability assessment of the hot-pressing process according to the actual process parameters, but also provide a theoretical basis for setting and adjusting the main factors in the hot-pressing process according to the actual demands on MDF properties. Hence, optimization models and methods have become important tools for the optimization, control, and scheduling of the hot-pressing of plates.
Real-time online solutions of optimization problems are desired in many engineering and scientific applications. One possible and very promising approach to solving real-time optimization problems is to apply artificial neural networks [3][4][5]. Owing to their resemblance to biological brains, neural networks can be implemented online in hardware and have become an important technical tool for solving optimization problems; see, for example, [3,4,[6][7][8][9]. Based on the gradient method, the Hopfield neural networks proposed in [4,5] are the two classical recurrent neural networks for linear and nonlinear programming; thereafter, in addition to the gradient method, many types of neural networks were designed, such as the Lagrangian neural networks [10], the projection-type neural networks [11,12], the dual network [13], and the stochastic neural network [14]. The projection method is an effective and simple method for handling constraints. However, general constraints cannot be handled by the projection method alone. Hence, Lagrangian and penalty methods were introduced into networks. Based on the Lagrangian function method, Lagrangian networks were proposed for solving optimization problems with general constraints [8,10]. But the Lagrangian network increases the dimension of the network with the number of constraints. In recent years, recurrent neural networks based on the penalty method have been widely investigated for solving optimization problems. Neural networks for smooth optimization problems cannot solve nonsmooth optimization problems, because the gradients of the objective and constraint functions are required in such networks. The generalized nonlinear programming circuit (G-NPC) in [15] can be considered a natural extension of the nonlinear programming circuit (NPC) for solving nonsmooth convex optimization problems with inequality constraints. But a nonempty interior of the feasible region and large enough penalty parameters are needed for the network in [15]. In order
to overcome the nonempty-interior assumption on the feasible region, Bian and Xue [6] proposed a recurrent neural network for nonsmooth convex optimization based on the penalty function method. The efficiency of neural networks for solving convex optimization problems relies on the convexity of the functions involved. A neural network for nonconvex quadratic optimization is presented in [16]. Some neural networks modeled by differential inclusions were also proposed for nonsmooth and nonconvex optimization problems [6,17]. To avoid differential inclusions, smoothing techniques are introduced into the neural networks. The main feature of the smoothing method is to approximate the nonsmooth functions by a class of smooth functions. Thus, a neural network constructed by smoothing techniques is modeled by a differential equation, which can be implemented easily in circuits and mathematical software [18].
In this paper, we propose a neural network model for solving the optimization problem brought forward from the MDF continuous hot-pressing automatic control system. In Section 2, some notations and necessary preliminary results are listed. In Section 3, based on the SVM theory with the existing linear and nonlinear kernel functions, we give an optimization problem, which includes as special cases the problems of building the models of the MDF continuous hot-pressing system and of optimizing the MDF performance indexes. For building up the relation models between the properties of the slab, the technical parameters of the hot-pressing process, and the performance indexes of MDF, when the kernel function is positive definite or positive semidefinite, the corresponding optimization problem is a constrained convex problem; otherwise, it is a nonconvex problem. The optimization problem for optimizing the performance parameters is a nonconvex constrained optimization problem, but its objective function is pseudoconvex owing to the appropriate choice of kernel functions. In Section 4, we propose a neural network based on the penalty function method, the projection method, and smoothing techniques. The proposed network is modeled by a nonautonomous differential equation. By the Lyapunov method, we prove that the solution of the proposed network exists globally and converges to the feasible set of the considered optimization problem. Moreover, owing to the pseudoconvexity of the objective function and the convexity of the constraints, the proposed network also converges to the optimal solution set of the optimization problem. In Section 5, based on an existing data set, we apply the proposed network to the model building and parameter optimization problems of the hot-pressing system, which validates the good performance of the results obtained in this paper.

Preliminaries
In this section, we state some definitions and properties needed in this paper. We refer the readers to [19][20][21].
Support Vector Regression.
Kernels, real-valued functions with an inner-product formulation, have been a powerful tool in machine learning owing to their superior performance over a wide range of learning problems, such as isolated handwritten digit recognition, text categorization, and face detection [21,22].
Let X be a nonempty set and k : X × X → R be a real-valued and symmetric function. With the kernel matrix K = (k(x_i, x_j))_{i,j=1}^n, k is said to be a positive semidefinite kernel if K is positive semidefinite for any n ∈ N and x_1, x_2, . . ., x_n ∈ X. We call k an indefinite kernel if there exist x_1, x_2, . . ., x_n ∈ X and v, w ∈ R^n such that v^T K v < 0 and w^T K w > 0.
In what follows, we list some widely used kernels.
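As a concrete illustration of the definition above, the following Python sketch (the function names are ours, not from the paper) builds the kernel matrix of the widely used Gaussian RBF kernel on a random sample and checks positive semidefiniteness through the eigenvalues of K:

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    # K_ij = k(x_i, x_j) for the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def is_positive_semidefinite(K, tol=1e-10):
    # a symmetric matrix is PSD iff its smallest eigenvalue is nonnegative
    return bool(np.min(np.linalg.eigvalsh(K)) >= -tol)

X = np.random.default_rng(0).standard_normal((20, 4))   # 20 sample points in R^4
K = gaussian_kernel_matrix(X)
print(is_positive_semidefinite(K))  # True: the Gaussian kernel is PSD
```

Since the Gaussian kernel is positive semidefinite, the check succeeds for any sample, whereas an indefinite kernel would produce both positive and negative quadratic forms v^T K v.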

Smoothing Approximation.
Smoothing approximation is an effective method for solving nonsmooth optimization problems and has been widely used in the past decades. The main feature of the smoothing method is to approximate the nonsmooth functions by a class of parameterized smooth functions. In this paper, we adopt the smoothing functions defined as follows.
Chen and Mangasarian constructed a class of smooth approximations of the plus function (x)_+ := max{0, x} by convolution [20,24] as follows. Let ρ : R → R_+ be a piecewise continuous density function satisfying ρ(s) = ρ(−s) and ∫_R ρ(s) ds < ∞. Then the function φ(x, μ) := ∫_R (x − μs)_+ ρ(s) ds from R × R_+ to R_+ is well defined. By different density functions, many popular smoothing functions of (x)_+ can be derived, such as

φ_1(x, μ) = μ ln(1 + e^{x/μ}),
φ_2(x, μ) = (x + (x^2 + 4μ^2)^{1/2})/2,
φ_3(x, μ) = 0 if x < −μ/2; (x + μ/2)^2/(2μ) if −μ/2 ≤ x ≤ μ/2; x if x > μ/2,
φ_4(x, μ) = (μ/2) e^{x/μ} if x ≤ 0; x + (μ/2) e^{−x/μ} if x > 0,

where φ_1 is the neural networks smoothing function, φ_2 is called the CHKS (Chen-Harker-Kanzow-Smale) smoothing function, φ_3 is called the uniform smoothing function, and φ_4 is called the Picard smoothing function. The four functions belong to the class of the Chen-Mangasarian smoothing functions.
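To make the approximation concrete, the following Python sketch evaluates two of these smoothing functions, the neural networks function φ_1 and the CHKS function φ_2, and measures their maximal deviation from (x)_+ on a grid as μ decreases (the harness and variable names are ours):

```python
import numpy as np

def plus(x):
    # the plus function (x)_+ = max{0, x}
    return np.maximum(0.0, x)

def phi_nn(x, mu):
    # neural networks smoothing function: mu * ln(1 + exp(x/mu)), in overflow-safe form
    return mu * np.logaddexp(0.0, x / mu)

def phi_chks(x, mu):
    # CHKS (Chen-Harker-Kanzow-Smale) smoothing function
    return 0.5 * (x + np.sqrt(x * x + 4.0 * mu * mu))

x = np.linspace(-2.0, 2.0, 401)
for mu in (1.0, 0.1, 0.01):
    err_nn = np.max(np.abs(phi_nn(x, mu) - plus(x)))
    err_chks = np.max(np.abs(phi_chks(x, mu) - plus(x)))
    print(mu, err_nn, err_chks)
```

Both deviations shrink linearly in μ (the maximum is attained at x = 0, where φ_1 − (x)_+ = μ ln 2 and φ_2 − (x)_+ = μ), which is the sense in which the smoothing parameter controls the approximation error.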
Many nonsmooth functions can be reformulated by using the plus function; for example, |x| = (x)_+ + (−x)_+ and max{x, y} = x + (y − x)_+. So we can define a smoothing function for such nonsmooth functions through a smoothing function of (x)_+.
From Theorem 9.61 and Corollary 8.47(b) in [20], when f : R^n → R is locally Lipschitz continuous at x, the subdifferential associated with a smoothing function f̃ is nonempty and bounded, and ∂f(x) ⊆ con G_f̃(x), where "con" denotes the convex hull and G_f̃(x) collects the limits of ∇_x f̃(x_k, μ_k) as x_k → x and μ_k ↓ 0. In [20,23], it is shown that many smoothing functions satisfy gradient consistency, which is an important property of smoothing methods and guarantees that smoothing methods with adaptive updating schemes of the smoothing parameter converge to a stationary point of the original problem.

Pseudoconvex Function.
Pseudoconvex functions form a class of functions which may be nonsmooth or nonconvex but still give us the opportunity to find optimal solutions.
Definition 2 (see [25]). Let X be a nonempty convex subset of R^n. A differentiable function f is said to be pseudoconvex on X if, for any x, y ∈ X, ∇f(x)^T (y − x) ≥ 0 implies f(y) ≥ f(x). Many nonconvex functions arising in applications are pseudoconvex, such as the Butterworth filter function, fractional functions, and density functions. Of particular interest in this paper is the fact that the negated Gaussian function f(x) = −exp(−∑_{i=1}^n x_i^2/σ_i^2) with σ_i > 0, i = 1, 2, . . ., n, is pseudoconvex on R^n.
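The defining implication can be checked numerically. The following Python sketch samples random pairs x, y and verifies that whenever ∇f(x)^T(y − x) ≥ 0 we also have f(y) ≥ f(x) for a negated Gaussian, a nonconvex but pseudoconvex function (the test harness and the choice of σ are ours):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = np.array([1.0, 2.0])             # sigma_i > 0

def f(x):
    # negated Gaussian: nonconvex but pseudoconvex on R^n
    return -np.exp(-np.sum((x / sigma) ** 2))

def grad_f(x):
    # gradient of f: 2 (x_i / sigma_i^2) exp(-sum_i x_i^2 / sigma_i^2)
    return 2.0 * (x / sigma ** 2) * np.exp(-np.sum((x / sigma) ** 2))

violations = 0
for _ in range(5000):
    x = rng.uniform(-3.0, 3.0, 2)
    y = rng.uniform(-3.0, 3.0, 2)
    # pseudoconvexity: grad_f(x)^T (y - x) >= 0  must imply  f(y) >= f(x)
    if grad_f(x) @ (y - x) >= 0.0 and f(y) < f(x) - 1e-9:
        violations += 1

print(violations)  # 0
```

The check succeeds because f is an increasing transform of the convex function ∑_i x_i^2/σ_i^2, which preserves pseudoconvexity even though convexity is lost.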

Optimization Problems in MDF Continuous Hot-Pressing Process
In this section, we give the optimization models considered in this paper. First, by optimization and support vector machine theories, we present the optimization models for building up the relationships in the MDF continuous hot-pressing process. Then, another optimization model, for optimizing the parameters in the MDF continuous hot-pressing process, is obtained. Finally, we express these two kinds of problems in a unified formulation, which is the optimization problem studied in the remainder of this paper.

Optimization Problem for Building up the Models of MDF Continuous Hot-Pressing Process.
Denote X ⊂ R^n and Y ⊂ R as two sets, where x_i = (x_i1, x_i2, . . ., x_in) ∈ X is the attribute vector on behalf of the hot-pressing plate properties and y_i ∈ Y indicates the value of a quality index of the hot-pressing plate. Let T = (t_1, t_2, . . ., t_N) be the training data set of the hot-pressing process, where each sample t_i = (x_i, y_i) obeys an unknown distribution and the samples are IID (independent and identically distributed). Based on the support vector machine theory and the training data set, we would like to find a nonlinear function y = f(x) that approximates the training data set as closely as possible.
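On synthetic data, this regression step can be sketched with kernel ridge regression, a simplified stand-in for the support vector regression used in the paper (the toy data, kernel width, and regularization weight below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# toy stand-in for the training set: 4 process attributes -> one quality index
X = rng.uniform(0.0, 1.0, (40, 4))
y = np.sin(X @ np.array([1.0, 2.0, -1.0, 0.5])) + 0.05 * rng.standard_normal(40)

def rbf(A, B, sigma=0.5):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B
    sq = np.sum(A * A, 1)[:, None] + np.sum(B * B, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma ** 2))

lam = 1e-3                                            # regularization weight
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

def f_hat(Xnew):
    # fitted regressor f(x) = sum_i alpha_i k(x, x_i)
    return rbf(Xnew, X) @ alpha

mse = np.mean((f_hat(X) - y) ** 2)
print(mse)
```

The dual coefficients α solve (K + λI)α = y, and the fitted function has the expansion f(x) = ∑_i α_i k(x, x_i), the same kernel-expansion form as the regression functions used later in (19) and (20).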
Define the penalty function P and note that {z : P(z) ≤ 0} coincides with the feasible set of the original problem; then z is an optimal solution of the original constrained problem if and only if it is an optimal solution of the penalized problem (18), where the penalty parameter σ > 0 is determined explicitly by the problem data. Thus, we can build up the relation models of the hot-pressing process by solving problem (18).

Optimization Problem for Optimizing the Parameters in MDF Hot-Pressing Process.
Based on the relationships built up in Section 3.1, we focus on improving the modulus of rupture (MOR), modulus of elasticity (MOE), and internal bonding strength (IBS) of the hot-pressing plate by optimizing the process parameters and slab attributes. Suppose the regression functions f_1, f_2, f_3 of MOR, MOE, and IBS with respect to the relative parameters are given by (19) when based on the Gaussian radial basis function kernel and by (20) when based on the linear polynomial kernel.
The regression functions in (20) satisfy the following properties:
(i) f_i is continuously differentiable on R^n, i = 1, 2, 3.
From the physical significance of MOR, MOE, and IBS, we suppose that the larger the values of MOR, MOE, and IBS, the better the quality of the hot-pressing plate. In order to accommodate the different demands on the indexes of the hot-pressing plate in different applications, we consider the following two cases in this part.
First, we focus on maximizing a single performance index of the hot-pressing plate while the other two performance indexes stay within certain ranges. If we want to optimize the IBS, the corresponding optimization model can be expressed as minimizing −f_3(x) subject to the constraints in (21), where ν_1, ν_2, ν_3, ν_4 > 0 indicate the upper bounds of the hot-pressing temperature, hot-pressing pressure, hot-pressing time, and moisture content, and the remaining constraints describe the feasible regions of MOR and MOE, respectively.
In order to let problem (21) be solved effectively, we take the objective function f_3 with the Gaussian radial basis function kernel and the regression functions f_1 and f_2 in the constraints with the linear polynomial kernel. Then (21) can be rewritten as (22), which is a pseudoconvex optimization problem with convex constraints.
Second, we would like to optimize the MOR, MOE, and IBS synthetically. For this demand, we consider optimization model (23), which combines the three regression functions through a weighted objective. Both classes of problems above can be expressed in the unified form (24): minimize Θ(x) subject to g_i(x) ≤ 0 (i = 1, 2, . . ., m) and x ∈ Ω := {x ∈ R^n : u ≤ x ≤ v}, where u, v ∈ R^n with u < v, Θ : R^n → R is continuously differentiable and pseudoconvex on R^n, and each g_i : R^n → R (i = 1, 2, . . ., m) is continuously differentiable and convex on R^n. On the one hand, when Θ is the objective of (18) and u, v, and g are defined accordingly, problem (24) reduces to problem (18). On the other hand, when Θ := −f_3 and the g_i collect the constraints of (22), problem (24) reduces to problem (22). A similar reformulation can be done for problem (23) by (24). Therefore, problem (24) includes as special cases the optimization models for building up the relationships and for optimizing the relative parameters in the MDF continuous hot-pressing process.
In what follows, we denote by F the feasible region of (24), that is, F = {x ∈ Ω : g_i(x) ≤ 0, i = 1, 2, . . ., m}, and by M the optimal solution set of (24).

Proposed Neural Network.
In this subsection, we propose a one-layer recurrent neural network for solving problem (24), where we combine the penalty function and projection methods to handle the constraints and use smoothing techniques to overcome the nonsmoothness of the objective and penalty functions.
From the smoothing functions in (25) for the plus function, we define the smoothing function p̃ of the penalty function p. From the results in [23], p̃(x, μ) owns the following properties.
Lemma 4. p̃ is a smoothing function of p and satisfies the following:

Next, by the smoothing function θ̃(x, μ) := φ(x, μ) + φ(−x, μ) for the absolute value function, we define the smoothing function f̃ of f. Since θ̃(x, μ) = φ(x, μ) + φ(−x, μ), θ̃(x, μ) inherits the properties of the Chen-Mangasarian smoothing functions, and the following results hold.

Lemma 5. f̃ is a smoothing function of f in (24) with the following properties:

From the projected gradient method and the viscosity regularization method, we introduce the following neural network to solve (24):

ẋ(t) = P_Ω(x(t) − ∇_x f̃(x(t), ε(t)) − σ ∇_x p̃(x(t), ε(t)) − ε(t)x(t)) − x(t),   (34)

where P_Ω(x) = argmin_{u∈Ω} ‖u − x‖ is the projection operator onto Ω and ε(t) = ε_0/(t + 1) with ε_0 > 0. By the definitions of f̃ and p̃, (34) can be expressed in the componentwise form (35)-(36). To implement (34) by circuits, we can use the reformulated form (37). Equation (34) can be seen as a network whose input and output variables are x(t), ε(t), and the gradient signals in (37). A simple block structure of the proposed network (37) implemented by circuits is presented in Figure 1.
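The mechanism can be illustrated by integrating a simplified instance of dynamics in the spirit of (34) on a toy problem: a pseudoconvex (negated Gaussian) objective, one convex inequality constraint handled by a CHKS-smoothed penalty, a box set Ω handled by projection, and a decaying parameter ε(t) = ε_0/(t + 1). The penalty weight σ, the toy data, and the omission of the viscosity term ε(t)x are our simplifying assumptions, so this is a sketch of the idea rather than the paper's exact network:

```python
import numpy as np
from scipy.integrate import solve_ivp

u, v = np.zeros(2), np.ones(2)            # box Omega = [0, 1]^2
c = np.array([0.9, 0.9])                  # Gaussian center, infeasible for g(x) <= 0

def proj_box(x):
    # projection P_Omega: componentwise clipping onto [u, v]
    return np.minimum(np.maximum(x, u), v)

def grad_obj(x):
    # gradient of the pseudoconvex objective Theta(x) = -exp(-||x - c||^2)
    return 2.0 * (x - c) * np.exp(-np.sum((x - c) ** 2))

def grad_pen(x, eps):
    # gradient of the CHKS-smoothed penalty of g(x) = x1 + x2 - 1 <= 0
    g = x[0] + x[1] - 1.0
    dphi = 0.5 * (1.0 + g / np.sqrt(g * g + 4.0 * eps * eps))
    return dphi * np.ones(2)              # times grad g = (1, 1)

sigma, eps0 = 5.0, 1.0                    # penalty weight and initial smoothing parameter

def rhs(t, x):
    eps = eps0 / (t + 1.0)                # decaying smoothing parameter eps(t)
    return proj_box(x - grad_obj(x) - sigma * grad_pen(x, eps)) - x

sol = solve_ivp(rhs, (0.0, 200.0), np.array([0.2, 0.1]), rtol=1e-8, atol=1e-10)
x_star = sol.y[:, -1]
print(x_star, x_star[0] + x_star[1])      # drifts toward the boundary x1 + x2 = 1
```

As ε(t) decreases, the smoothed penalty approaches the exact penalty (g(x))_+ and the state drifts toward the constrained maximizer of the Gaussian, i.e., the feasible point closest to the center c.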

Theoretical Analysis.
In this subsection, we study some necessary dynamical and optimality properties of the proposed network (34) for solving (24).
The global existence of the solutions of (34) is a necessary condition for its usability in optimization. With any initial point x_0 ∈ Ω, the solution of (34) exists globally. Moreover, the uniqueness of the solution of (34) with x_0 ∈ Ω is proved under some mild conditions. The proposed network (34) can be implemented in circuits and mathematical software. Then, the feasibility and optimality of network (34) for optimization problem (24) are proved theoretically.
A Lipschitz condition is often used to guarantee the uniqueness of the solution of a neural network. In what follows, we give a sufficient condition to ensure the uniqueness of the solution of (34) with initial point x_0 ∈ Ω.

Proposition 7. For any initial point x_0 ∈ Ω, if ∇_x f̃(⋅, μ) and ∇_x p̃(⋅, μ) are locally Lipschitz continuous for any fixed μ ∈ (0, 1], then the solution of neural network (34) is unique.
The Lyapunov method is employed to analyze the performance of (34). Here, we introduce the following two Lyapunov energy functions. The two Lyapunov functions satisfy the following estimations along the solutions of (34).

Lemma 8. (i) The derivative of the first Lyapunov function along the solution of (34) can be calculated directly, which yields the estimation in (i). Similar to the analysis in (i), we obtain the estimation in (ii).
Next, we prove the efficiency of the proposed network (34) for solving optimization problem (24), where the convergence of its solutions to the feasible set is a basic property.
In what follows, we prove that lim_{t→+∞} p̃(x(t), ε(t)) = 0. Arguing by contradiction, we assume that this limit does not hold. Integrating the resulting inequality from t_3 to t (> t_3), we obtain an upper bound that tends to −∞ as t → +∞, which contradicts p̃(x, μ) ≥ 0 for all x ∈ R^n and t ∈ [0, +∞). Therefore, lim_{t→+∞} p̃(x(t), ε(t)) = 0, which guarantees that every accumulation point of x(t) is feasible. The following theorem indicates that any accumulation point of the solutions of (34) is in fact an optimal solution of (24).
For the remaining case, similar to the analysis in Case 1, the same limit holds.

Numerical Experiments
In this section, we test the proposed neural network (34) for solving problem (24), which is brought forward from the MDF continuous hot-pressing process. Based on an existing data set, we use the established theory and the proposed neural network (34) to build the relationships between the main qualities of the hot-pressing plate and the relative technology parameters via optimization problem (18). Then, based on optimization problem (22), we use the proposed network (34) to compute the optimal values of the technology parameters in the hot-pressing system for optimizing the qualities of the hot-pressing plate. All these numerical experiments validate the good performance of the proposed network.
The numerical testing was carried out on a Lenovo PC (3.00 GHz, 2.00 GB of RAM) with Matlab 7.4, and we use ode23 to realize the neural network (34) in Matlab.

Construction of Relation Models in MDF Continuous Hot-Pressing Process.
In this part, using optimization problem (18) and network (34), we build the relation models that take the hot-pressing temperature (TE), hot-pressing pressure (PR), hot-pressing time (TI), and moisture content (MC) of the slab as the argument variables and the MOR, MOE, and IBS indexes of MDF as the dependent variables. The numerical results show a good fit of the built models to the data set, which is given in Table 4.
In order to use the data in Table 4, we first normalize them into [0, 1]. We use the mean square error (MSE), degree of fitting (DF), and sufficient evaluation (SE) to evaluate the numerical results, where y and y* indicate the actual and predicted values and the parameters in SE are defined as in (18). The smaller the MSE and SE and the closer the DF is to 1, the better the regression result. Moreover, the SE function is the objective function in (15). Based on the Gaussian radial basis function kernel, the values of the initial parameters in problem (18) are given in Table 1. With a random initial point x_0 ∈ Ω, the numerical results with respect to the obtained solution are also listed in Table 1. Figures 2-4 illustrate the fitting effect for the MOR, MOE, and IBS values obtained by using the proposed network (34) to solve problem (18). The parameters of the regression functions in (19) are shown in Table 5. Moreover, the regression functions based on the linear polynomial kernel in (20) are also calculated by network (34), and their parameters are likewise shown in Table 5.
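The evaluation criteria can be sketched as follows; the DF formula here is the usual R²-style degree of fitting, and whether it matches the paper's exact definition is an assumption, as are the illustrative data values:

```python
import numpy as np

def mse(y, y_pred):
    # mean square error
    return float(np.mean((y - y_pred) ** 2))

def degree_of_fitting(y, y_pred):
    # assumed R^2-style degree of fitting: 1 - SS_res / SS_tot
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def normalize01(z):
    # min-max normalization into [0, 1], as applied to the raw data before training
    return (z - z.min()) / (z.max() - z.min())

# illustrative actual vs. predicted quality-index values (not taken from Table 4)
y = np.array([30.1, 28.4, 33.0, 31.2, 29.8])
y_hat = np.array([30.0, 28.9, 32.5, 31.0, 30.1])
print(mse(y, y_hat), degree_of_fitting(y, y_hat))
```

A DF close to 1 together with a small MSE corresponds to the good-fit situation reported for the built regression models.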

Optimization of Parameters in MDF Continuous Hot-Pressing Process.
In this subsection, we consider two classes of parameter optimization problems: one aims at optimizing a single performance index of MDF, and the other comprehensively optimizes the three performance indexes of MDF. Based on the obtained relation models for a particular slab and hot-pressing system, we give a suggestion on the setting of the hot-pressing pressure, hot-pressing temperature, hot-pressing time, and moisture content of the slab so that the MDF meets the given requirements. We refer to the current standard for MDF indoor plate in China (GB/T 11718-1999), which is given in Table 2.

Case 1. When the MOR and MOE of the hot-pressing plate lie in certain regions, we would like to maximize the IBS by controlling the hot-pressing pressure, hot-pressing temperature, hot-pressing time, and moisture content of the slab. Then, Ω = {x ∈ R^4 : 0 ≤ x ≤ ν}. By the proposed network (34) for solving (21), we obtain the optimal solution x* = (159.4725, 3.1022, 6.4866, 6.0061)^T.
This means that when we set the hot-pressing temperature to 159.4725 °C, the hot-pressing pressure to 3.1022 MPa, the hot-pressing time to 6.4866 min, and the moisture content of the slab to 6.0061%, the IBS of the hot-pressing plate is maximized while its MOR and MOE stay in the required regions; the resulting three performance indexes are shown in Table 3.

Case 2. We would like to maximize the MOR, MOE, and IBS synthetically by controlling the hot-pressing pressure, hot-pressing temperature, hot-pressing time, and moisture content of the slab.
In this case, we use optimization problem (23) with

Figure 2: Normalized sample fitting results of network (34) for the regression of MOR.

Figure 3: Normalized sample fitting results of network (34) for the regression of MOE.

Figure 4: Normalized sample fitting results of network (34) for the regression of IBS.
where f*_1, f*_2, and f*_3 are the expected values of MOR, MOE, and IBS. In particular, if MOR, MOE, and IBS are of the same importance for the quality of the hot-pressing plate, we can let λ_1 = λ_2 = λ_3 = 1/3; and we can let λ_1 = 1/2, λ_2 = 1/3, and λ_3 = 1/6 if the importance of MOR, MOE, and IBS is strictly monotonically decreasing. Similar to the kernel functions in problem (21), we let f_1, f_2, and f_3 in problem (23) be with the Gaussian radial basis function kernel, which means that problem (23) is also a pseudoconvex optimization problem with convex constraints.

Table 1: Initial parameter values and numerical results for problem (18).

Table 2: Criterion of physical and mechanical performance indexes.

Table 3: Performance index values of the hot-pressing plate obtained in Case 1.

Table 4: Experimental data set in MDF hot-pressing.

Table 5: Parameter values in the regression functions.