On the Relationship Between Regression Analysis and Mathematical Programming

The interaction between linear and quadratic programming and regression analysis is explored using both statistical and operations research methods. Estimation and optimization problems are formulated in two different ways: on one hand, linear and quadratic programming problems are formulated and solved by statistical methods; on the other hand, the solution of the linear regression model with constraints makes use of the simplex methods of linear or quadratic programming. Examples are given to illustrate the ideas.


Introduction
We will discuss the interaction between linear and quadratic programming and regression analysis. These interactions are considered both from a statistical point of view and from an optimization point of view. We also examine the algorithms established by both statistical and operations research methods. Minimizing the sum of the absolute values of the regression residuals has been shown to reduce to a general linear programming problem (Wagner, 1969), and Wolfe (1959) hinted that his method can be applied to regression, but no analysis was done. Estimation and optimization problems are formulated in two different ways: on one hand, linear and quadratic programming problems are formulated and solved by statistical methods; on the other hand, the solution of the linear regression model with constraints makes use of the simplex methods of linear or quadratic programming.

Regression models
Consider the linear regression (LR) model with nonnegativity constraints

Y = Xβ + ε, β ≥ 0, (1)

where Y ∈ R^n is the vector of responses, X is an n × p design matrix, β ∈ R^p is the vector of unknown parameters of the model, β ≥ 0 means that every element of β is nonnegative, and ε ∈ R^n is the random error term of the LR model.
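As an illustrative sketch (not from the paper), the one-parameter case y = xβ + ε with β ≥ 0 can be solved in closed form: the KKT conditions reduce to clipping the unconstrained least-squares estimate at zero. The function name and data below are made up for illustration.

```python
# Nonnegative least squares for the one-parameter model y = x*beta + e,
# beta >= 0.  The unconstrained minimizer is sum(x*y)/sum(x*x); the KKT
# conditions then clip it at zero (illustrative sketch only).

def nnls_one_param(x, y):
    """Minimize sum((y_i - x_i*beta)^2) subject to beta >= 0."""
    beta = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    return max(0.0, beta)

print(nnls_one_param([1.0, 2.0, 3.0], [2.0, 4.1, 5.9]))  # about 1.993
print(nnls_one_param([1.0, 2.0], [-1.0, -2.0]))          # clipped to 0.0
```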
A general linear regression model with inequality constraints (LRWC) and nonnegative variables is given as follows:

Y = Xβ + ε, Aβ ≥ C, β ≥ 0, (2)

where β ∈ R^p is the unknown vector; X_{n×p} (n ≥ p) and A_{s×p} (s ≤ p) are constant matrices; Y_{n×1}, C_{s×1}, and ε_{n×1} are column vectors; ε ∼ N(0, σ²I); X^T X ≥ 0; and rank(A) = s. The solution of LRWC is characterized by the Karush-Kuhn-Tucker theorem. Algorithms for solving LRWC are discussed in Lawson and Hanson (1974) and Whittle (1971).

Mathematical programming
Since maximizing f(x) is equivalent to minimizing −f(x), we will focus our attention only on minimization problems. A primal linear programming (LP) problem with nonnegative solution can be formulated as follows:

min b^T β subject to Gβ ≥ f, β ≥ 0, (3)

where β is the unknown vector, 0 ≠ b ∈ R^p and f ∈ R^m are known constant vectors, and G_{m×p} is a known constant matrix.
A quadratic programming (QP) problem in which all the variables must be nonnegative is formulated as follows:

min Z(β) = b^T β + β^T Dβ subject to Aβ ≥ C, β ≥ 0, (4)

where A_{s×p} (s ≤ p) and D_{p×p} are matrices; C_{s×1}, β_{p×1}, and b_{p×1} are column vectors; rank(A) = s ≤ p; and D is a symmetric, positive definite matrix.
In the next section, we further explore the relationships between the above four models. The aim of this note is to provide a strong link, and algorithms, between these concepts; in fact, they are equivalent in some cases. Parameter estimates of models (1) and (2) are obtained by the simplex method and the Karush-Kuhn-Tucker theorem. The optimization problems of models (3) and (4) are restated and solved by statistical methods.
Solving the Linear Regression Problem Using the Simplex Method

For the simple LR model y = β0 + β1 x + ε, the associated system of normal equations is given as follows:

nβ̂0 + β̂1 Σx_i = Σy_i,
β̂0 Σx_i + β̂1 Σx_i^2 = Σx_i y_i.
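A minimal sketch of solving these normal equations directly, with made-up data (the function name and data are illustrative, not the paper's data set):

```python
# Solve the 2x2 normal equations for simple linear regression
#   n*b0  + Sx*b1  = Sy
#   Sx*b0 + Sxx*b1 = Sxy
# by Cramer's rule (illustrative data only).

def fit_simple_lr(x, y):
    n = len(x)
    Sx, Sy = sum(x), sum(y)
    Sxx = sum(xi * xi for xi in x)
    Sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * Sxx - Sx * Sx
    b0 = (Sy * Sxx - Sx * Sxy) / det
    b1 = (n * Sxy - Sx * Sy) / det
    return b0, b1

b0, b1 = fit_simple_lr([1, 2, 3, 4], [3, 5, 7, 9])
print(b0, b1)  # exact fit: y = 1 + 2x
```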
Therefore, model (1) is equivalent to the following mathematical programming problem: find β̂0 ≥ 0 and β̂1 ≥ 0 satisfying the system of normal equations. (5)

We will use phase I of the two-phase version of the simplex method to solve (5). The problem to be solved by phase I is

min R1 + R2
subject to nβ̂0 + β̂1 Σx_i + R1 = Σy_i,
β̂0 Σx_i + β̂1 Σx_i^2 + R2 = Σx_i y_i,
β̂0, β̂1, R1, R2 ≥ 0, (6)

where R1 and R2 are artificial variables. The optimization problem (6) can be rewritten as (7). The initial values are summarized in Table 1.
Bearing in mind that the solution of the LR problem is the solution of the corresponding system of normal equations, it is easy to see that problem (7) is equivalent to solving the related LR problem (1). Hence we can obtain the optimal solution of model (1) by applying the simplex method to problem (7) with the initial values in Table 1. The same approach can be applied to linear regression models (1) with p > 2. Next, we will illustrate the above ideas by an example. From the given data set, we obtain Σx_i^2 = 80,199 and n = 18. Hence, Table 1 becomes the corresponding numerical tableau, and using the simplex method the optimal table is obtained in two iterations. Therefore, β̂0 = 10.727, β̂1 = 0.873, and the fitted linear regression line is ŷ = 10.727 + 0.873x.
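The phase-I construction above can be sketched generically: attach artificial variables to the normal equations and drive their sum to zero with the simplex method. The tableau routine below is a standard textbook phase I with made-up data, not the paper's exact tableau.

```python
# Phase-I simplex: minimize R1 + R2 subject to
#   n*b0  + Sx*b1  + R1 = Sy
#   Sx*b0 + Sxx*b1 + R2 = Sxy,   b0, b1, R1, R2 >= 0.
# At the optimum R1 = R2 = 0 and (b0, b1) solves the normal equations.

def phase_one_simplex(A, b):
    m, n = len(A), len(A[0])
    # constraint rows with an identity block for the artificial variables
    T = [row[:] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i, row in enumerate(A)]
    # phase-I reduced costs: -column sums for the original variables,
    # zero for the (basic) artificials
    cost = [-sum(A[i][j] for i in range(m)) for j in range(n)] \
           + [0.0] * m + [-sum(b)]
    basis = [n + i for i in range(m)]
    while True:
        j = min(range(n + m), key=lambda k: cost[k])  # entering column
        if cost[j] >= -1e-9:
            break
        ratios = [(T[i][-1] / T[i][j], i) for i in range(m) if T[i][j] > 1e-9]
        _, r = min(ratios)                            # leaving row
        piv = T[r][j]
        T[r] = [v / piv for v in T[r]]
        for i in range(m):
            if i != r and abs(T[i][j]) > 1e-12:
                f = T[i][j]
                T[i] = [v - f * w for v, w in zip(T[i], T[r])]
        f = cost[j]
        cost = [v - f * w for v, w in zip(cost, T[r])]
        basis[r] = j
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x

# normal equations for the illustrative data x = [1,2,3,4], y = [3,5,7,9]:
#   4*b0 + 10*b1 = 24 and 10*b0 + 30*b1 = 70
print(phase_one_simplex([[4.0, 10.0], [10.0, 30.0]], [24.0, 70.0]))  # ~[1, 2]
```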

Solving the Least Squares Problem With Constraints Using NLP Methods
Applying the least squares method to model (2), we obtain a general regression problem (9). Problem (9) is a quadratic programming problem and thus can be solved by Wolfe's method, which is based on the Karush-Kuhn-Tucker conditions. Rewriting problem (9) as a quadratic programming problem leads to (10), where the coefficients are computed from the data as in the previous example. Example 3.1: Use the given data set to estimate the parameters of a simple linear regression model with additional restrictions imposed on the parameters of the model. Let us solve this problem by employing nonlinear programming ideas. First, we rewrite the problem in quadratic programming form using the previously calculated values of b1, b2, c1 and c2. Then, solving the resulting quadratic programming problem by Wolfe's method confirms the optimal values β̂0 = 39.6484 and β̂1 = 580.151.
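The reduction from the LS objective to QP data can be sketched as follows: expanding (Y − Xβ)^T(Y − Xβ) gives β^T(X^T X)β − 2(X^T Y)^T β + Y^T Y, so the QP matrices are D = X^T X and b = −2 X^T Y (the constant Y^T Y does not affect the minimizer). A minimal sketch with illustrative numbers, not the paper's data:

```python
# Build the QP data (D, b) from a least-squares problem:
#   (Y - X b)^T (Y - X b) = b^T (X^T X) b - 2 (X^T Y)^T b + Y^T Y,
# so D = X^T X and the linear term is -2 X^T Y.

def ls_to_qp(X, Y):
    p = len(X[0])
    D = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(p)]
         for r in range(p)]
    b = [-2.0 * sum(X[i][r] * Y[i] for i in range(len(X))) for r in range(p)]
    return D, b

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
Y = [2.0, 3.0, 5.0]
D, b = ls_to_qp(X, Y)
print(D)  # [[3.0, 6.0], [6.0, 14.0]]
print(b)  # [-20.0, -46.0]
```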

Solving QP Problem Using the Least Squares Method
The relationship between the quadratic programming problem (4) and the least squares method (9) is studied by Wang, Chukova and Lai (2003).

Theorem 1: The relationship between QP (4) and LS (9) is given by Z0 = (Y − Xβ)^T(Y − Xβ), where X is a real upper triangular matrix with positive diagonal elements satisfying X^T X = D and Y = −(1/2)(X^T)^{-1} b. Hence, minimizing Z0 is equivalent to minimizing Z(β). Moreover, when b = 0, we have Y = 0.

Let us consider the least squares problem similar to (9) where all the constraints are equalities, i.e., Aβ = C. Using the Lagrangian method, we obtain the corresponding normal equation (11).

Theorem 2: Let β* be the solution of (11) and β̂0 be the solution of the linear regression model with no constraints. Then the relationship between β* and β̂0 is

β* = β̂0 + (X^T X)^{-1} A^T H^{-1} (C − A β̂0),

where H = A(X^T X)^{-1} A^T is a hat matrix (Sen, 1990).
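Theorem 1 can be checked numerically: take X as the upper triangular Cholesky factor of D and Y = −(1/2)(X^T)^{-1} b; then ‖Y − Xβ‖² and Z(β) = b^T β + β^T Dβ differ only by the constant Y^T Y. A 2 × 2 sketch with made-up numbers (the helper names are illustrative):

```python
import math

# Theorem 1 sketch: with X upper triangular, X^T X = D (Cholesky factor)
# and Y = -(1/2) (X^T)^{-1} b, the QP objective b^T beta + beta^T D beta
# equals ||Y - X beta||^2 minus the constant Y^T Y.

def cholesky_2x2(D):
    r11 = math.sqrt(D[0][0])
    r12 = D[0][1] / r11
    r22 = math.sqrt(D[1][1] - r12 * r12)
    return [[r11, r12], [0.0, r22]]  # upper triangular X

def theorem1_check(D, b, beta):
    X = cholesky_2x2(D)
    # forward substitution for X^T Y = -b/2 (X^T is lower triangular)
    y1 = -b[0] / 2.0 / X[0][0]
    y2 = (-b[1] / 2.0 - X[0][1] * y1) / X[1][1]
    Y = [y1, y2]
    r = [Y[0] - X[0][0] * beta[0] - X[0][1] * beta[1],
         Y[1] - X[1][1] * beta[1]]
    Z0 = r[0] ** 2 + r[1] ** 2                       # ||Y - X beta||^2
    Z = (b[0] * beta[0] + b[1] * beta[1]
         + D[0][0] * beta[0] ** 2 + 2 * D[0][1] * beta[0] * beta[1]
         + D[1][1] * beta[1] ** 2)                   # b^T beta + beta^T D beta
    const = Y[0] ** 2 + Y[1] ** 2                    # Y^T Y
    return Z0, Z + const                             # equal for every beta

print(theorem1_check([[4.0, 2.0], [2.0, 3.0]], [1.0, -2.0], [0.5, 1.5]))
```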
Based on Theorem 1 and Theorem 2, Wang, Chukova and Lai (2003) developed a stepwise algorithm for reducing and solving QP problem (4) with regression analysis. The following example illustrates the algorithm.

Example 4.1: Consider Example 2.1 (Rencher, 2000, pp. 113-114): the exam scores y and homework scores x (average value) for 18 students in a statistics class. The solution to this QP by Wolfe's method is found to be Z^T = (x1, x2) = (3/4, 5/8). Let us apply our algorithm for reducing the above QP to a LS problem; the above is a matrix representation of model (4).

Step 1: Find the matrices X and Y and convert the QP problem to a LS problem.
Step 2: Solve the resulting LS problem and obtain the unconstrained solution β*.

Step 3: Verify whether Aβ* ≥ C. In this case both constraints in model (4) are violated. Thus we have to solve the following two problems: first, we solve each single-constraint equality problem A_i β = C_i and obtain β^(i), i = 1, 2.

It is easy to check that the constraint Aβ^(i) ≥ C, for i = 1, 2, is not satisfied. Hence, we solve the problem with both constraints active, Aβ = C, which gives β^(1,2). The constraint Aβ^(1,2) ≥ C is now satisfied. Thus, the optimal solution is β* = β^(1,2).
The above solution confirms the previous result obtained by Wolfe's method.
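The stepwise idea can be sketched for a single inequality constraint: solve the unconstrained LS problem, and if the constraint a^T β ≥ c is violated, re-solve with it active using the Theorem 2 correction β* = β̂ + S a (c − a^T β̂)/(a^T S a), where S = (X^T X)^{-1}. The data and helper names below are illustrative, not the paper's example; the full algorithm handles several constraints.

```python
# Single-constraint version of the stepwise LS algorithm (two parameters).

def solve2(M, v):
    """Solve a 2x2 linear system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * v[0] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def constrained_ls(X, Y, a, c):
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(2)]
           for i in range(2)]
    XtY = [sum(row[i] * yi for row, yi in zip(X, Y)) for i in range(2)]
    beta = solve2(XtX, XtY)                  # unconstrained LS solution
    if a[0] * beta[0] + a[1] * beta[1] >= c:
        return beta                          # constraint already satisfied
    Sa = solve2(XtX, a)                      # S a with S = (X^T X)^{-1}
    t = (c - a[0] * beta[0] - a[1] * beta[1]) / (a[0] * Sa[0] + a[1] * Sa[1])
    return [beta[0] + t * Sa[0], beta[1] + t * Sa[1]]

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
Y = [2.0, 3.0, 5.0]
print(constrained_ls(X, Y, [0.0, 1.0], 2.0))  # forces the slope up to 2
```

For this data the unconstrained fit has slope 1.5; activating the constraint fixes the slope at 2 and re-centers the intercept, matching the direct equality-constrained solution.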

Conclusions
A linear regression model is solved by the two-phase version of the simplex method, and a statistical algorithm for solving the quadratic programming problem is proposed. In comparison with nonlinear programming methods for solving QP, our algorithm has the following advantages: (a) Statistics courses form a core portion of most bachelor-level degree programs, so an algorithm based on basic statistical concepts is easy to understand, learn and apply.
(b) Some of the steps of the algorithm are available as built-in functions or procedures in many commonly used software packages, such as MAPLE and MATHEMATICA.
(c) The algorithm avoids the usage of slack and artificial variables.


