On the Performance of Principal Component Liu-Type Estimator under the Mean Square Error Criterion

Wu (2013) proposed the principal component Liu-type estimator to overcome multicollinearity. This estimator is a general estimator that includes the ordinary least squares estimator, the principal component regression estimator, the ridge estimator, the Liu estimator, the Liu-type estimator, the r-k class estimator, and the r-d class estimator as special cases. In this paper, we first use a new method to derive the principal component Liu-type estimator; we then study the superiority of the new estimator under the scalar mean square error criterion. Finally, we give a numerical example to illustrate the theoretical results.


Introduction
Consider the multiple linear regression model
$$y = X\beta + \varepsilon, \qquad (1)$$
where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times p$ known matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of disturbances with expectation $E(\varepsilon) = 0$ and variance-covariance matrix $\operatorname{Cov}(\varepsilon) = \sigma^2 I_n$.
According to the Gauss-Markov theorem, the classical ordinary least squares estimator (OLSE) is obtained as
$$\hat{\beta} = (X'X)^{-1}X'y.$$
The OLSE has been regarded as the best estimator for a long time. However, when multicollinearity is present and the matrix $X'X$ is ill-conditioned, the OLSE is no longer a good estimator. Many ways have been proposed to improve on the OLSE. One way is to consider biased estimators, such as the principal component regression estimator [1], the ridge estimator [2], the Liu estimator [3], the Liu-type estimator [4], the two-parameter ridge estimator [5], the r-k class estimator [6], the r-d class estimator [7], and the modified r-k class estimator [8].
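As a concrete illustration of this ill-conditioning, the following sketch (Python with NumPy; the data are hypothetical, not from the paper) builds a design matrix with two nearly collinear column pairs and computes the OLSE $(X'X)^{-1}X'y$ together with the condition number of $X'X$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30

# Hypothetical data: columns 3 and 4 are near-copies of columns 1 and 2,
# so X'X is ill-conditioned (severe multicollinearity).
z = rng.standard_normal((n, 2))
X = np.column_stack([
    z,
    z[:, 0] + 0.01 * rng.standard_normal(n),
    z[:, 1] + 0.01 * rng.standard_normal(n),
])
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# OLSE: beta_hat = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# A large condition number of X'X signals that the OLSE is unstable.
cond = np.linalg.cond(X.T @ X)
```

The large condition number is exactly the situation in which the biased estimators below are preferred.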
An alternative method to overcome multicollinearity is to consider restrictions. Xu and Yang [9] introduced a stochastic restricted Liu estimator; Li and Yang [10] introduced a stochastic restricted ridge estimator.
To overcome multicollinearity, Hoerl and Kennard [2] solved the problem
$$\min_{\beta}\{(y - X\beta)'(y - X\beta) + k(\beta'\beta - c)\},$$
where $k$ is a Lagrangian multiplier and $c$ is a constant, and obtained the ridge estimator (RE)
$$\hat{\beta}(k) = (X'X + kI)^{-1}X'y.$$
Liu [3] introduced the Liu estimator (LE)
$$\hat{\beta}(d) = (X'X + I)^{-1}(X'y + d\hat{\beta}),$$
where $\hat{\beta}$ is the OLSE. This estimator can be obtained by solving the problem
$$\min_{\beta}\{(y - X\beta)'(y - X\beta) + (d\hat{\beta} - \beta)'(d\hat{\beta} - \beta)\}.$$
It can also be obtained in the following way. Suppose that $\hat{\beta}$ satisfies the stochastic restriction $d\hat{\beta} = \beta + \epsilon'$. Applying the mixed estimation method [11], we obtain the Liu estimator.
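The ridge and Liu estimators can be sketched as follows (Python/NumPy; `ridge` and `liu` are hypothetical helper names, assuming the standard forms $\hat{\beta}(k) = (X'X + kI)^{-1}X'y$ and $\hat{\beta}(d) = (X'X + I)^{-1}(X'y + d\hat{\beta})$):

```python
import numpy as np

def ridge(X, y, k):
    # RE: (X'X + kI)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def liu(X, y, d):
    # LE: (X'X + I)^{-1} (X'y + d * beta_ols), with beta_ols the OLSE
    p = X.shape[1]
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    return np.linalg.solve(X.T @ X + np.eye(p), X.T @ y + d * beta_ols)
```

As sanity checks, `ridge(X, y, 0.0)` and `liu(X, y, 1.0)` both coincide with the OLSE, since $k = 0$ removes the ridge penalty and $(X'X + I)^{-1}(X'y + \hat{\beta}) = \hat{\beta}$.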
Let us consider the following transformation of model (1). Let $T = (T_r, T_{p-r})$ be the orthogonal matrix such that
$$T'X'XT = \Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_p),$$
where $\Lambda_r = \operatorname{diag}(\lambda_1, \ldots, \lambda_r)$ and $\Lambda_{p-r} = \operatorname{diag}(\lambda_{r+1}, \ldots, \lambda_p)$ are diagonal matrices whose main diagonal elements are the ordered eigenvalues $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p > 0$ of $X'X$. Writing $Z = XT$ and $\alpha = T'\beta$, the principal component regression estimator (PCRE) [1] is
$$\hat{\beta}_r = T_r(T_r'X'XT_r)^{-1}T_r'X'y.$$
Baye and Parker [6] applied the ridge approach to improve the PCRE and obtained the r-k class estimator
$$\hat{\beta}_r(k) = T_r(T_r'X'XT_r + kI_r)^{-1}T_r'X'y.$$
Alternatively, Kaçıranlar and Sakallıoğlu [7] introduced the r-d class estimator, the combination of the LE and the PCRE, which is defined as
$$\hat{\beta}_r(d) = T_r(T_r'X'XT_r + I_r)^{-1}(T_r'X'y + d\hat{\alpha}_r),$$
where $\hat{\alpha}_r = \Lambda_r^{-1}T_r'X'y$. Wu [12] proposed the principal component Liu-type estimator (PCTTE), which is defined as
$$\hat{\beta}_r(k, d) = T_r(T_r'X'XT_r + kI_r)^{-1}(T_r'X'y - d\hat{\alpha}_r).$$
In this paper, we first use a new method to derive the principal component Liu-type estimator. Then, we show that, under certain conditions, the PCTTE is superior to the related estimators under the mean square error criterion. Finally, we give a numerical example to illustrate the theoretical results.
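A sketch of the PCTTE in this notation (Python/NumPy; `pctte` is a hypothetical helper, assuming the form $\hat{\beta}_r(k, d) = T_r(T_r'X'XT_r + kI_r)^{-1}(T_r'X'y - d\hat{\alpha}_r)$ with $\hat{\alpha}_r = \Lambda_r^{-1}T_r'X'y$):

```python
import numpy as np

def pctte(X, y, r, k, d):
    """PCTTE sketch: T_r (T_r'X'X T_r + k I_r)^{-1} (T_r'X'y - d*alpha_hat_r),
    with alpha_hat_r = Lambda_r^{-1} T_r'X'y (a reconstructed, assumed form)."""
    S = X.T @ X
    lam, T = np.linalg.eigh(S)            # eigh returns ascending eigenvalues
    order = np.argsort(lam)[::-1]         # reorder so T_r spans the leading PCs
    lam, T = lam[order], T[:, order]
    Tr = T[:, :r]
    alpha_r = (Tr.T @ X.T @ y) / lam[:r]  # alpha_hat_r
    rhs = Tr.T @ X.T @ y - d * alpha_r
    return Tr @ np.linalg.solve(Tr.T @ S @ Tr + k * np.eye(r), rhs)
```

With $r = p$ and $k = d = 0$ this reduces to the OLSE; with $d = 0$ it gives the r-k class estimator, and with $r = p$ the Liu-type estimator.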

The Principal Component Liu-Type Estimator
Using the symbols above, model (1) can be written as
$$y = Z_r\alpha_r + Z_{p-r}\alpha_{p-r} + \varepsilon.$$
The PCRE can be obtained by omitting $Z_{p-r}\alpha_{p-r}$, so that the model reduces to
$$y = Z_r\alpha_r + \varepsilon.$$
Then, solving the problem
$$\min_{\alpha_r}(y - Z_r\alpha_r)'(y - Z_r\alpha_r),$$
we obtain
$$\hat{\alpha}_r = (Z_r'Z_r)^{-1}Z_r'y.$$
Transforming $\hat{\alpha}_r$ back to the original parameter space, we get the PCRE $\hat{\beta}_r = T_r\hat{\alpha}_r$. Now we give a method to obtain the principal component Liu-type estimator. Let $d$ be a constant and $k$ a Lagrangian multiplier, and minimize
$$(y - Z_r\alpha_r)'(y - Z_r\alpha_r) + k\left[\left(\alpha_r + \frac{d}{k}\hat{\alpha}_r\right)'\left(\alpha_r + \frac{d}{k}\hat{\alpha}_r\right) - c\right],$$
where $\hat{\alpha}_r = (Z_r'Z_r)^{-1}Z_r'y$. Then we get
$$\hat{\alpha}_r(k, d) = (Z_r'Z_r + kI_r)^{-1}(Z_r'y - d\hat{\alpha}_r).$$
After transforming back to the original parameter space, we obtain
$$\hat{\beta}_r(k, d) = T_r(Z_r'Z_r + kI_r)^{-1}(Z_r'y - d\hat{\alpha}_r).$$
This estimator can also be obtained by minimizing the function
$$(y - Z_r\alpha_r)'(y - Z_r\alpha_r) + \frac{1}{k}(d\hat{\alpha}_r + k\alpha_r)'(d\hat{\alpha}_r + k\alpha_r).$$
It is easy to see that the new estimator $\hat{\beta}_r(k, d)$ includes the estimators mentioned in the introduction as special cases: for $r = p$ it reduces to the Liu-type estimator; for $d = 0$ it reduces to the r-k class estimator; for $k = 0$ and $d = 0$ it reduces to the PCRE; and for $r = p$, $k = 0$, and $d = 0$ it reduces to the OLSE.
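As a quick numerical check of this derivation (under the reconstructed objective, which is an assumption), the gradient of the penalized criterion $F(a) = \lVert y - Z_r a\rVert^2 + \frac{1}{k}\lVert d\hat{\alpha}_r + ka\rVert^2$ vanishes at the closed-form solution:

```python
import numpy as np

# Gradient of F(a) = ||y - Z_r a||^2 + (1/k) ||d*alpha_hat_r + k*a||^2 is
#   -2 Z_r'(y - Z_r a) + 2 (d*alpha_hat_r + k*a),
# which should vanish at a = (Z_r'Z_r + k I)^{-1} (Z_r'y - d*alpha_hat_r).
rng = np.random.default_rng(1)
n, r = 20, 3
Zr = rng.standard_normal((n, r))   # hypothetical reduced design Z_r
y = rng.standard_normal(n)
k, d = 0.5, 0.3

alpha_hat = np.linalg.solve(Zr.T @ Zr, Zr.T @ y)   # unrestricted minimizer
a_star = np.linalg.solve(Zr.T @ Zr + k * np.eye(r), Zr.T @ y - d * alpha_hat)
grad = -2 * Zr.T @ (y - Zr @ a_star) + 2 * (d * alpha_hat + k * a_star)
```

The gradient is zero up to floating-point error, confirming that the closed form solves the first-order condition.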
Letting $k = 0$ and $d = 0$ in (26), we obtain the MSE of the PCRE $\hat{\beta}_r$ as a special case. In order to compare $\hat{\beta}_r(k, d)$ and $\hat{\beta}_r$, we now consider the difference $\operatorname{MSE}(\hat{\beta}_r) - \operatorname{MSE}(\hat{\beta}_r(k, d))$. Since the lower bound of $d$ is less than 1, the lower bound of $d$ may be less than 0. If the lower bound of $d$ is less than 0, then we can choose any $d$ in $[0, 1]$. Thus, we can get the following theorem.
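The scalar MSE comparison can be made concrete with the canonical-form expression below (a sketch under the assumption that the canonical components satisfy $\hat{\alpha}_i(k, d) = \frac{\lambda_i - d}{\lambda_i + k}\hat{\alpha}_i$, so variance, shrinkage bias, and deletion bias add up; `mse_pctte` is a hypothetical helper):

```python
import numpy as np

def mse_pctte(lam, alpha, sigma2, r, k, d):
    """Scalar MSE of the PCTTE (variance + squared bias), assuming the
    canonical form alpha_hat_i(k,d) = (lam_i - d)/(lam_i + k) * alpha_hat_i:
      sum_{i<=r} sigma2*(lam_i - d)^2 / (lam_i*(lam_i + k)^2)   # variance
    + sum_{i<=r} (k + d)^2 * alpha_i^2 / (lam_i + k)^2          # shrinkage bias
    + sum_{i>r}  alpha_i^2                                      # deleted PCs
    """
    lam, alpha = np.asarray(lam, float), np.asarray(alpha, float)
    var = sigma2 * np.sum((lam[:r] - d) ** 2 / (lam[:r] * (lam[:r] + k) ** 2))
    bias = np.sum((k + d) ** 2 * alpha[:r] ** 2 / (lam[:r] + k) ** 2)
    tail = np.sum(alpha[r:] ** 2)
    return var + bias + tail
```

With $r = p$ and $k = d = 0$ this reduces to $\sigma^2\sum_{i=1}^{p}\lambda_i^{-1}$, the familiar MSE of the OLSE.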

Numerical Example
To illustrate our theoretical results, we use a numerical example based on the dataset originally due to Gruber [13] and later considered by Akdeniz and Erol [14]. For the OLSE, the PCRE, the r-k class estimator, the r-d class estimator, the Liu-type estimator (LTE), and the new estimator (PCTTE), the estimated mean square error (MSE) values are obtained by replacing all unknown model parameters in the corresponding expressions by their respective least squares estimates. First, we consider the comparison of the OLSE and the PCTTE. When $d = 0.6$ is fixed, the new estimator has smaller MSE values than the OLSE for large values of $k$; that is to say, the new estimator is better than the OLSE. So we see that the new estimator improves on the OLSE.
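Since Gruber's data are not reproduced here, the following sketch repeats this plug-in comparison on hypothetical collinear data, replacing the unknown $\sigma^2$ and $\alpha$ by their least squares estimates as described above:

```python
import numpy as np

# Hypothetical stand-in for Gruber's collinear dataset.
rng = np.random.default_rng(7)
n, p, r = 24, 4, 3
z = rng.standard_normal((n, 3))
X = np.column_stack([z, z[:, 0] + 0.02 * rng.standard_normal(n)])  # near-collinear
beta = np.array([1.0, -1.0, 2.0, 1.5])
y = X @ beta + 0.3 * rng.standard_normal(n)

S = X.T @ X
lam, T = np.linalg.eigh(S)
lam, T = lam[::-1], T[:, ::-1]                      # descending eigenvalues
beta_ols = np.linalg.solve(S, X.T @ y)
sigma2 = np.sum((y - X @ beta_ols) ** 2) / (n - p)  # plug-in estimate of sigma^2
alpha = T.T @ beta_ols                              # plug-in canonical parameters

def est_mse(k, d):
    # Estimated scalar MSE of the PCTTE with unknowns replaced by OLS estimates
    # (canonical-form expression, an assumed reconstruction).
    var = sigma2 * np.sum((lam[:r] - d) ** 2 / (lam[:r] * (lam[:r] + k) ** 2))
    bias = np.sum((k + d) ** 2 * alpha[:r] ** 2 / (lam[:r] + k) ** 2)
    return var + bias + np.sum(alpha[r:] ** 2)

mse_ols = sigma2 * np.sum(1.0 / lam)  # estimated MSE of the OLSE
```

Scanning `est_mse(k, d)` over a grid of $k$ for each fixed $d$ reproduces the kind of curves shown in the figures.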
From Figure 3, we see that when $k$ is fixed, the new estimator is better than the PCRE for $d$ below a certain bound. This agrees with Theorem 1, which shows that the new estimator is superior when $d$ lies in this range. For the numerical example, the new estimator is then more efficient than the PCRE.
From Figures 1, 2, 3, 4, 5, 6, 7, and 8, we see that the new estimator is better than the LTE when $d$ lies below the corresponding bound. So, in practice, we can choose a larger $k$ and a smaller $d$.

Conclusion
In this paper, we use a new method to derive the principal component Liu-type estimator. Then, we discuss the superiority of the new estimator over the OLSE, the PCRE, the r-k class estimator, the r-d class estimator, and the Liu-type estimator in the sense of mean square error. We also give a method to choose the biasing parameters. Finally, we give a numerical example to illustrate the performance of the various estimators.
Figure 8: The MSE values of the LTE and the PCTTE when $d = 0.3$.