Two Classes of Almost Unbiased Type Principal Component Estimators in Linear Regression Model

This paper is concerned with parameter estimation in the linear regression model. To overcome the multicollinearity problem, two new classes of estimators, the almost unbiased ridge-type principal component estimator (AURPCE) and the almost unbiased Liu-type principal component estimator (AULPCE), are proposed. The mean squared error matrices of the proposed estimators are derived and compared, and some properties of the proposed estimators are also discussed. Finally, a Monte Carlo simulation study is given to illustrate the performance of the proposed estimators.


Introduction
Consider the following multiple linear regression model:
$$y = X\beta + \varepsilon, \qquad (1)$$
where $y$ is an $n \times 1$ vector of responses, $X$ is an $n \times p$ known design matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of disturbances assumed to be distributed with mean vector $0$ and variance-covariance matrix $\sigma^2 I_n$, where $I_n$ is the identity matrix of order $n$.
According to the Gauss-Markov theorem, the ordinary least squares estimator (OLSE) of $\beta$ in (1) is
$$\hat{\beta} = (X'X)^{-1} X' y.$$
It was long treated as the best estimator. However, many results have shown that the OLSE is no longer a good estimator when multicollinearity is present. To overcome this problem, many biased estimators have been proposed, such as the principal components regression estimator (PCRE) [1], the ridge estimator [2], the Liu estimator [3], the almost unbiased ridge estimator [4], and the almost unbiased Liu estimator [5].
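As a quick numerical illustration of why multicollinearity degrades the OLSE (our own sketch; the design and values are illustrative and not from the paper), a nearly collinear design inflates the condition number of $X'X$ and hence the variance of $(X'X)^{-1}X'y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3

# Nearly collinear design: the third column is almost a copy of the first.
X = rng.standard_normal((n, p))
X[:, 2] = X[:, 0] + 1e-4 * rng.standard_normal(n)

beta = np.array([1.0, 2.0, 3.0])
y = X @ beta + rng.standard_normal(n)

# OLSE: (X'X)^{-1} X'y, computed via a linear solve rather than an explicit inverse.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# The condition number of X'X diagnoses multicollinearity: here it is enormous,
# so small perturbations of y produce large swings in beta_ols.
cond = np.linalg.cond(X.T @ X)
print(cond)
print(beta_ols)
```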
In the hope that a combination of two different estimators might inherit the advantages of both, Kaçıranlar et al. [6] improved Liu's approach and introduced the restricted Liu estimator. Akdeniz and Erol [7] compared some biased estimators in linear regression in the mean squared error matrix (MSEM) sense. By combining the mixed estimator and the Liu estimator, Hubert and Wijekoon [8] obtained a two-parameter estimator that includes the OLSE, the ridge estimator, and the Liu estimator as special cases. Baye and Parker [9] proposed the $r$-$k$ class estimator, which includes as special cases the PCRE, the ridge estimator (RE), and the OLSE. Kaçıranlar and Sakallıoğlu [10] then proposed the $r$-$d$ class estimator, a generalization of the OLSE, PCRE, and Liu estimator. Based on the $r$-$k$ and $r$-$d$ estimators, Xu and Yang [11] considered the restricted $r$-$k$ estimator and the restricted $r$-$d$ estimator, and Wu and Yang [12] introduced the stochastic restricted $r$-$k$ estimator and the stochastic restricted $r$-$d$ estimator, respectively.
The primary aim of this paper is to introduce two new classes of estimators, one of which includes the OLSE, PCRE, and AURE as special cases while the other includes the OLSE, PCRE, and AULE as special cases, thereby providing alternative methods for overcoming multicollinearity in linear regression.
The paper is organized as follows. In Section 2, the new estimators are introduced. In Section 3, some properties of the new estimators are discussed. A Monte Carlo simulation is given in Section 4. Finally, some conclusions are drawn in Section 5.

The New Estimators
In the linear model (1), the almost unbiased ridge estimator (AURE) proposed by Singh et al. [4] and the almost unbiased Liu estimator (AULE) proposed by Akdeniz and Kaçıranlar [5] are defined as
$$\hat{\beta}_{AR}(k) = \big(I - k^2 (S + kI)^{-2}\big)\hat{\beta}, \qquad \hat{\beta}_{AL}(d) = \big(I - (1-d)^2 (S + I)^{-2}\big)\hat{\beta},$$
respectively, where $k > 0$, $0 < d < 1$, and $S = X'X$. Now consider the spectral decomposition of $S$ given as
$$S = T \Lambda T', \qquad T = (T_r, T_{p-r}), \qquad \Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_p),$$
where $\lambda_1 \ge \cdots \ge \lambda_p > 0$ are the ordered eigenvalues of $S$, $T$ is the orthogonal matrix of corresponding eigenvectors, $T_r$ consists of its first $r$ columns, and $\Lambda_r = T_r' S T_r = \operatorname{diag}(\lambda_1, \ldots, \lambda_r)$. The $r$-$k$ class estimator proposed by Baye and Parker [9] and the $r$-$d$ class estimator proposed by Kaçıranlar and Sakallıoğlu [10] are defined as
$$\hat{\beta}_r(k) = T_r (\Lambda_r + kI_r)^{-1} T_r' X' y, \qquad \hat{\beta}_r(d) = T_r (\Lambda_r + I_r)^{-1} \big(T_r' X' y + d\, \hat{\alpha}_r\big),$$
where $\hat{\alpha}_r = \Lambda_r^{-1} T_r' X' y$. Following Xu and Yang [11], the $r$-$k$ class estimator and the $r$-$d$ class estimator can be rewritten as
$$\hat{\beta}_r(k) = T_r T_r'\, \hat{\beta}(k), \qquad \hat{\beta}_r(d) = T_r T_r'\, \hat{\beta}(d),$$
where $\hat{\beta}(k) = (S + kI)^{-1} X' y$ is the ridge estimator of Hoerl and Kennard [2] and $\hat{\beta}(d) = (S + I)^{-1}(X'y + d\hat{\beta}) = (S + I)^{-1}(S + dI)\hat{\beta}$ is the Liu estimator proposed by Liu [3]. We now propose two new estimator classes by combining the PCRE with the AURE and the AULE, namely, the almost unbiased ridge principal component estimator (AURPCE) and the almost unbiased Liu principal component estimator (AULPCE):
$$\hat{\beta}_{AU}(k, r) = T_r \big(I_r - k^2 (\Lambda_r + kI_r)^{-2}\big)\hat{\alpha}_r, \qquad \hat{\beta}_{AU}(d, r) = T_r \big(I_r - (1-d)^2 (\Lambda_r + I_r)^{-2}\big)\hat{\alpha}_r,$$
respectively, where $\hat{\alpha}_r = \Lambda_r^{-1} T_r' X' y$. From the definition of the AURPCE, we can easily obtain the following special cases: for $r = p$ it reduces to the AURE, and as $k \to 0$ it reduces to the PCRE.
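The estimators above can be sketched in code. The AURPCE/AULPCE lines implement our reading of "the AURE/AULE shrinkage applied within the retained principal-component subspace"; the parameter defaults and the helper name `estimators` are illustrative assumptions, not the paper's:

```python
import numpy as np

def estimators(X, y, r, k=0.1, d=0.5):
    """Sketch of the estimators discussed in Section 2.

    The AURPCE/AULPCE forms are our reading of 'PCRE combined with
    AURE/AULE', not a verbatim transcription of the paper's equations.
    """
    S = X.T @ X
    p = S.shape[0]
    lam, T = np.linalg.eigh(S)           # eigh returns ascending eigenvalues
    order = np.argsort(lam)[::-1]        # reorder to descending, as in the text
    lam, T = lam[order], T[:, order]
    Tr, lam_r = T[:, :r], lam[:r]

    beta_ols = np.linalg.solve(S, X.T @ y)
    # PCRE: keep the r leading principal components.
    beta_pcr = Tr @ ((Tr.T @ X.T @ y) / lam_r)
    # AURE: (I - k^2 (S + kI)^{-2}) applied to the OLSE.
    Ak = np.linalg.inv(S + k * np.eye(p))
    beta_aure = (np.eye(p) - k**2 * Ak @ Ak) @ beta_ols
    # AULE: (I - (1-d)^2 (S + I)^{-2}) applied to the OLSE.
    Ad = np.linalg.inv(S + np.eye(p))
    beta_aule = (np.eye(p) - (1 - d)**2 * Ad @ Ad) @ beta_ols
    # AURPCE / AULPCE: the same shrinkage factors, applied componentwise
    # in the retained principal-component subspace (assumed form).
    shrink_r = 1.0 - k**2 / (lam_r + k)**2
    beta_aurpce = Tr @ (shrink_r * (Tr.T @ X.T @ y) / lam_r)
    shrink_l = 1.0 - (1 - d)**2 / (lam_r + 1.0)**2
    beta_aulpce = Tr @ (shrink_l * (Tr.T @ X.T @ y) / lam_r)
    return dict(ols=beta_ols, pcr=beta_pcr, aure=beta_aure,
                aule=beta_aule, aurpce=beta_aurpce, aulpce=beta_aulpce)
```

Setting $r = p$ should recover the AURE and AULE exactly (and the PCRE should coincide with the OLSE), which provides a useful consistency check on the implementation.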
From the definition of the AULPCE, we can similarly obtain the following special cases: for $r = p$ it reduces to the AULE, and for $d = 1$ it reduces to the PCRE.
Furthermore, we can compute the bias vector, dispersion matrix, and mean squared error matrix of the new estimator $\hat{\beta}_{AU}(k, r)$ as
$$\operatorname{Bias}\big(\hat{\beta}_{AU}(k, r)\big) = -\big(I - T_r T_r'\big)\beta - k^2\, T_r (\Lambda_r + kI_r)^{-2} T_r' \beta,$$
$$D\big(\hat{\beta}_{AU}(k, r)\big) = \sigma^2\, T_r \big(I_r - k^2 (\Lambda_r + kI_r)^{-2}\big)^2 \Lambda_r^{-1} T_r',$$
and $\operatorname{MSEM}\big(\hat{\beta}_{AU}(k, r)\big) = D\big(\hat{\beta}_{AU}(k, r)\big) + \operatorname{Bias}\big(\hat{\beta}_{AU}(k, r)\big)\operatorname{Bias}\big(\hat{\beta}_{AU}(k, r)\big)'$, respectively.
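For reference, these quantities are tied together by the standard MSEM decomposition, which underlies all comparisons in the next section; for any estimator $\tilde{\beta}$ of $\beta$:

```latex
\operatorname{MSEM}(\tilde{\beta})
  = \operatorname{E}\!\big[(\tilde{\beta}-\beta)(\tilde{\beta}-\beta)'\big]
  = D(\tilde{\beta})
  + \operatorname{Bias}(\tilde{\beta})\,\operatorname{Bias}(\tilde{\beta})' ,
```

and an estimator $\tilde{\beta}_1$ is said to be superior to $\tilde{\beta}_2$ in the MSEM sense when $\operatorname{MSEM}(\tilde{\beta}_2) - \operatorname{MSEM}(\tilde{\beta}_1) \ge 0$.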
In a similar way, we can get the MSEM of $\hat{\beta}_{AU}(d, r)$ by replacing $k^2(\Lambda_r + kI_r)^{-2}$ with $(1-d)^2(\Lambda_r + I_r)^{-2}$ throughout. In particular, if we let $r = p$ in (12) and (13), then we obtain the MSEM of the AURE and the AULE, respectively.

Superiority of the Proposed Estimators
For the sake of convenience, we first list some notation, definitions, and lemmas needed in the following discussion. For a matrix $A$, the symbols $A'$, $A^+$, $\operatorname{rank}(A)$, $\mathcal{R}(A)$, and $\mathcal{N}(A)$ stand for the transpose, Moore-Penrose inverse, rank, column space, and null space of $A$, respectively. $A \ge 0$ means that $A$ is symmetric and nonnegative definite.
Lemma 1. Let $\mathbb{C}^{m \times m}$ be the set of $m \times m$ complex matrices, let $\mathbb{C}_H^{m \times m}$ be the subset of $\mathbb{C}^{m \times m}$ consisting of Hermitian matrices, and for $A \in \mathbb{C}^{m \times m}$ let $A^*$, $\mathcal{M}(A)$, and $\mathcal{J}(A)$ stand for the conjugate transpose, the range, and the set of all generalized inverses of $A$, respectively. Let $B \in \mathbb{C}_H^{m \times m}$, and let $b_1, b_2 \in \mathbb{C}^{m \times 1}$ be linearly independent. Then the relevant MSEM difference is nonnegative definite if and only if one of the sets of conditions (a), (b), and (c) holds, where $(U \,\vdots\, V)$ is a subunitary matrix ($V$ possibly absent), $\Delta$ is a positive-definite diagonal matrix (occurring when $U$ is present), and the scalar involved is positive. Further, all expressions in (a), (b), and (c) are independent of the choice of $B^- \in \mathcal{J}(B)$.
Let us consider the comparison between the AURPCE and the AURE, and between the AULPCE and the AULE, respectively. From (12)-(14), we obtain the MSEM differences $\Delta_1$ and $\Delta_2$. We now use Lemma 1 to discuss $\Delta_1$ and $\Delta_2$, following Sarkar [14] and Xu and Yang [11]. We assume that the off-diagonal block of the relevant partitioned matrix is zero and that its second main diagonal block is invertible. These assumptions are reasonable: they are equivalent to the partitioned matrix being block diagonal with an invertible second diagonal block.
Theorem 2. Suppose that the above assumptions hold. Then the AURPCE is superior to the AURE in the MSEM sense if and only if the bias-difference vector lies in the column space of the dispersion-difference matrix, that is, $b \in \mathcal{R}(G)$ in the notation of Lemma 1.

Proof. The dispersion difference is nonnegative definite, and its Moore-Penrose inverse is positive definite on the relevant subspace since $\Lambda_r$ is invertible. Applying Lemma 1 to $\Delta_1$ then yields the stated range condition.

Theorem 3. Under the same assumptions, the AULPCE $\hat{\beta}_{AU}(d, r)$ is superior to the AULE $\hat{\beta}_{AL}(d)$ in the MSEM sense if and only if the analogous range condition holds.

Proof. In order to apply Lemma 1, we similarly compute the Moore-Penrose inverse $G_2^+$ of the dispersion-difference matrix $G_2$. Since $G_2 G_2^+$ is the orthogonal projector onto $\mathcal{R}(G_2)$, the bias vector of the AULE does not in general belong to $\mathcal{M}(G_2)$, and it follows that the positive scalar of Lemma 1 equals 1 in our case. Thus, by Lemma 1, $\hat{\beta}_{AU}(d, r)$ is superior to $\hat{\beta}_{AL}(d)$ in the MSEM sense if and only if $b \in \mathcal{R}(G)$, where $G$ denotes the corresponding dispersion-difference matrix; the necessary and sufficient condition is therefore again of range form.
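As an illustrative aside (our own addition, not part of the original derivation), the range condition $b \in \mathcal{R}(G)$ of Theorems 2 and 3 can be checked numerically with the projector $GG^+$; here `G` and `b` stand for the dispersion-difference matrix and bias-difference vector, and the helper name is hypothetical:

```python
import numpy as np

def in_column_space(G, b, tol=1e-8):
    """Numerically test whether b lies in the column space R(G).

    Uses the orthogonal projector G G^+ onto R(G): b is in R(G)
    exactly when projecting b onto R(G) leaves it unchanged.
    """
    P = G @ np.linalg.pinv(G)  # orthogonal projector onto R(G)
    return np.linalg.norm(P @ b - b) <= tol * max(1.0, np.linalg.norm(b))
```

For a given design, one would form $G$ and $b$ from the MSEM differences in (12)-(14) and call `in_column_space(G, b)` to decide which estimator dominates.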

Monte Carlo Simulation
In order to illustrate the behaviour of the AURPCE and AULPCE, we perform a Monte Carlo simulation study. Following Li and Yang [15], the explanatory variables are generated by
$$x_{ij} = (1 - \gamma^2)^{1/2} z_{ij} + \gamma z_{i,p+1}, \qquad i = 1, \ldots, n, \; j = 1, \ldots, p,$$
where the $z_{ij}$ are independent standard normal pseudorandom numbers and $\gamma$ is specified so that the correlation between any two explanatory variables is $\gamma^2$; the observations on the dependent variable are then generated from model (1). In this experiment, we take the number of retained principal components to be 2 and set $\sigma^2 = 1$. Let us consider the AURPCE, AULPCE, AURE, AULE, PCRE, and OLSE and compute their respective estimated MSE values at different levels of multicollinearity, namely, $\gamma = 0.7, 0.85, 0.9, 0.999$, representing weak, strong, and severe collinearity among the explanatory variables (see Tables 1 and 2). Furthermore, for convenience of comparison, we plot the estimated MSE values of the estimators for $\gamma = 0.999$ in Figure 1.
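A minimal version of this simulation can be sketched as follows. This is our own sketch: `n`, `p`, `reps`, `k`, and the retained rank are illustrative choices rather than the paper's settings, and the AURPCE line uses the assumed principal-component shrinkage form; only the design scheme $x_{ij} = (1-\gamma^2)^{1/2} z_{ij} + \gamma z_{i,p+1}$ follows the text.

```python
import numpy as np

def simulate_mse(gamma, n=100, p=4, reps=500, sigma=1.0, k=0.2, r=None, seed=0):
    """Monte Carlo sketch comparing OLSE and an AURPCE-type estimator.

    Explanatory variables follow the collinearity scheme described in
    Section 4; all other constants are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal((n, p + 1))
    X = np.sqrt(1 - gamma**2) * Z[:, :p] + gamma * Z[:, [p]]
    beta = np.ones(p)
    if r is None:
        r = p - 1                         # drop the weakest component
    S = X.T @ X
    lam, T = np.linalg.eigh(S)
    idx = np.argsort(lam)[::-1]           # descending eigenvalues
    lam, T = lam[idx], T[:, idx]
    Tr, lam_r = T[:, :r], lam[:r]
    shrink = 1.0 - k**2 / (lam_r + k)**2  # AURPCE-type shrinkage (assumed form)

    mse_ols = mse_new = 0.0
    for _ in range(reps):
        y = X @ beta + sigma * rng.standard_normal(n)
        b_ols = np.linalg.solve(S, X.T @ y)
        b_new = Tr @ (shrink * (Tr.T @ X.T @ y) / lam_r)
        mse_ols += np.sum((b_ols - beta)**2)
        mse_new += np.sum((b_new - beta)**2)
    return mse_ols / reps, mse_new / reps

print(simulate_mse(0.999))
```

Under severe collinearity ($\gamma = 0.999$), the variance saved by discarding the near-null principal direction greatly outweighs the small bias introduced, which is the qualitative pattern the tables report.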
From the simulation results shown in Tables 1 and 2 and the estimated MSE values of these estimators, we can see the following.

Conclusion
In this paper, we have introduced two new classes of biased estimators as an alternative way of dealing with multicollinearity in the linear model. We have also shown that the new estimators are superior to their competitors in the MSEM criterion under certain conditions. Finally, a Monte Carlo simulation study illustrates the better performance of the proposed estimators.

Table 2: MSE values of the OLSE, PCRE, AULE, and AULPCE.

In most cases, the AURPCE and AULPCE have smaller estimated MSE values than those of the AURE, AULE, PCRE, and OLSE, respectively, which agrees with our theoretical findings. From Figure 1, the AURPCE and AULPCE also have more stable and smaller estimated MSE values. Hence the proposed estimators are meaningful in practice.