On the Stochastic Restricted r-k Class Estimator and Stochastic Restricted r-d Class Estimator in Linear Regression Model

The stochastic restricted r-k class estimator and stochastic restricted r-d class estimator are proposed for the vector of parameters in a multiple linear regression model with stochastic linear restrictions. The mean squared error matrices of the proposed estimators are derived and compared, and some properties of the proposed estimators are also discussed. Finally, a numerical example is given to illustrate some of the theoretical results.


Introduction
The problem of multicollinearity, or an ill-conditioned design matrix, in the linear regression model is well known in statistics. In order to overcome this problem, different remedies have been introduced. One of the most important estimation methods is to consider biased estimators, such as the principal component regression (PCR) estimator [1], the ordinary ridge estimator (ORE) by Hoerl and Kennard [2], the r-k class estimator [3], the Liu estimator (LE) by Liu [4], the r-d class estimator [5], the k-d class estimator [6], and the principal component Liu-type estimator [7].
An alternative method to deal with the multicollinearity problem is to consider parameter estimation with some restrictions on the unknown parameters, which may be exact or stochastic restrictions [8]. When additional stochastic restrictions on the parameter vector are supposed to hold, Durbin [9], Theil and Goldberger [10], and Theil [11] proposed the ordinary mixed estimator (OME). By grafting the ordinary ridge regression estimator and the LE into the mixed estimation, Li and Yang [12] and Hubert and Wijekoon [13] introduced the stochastic restricted ridge estimator (SRRE) and the stochastic restricted Liu estimator (SRLE), respectively, and Liu et al. [14] proposed the weighted mixed almost unbiased ridge estimator in the linear regression model.
In this paper, in order to overcome multicollinearity, we introduce a stochastic restricted r-k class estimator and a stochastic restricted r-d class estimator for the vector of parameters in a linear regression model when additional stochastic linear restrictions are assumed to hold. The performance of the proposed estimators with respect to the mean squared error matrix (MSEM) criterion is discussed.
The rest of the paper is organized as follows. The model specifications and the new estimators are introduced in Section 2. The superiority of the proposed estimators is discussed in Section 3, and a numerical example is given to illustrate the behavior of the estimators in Section 4. Finally, some concluding remarks are given in Section 5.

Model Specifications and the Estimators
Consider the linear regression model
$$y = X\beta + \varepsilon, \quad (1)$$
where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times p$ known design matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of disturbances with expectation $E(\varepsilon) = 0$ and variance-covariance matrix $\mathrm{Cov}(\varepsilon) = \sigma^2 I_n$.
For the unrestricted model (1), the ORE proposed by Hoerl and Kennard [2] and the LE presented by Liu [4] are defined as
$$\hat\beta(k) = S_k^{-1}X'y, \qquad \hat\beta(d) = (S + I)^{-1}(X'y + d\,\hat\beta_{\mathrm{OLS}}),$$
where $k > 0$, $0 < d < 1$, $S = X'X$, $S_k = S + kI$, and $\hat\beta_{\mathrm{OLS}} = S^{-1}X'y$ is the ordinary least squares (OLS) estimator of $\beta$. Now let us consider the spectral decomposition of the matrix $S$ given as
$$S = (T_r, T_{p-r})\begin{pmatrix}\Lambda_r & 0\\ 0 & \Lambda_{p-r}\end{pmatrix}(T_r, T_{p-r})',$$
where $\Lambda_r$ and $\Lambda_{p-r}$ are diagonal matrices such that the main diagonal elements of $\Lambda_r$ are the $r$ largest eigenvalues of $S$ and those of $\Lambda_{p-r}$ are the remaining $p-r$ eigenvalues, and $T_r$ and $T_{p-r}$ consist of the corresponding orthonormal eigenvectors. The r-k class estimator proposed by Baye and Parker [3] and the r-d class estimator proposed by Kaçıranlar and Sakallıoğlu [5] are
$$\hat\beta_r(k) = T_r(\Lambda_r + kI)^{-1}T_r'X'y, \qquad \hat\beta_r(d) = T_r(\Lambda_r + I)^{-1}(\Lambda_r + dI)\Lambda_r^{-1}T_r'X'y.$$
Following Xu and Yang [15], the r-k class estimator and the r-d class estimator can be rewritten as
$$\hat\beta_r(k) = T_rT_r'\hat\beta(k), \qquad \hat\beta_r(d) = T_rT_r'\hat\beta(d). \quad (6)$$
In addition to the model (1), suppose some prior information about $\beta$ is given in the form of a set of $q$ independent stochastic linear restrictions:
$$r = R\beta + e, \quad (9)$$
where $R$ is a $q \times p$ known matrix of rank $q$, $e$ is a $q \times 1$ vector of disturbances with mean $0$ and dispersion matrix $\sigma^2 W$, $W$ is supposed to be known and positive definite, and the $q \times 1$ vector $r$ can be interpreted as a random variable with expectation $E(r) = R\beta$. Therefore the restriction (9) does not hold exactly but in the mean, and we suppose $r$ to be known, that is, to be the realized value of the random vector, so that all expectations are conditional on $r$ [8]. In the following discussions we do not mention this separately. Furthermore, it is also supposed that the random vector $e$ is stochastically independent of $\varepsilon$.
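To make the definitions above concrete, the following sketch computes the OLS, ridge, Liu, r-k, and r-d estimators on a small synthetic design. The data, the tuning constants $k$ and $d$, and the number $r$ of retained components are all illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ill-conditioned design: two nearly collinear columns.
n, p = 20, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 2] + 0.01 * rng.normal(size=n)
y = X @ np.array([1.0, 0.5, -0.5, 2.0]) + 0.1 * rng.normal(size=n)

S = X.T @ X                                  # S = X'X
beta_ols = np.linalg.solve(S, X.T @ y)       # OLS: S^{-1} X'y

k, d, r = 0.1, 0.5, 3                        # illustrative k > 0, 0 < d < 1, r < p

# Ordinary ridge estimator (ORE) and Liu estimator (LE).
beta_k = np.linalg.solve(S + k * np.eye(p), X.T @ y)
beta_d = np.linalg.solve(S + np.eye(p), X.T @ y + d * beta_ols)

# Spectral decomposition S = T Lambda T'; T_r holds the r leading eigenvectors.
lam, T = np.linalg.eigh(S)
idx = np.argsort(lam)[::-1]                  # eigenvalues in descending order
lam, T = lam[idx], T[:, idx]
Tr = T[:, :r]

# Xu-Yang rewriting: the r-k and r-d class estimators are the
# projection T_r T_r' applied to the ridge and Liu estimators.
beta_rk = Tr @ (Tr.T @ beta_k)
beta_rd = Tr @ (Tr.T @ beta_d)
```

Note that at $k = 0$ and $d = 1$ both class estimators reduce to the PCR estimator $T_rT_r'\hat\beta_{\mathrm{OLS}}$.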
At the end of this section, we will list some lemmas which are needed in the following proofs.
Lemma 1 (see [8]). Assume that the square matrices $A$ and $C$ are nonsingular and that $B$ and $D$ are matrices of proper orders; then
$$(A + BCD)^{-1} = A^{-1} - A^{-1}B(C^{-1} + DA^{-1}B)^{-1}DA^{-1}.$$
Lemma 2 (see [16]). Let $A$ be a nonnegative definite matrix, namely, $A \ge 0$, and let $a$ be some vector; then $A - aa' \ge 0$ if and only if $a'A^{+}a \le 1$, $a \in \mathcal{R}(A)$, where $A^{+}$ denotes the Moore-Penrose inverse of $A$ and $\mathcal{R}(A)$ the range of $A$.
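Lemma 1 is the standard matrix inversion identity and can be checked numerically. The matrices below are arbitrary well-conditioned examples, chosen only so that all the required inverses exist.

```python
import numpy as np

rng = np.random.default_rng(1)

# Well-conditioned nonsingular A (4x4) and C (2x2); B, D of proper orders.
A = rng.normal(size=(4, 4)) + 4 * np.eye(4)
C = rng.normal(size=(2, 2)) + 4 * np.eye(2)
B = 0.3 * rng.normal(size=(4, 2))
D = 0.3 * rng.normal(size=(2, 4))

Ainv = np.linalg.inv(A)
Cinv = np.linalg.inv(C)

# Lemma 1: (A + BCD)^{-1} = A^{-1} - A^{-1} B (C^{-1} + D A^{-1} B)^{-1} D A^{-1}
lhs = np.linalg.inv(A + B @ C @ D)
rhs = Ainv - Ainv @ B @ np.linalg.inv(Cinv + D @ Ainv @ B) @ D @ Ainv
print(np.allclose(lhs, rhs))   # prints True
```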
In the next section, we compare the biased estimators.

The Superiority of the Proposed Estimators
The mean squared error matrix (MSEM) of an estimator $\hat\beta$ is defined as
$$\mathrm{MSEM}(\hat\beta) = \mathrm{Cov}(\hat\beta) + \mathrm{Bias}(\hat\beta)\,\mathrm{Bias}(\hat\beta)',$$
where $\mathrm{Cov}(\hat\beta)$ is the dispersion matrix and $\mathrm{Bias}(\hat\beta) = E(\hat\beta) - \beta$ is the bias vector. For two given estimators $\hat\beta_1$ and $\hat\beta_2$, the estimator $\hat\beta_2$ is said to be superior to $\hat\beta_1$ in the MSEM criterion if and only if
$$\mathrm{MSEM}(\hat\beta_1) - \mathrm{MSEM}(\hat\beta_2) \ge 0.$$
Since superiority in the MSEM criterion implies superiority in the scalar mean squared error (MSE) criterion, we only consider MSEM comparisons among the estimators.
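The MSEM comparison can be illustrated in code. The sketch below evaluates the MSEM of an arbitrary linear estimator $\hat\beta = Ay$ and tests nonnegative definiteness of the difference via its smallest eigenvalue. The design, $\beta$, $\sigma^2$, and $k$ are illustrative; ridge versus OLS is used only because the classical sufficient condition $k \le 2\sigma^2/\beta'\beta$ for matrix-MSE dominance of ridge holds for these values.

```python
import numpy as np

def msem(A, X, beta, sigma2):
    """MSEM of the linear estimator beta_hat = A @ y under y = X beta + eps,
    Cov(eps) = sigma2 * I:  MSEM = Cov + Bias Bias'."""
    cov = sigma2 * A @ A.T
    bias = (A @ X - np.eye(X.shape[1])) @ beta
    return cov + np.outer(bias, bias)

def dominates(M1, M2, tol=1e-8):
    """True if M1 - M2 is nonnegative definite, i.e. the estimator
    with MSEM M2 is superior in the MSEM criterion."""
    return np.linalg.eigvalsh(M1 - M2).min() >= -tol

# Illustrative comparison: OLS versus ridge on a near-collinear design.
rng = np.random.default_rng(2)
n, p = 30, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)   # near collinearity
beta = np.array([1.0, 1.0, 1.0])
sigma2, k = 1.0, 0.5                            # here k <= 2*sigma2/(beta'beta)

S = X.T @ X
A_ols = np.linalg.solve(S, X.T)                 # OLS as a linear map of y
A_ridge = np.linalg.solve(S + k * np.eye(p), X.T)

print(dominates(msem(A_ols, X, beta, sigma2),
                msem(A_ridge, X, beta, sigma2)))   # prints True
```

The theorems below assert relations of exactly this form, with the stochastic restricted class estimators in the role of the dominating estimator.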

MSEM Comparisons of the r-k Class Estimator and the Stochastic Restricted r-k (SRrk) Class Estimator
In this subsection, we consider the MSEM comparison between the r-k class estimator and the stochastic restricted r-k (SRrk) class estimator.
First, we can compute the bias vector and the dispersion matrix of the stochastic restricted r-k (SRrk) class estimator, where $M_k = (S_k + R'W^{-1}R)^{-1}$; the MSEM of the SRrk class estimator then follows as the sum of its dispersion matrix and the outer product of its bias vector. From (6), the bias vector and the dispersion matrix of the r-k class estimator can be computed in the same way, which gives the MSEM of the r-k class estimator. Now consider the difference of the two MSEM matrices,
$$\Delta_1 = \mathrm{MSEM}(\hat\beta_r(k)) - \mathrm{MSEM}(\hat\beta_{\mathrm{SRrk}}(k)).$$

Theorem 4. The stochastic restricted r-k class estimator $\hat\beta_{\mathrm{SRrk}}(k)$ always dominates the r-k class estimator $\hat\beta_r(k)$ in the MSEM criterion.
Proof. By Lemma 1, the difference $\Delta_1$ can be written as a nonnegative definite matrix, so $\Delta_1 \ge 0$; that is to say, the stochastic restricted r-k class estimator $\hat\beta_{\mathrm{SRrk}}(k)$ always dominates the r-k class estimator $\hat\beta_r(k)$ in the MSEM criterion.

MSEM Comparisons of the Stochastic Restricted Ridge Estimator (SRRE) and the Stochastic Restricted r-k (SRrk) Class Estimator
In this subsection, we consider the MSEM comparison between the stochastic restricted ridge estimator (SRRE) and the stochastic restricted r-k (SRrk) class estimator. First, from (9), we can compute the bias vector and the dispersion matrix of the SRRE, and then obtain the MSEM of the SRRE. Now consider the difference
$$\Delta_2 = \mathrm{MSEM}(\hat\beta_{\mathrm{SRRE}}(k)) - \mathrm{MSEM}(\hat\beta_{\mathrm{SRrk}}(k)),$$
which can be decomposed into a dispersion part $D$ and a bias part involving the bias vectors $b_1$ and $b_2$ of the two estimators.
On the other hand, the bias part of the difference can be rewritten in a form to which Lemma 2 applies, and the theorem is proved.

MSEM Comparisons of the r-d Class Estimator and the Stochastic Restricted r-d (SRrd) Class Estimator
From (6), we can compute the bias vector and the dispersion matrix of the r-d class estimator, and hence obtain the MSEM of the r-d class estimator. Now consider the difference
$$\Delta_3 = \mathrm{MSEM}(\hat\beta_r(d)) - \mathrm{MSEM}(\hat\beta_{\mathrm{SRrd}}(d)).$$

Theorem 6. The stochastic restricted r-d class estimator $\hat\beta_{\mathrm{SRrd}}(d)$ always dominates the r-d class estimator $\hat\beta_r(d)$ in the MSEM criterion.

MSEM Comparisons of the Stochastic Restricted Liu Estimator (SRLE) and the Stochastic Restricted r-d (SRrd) Class Estimator
Remark.For practical use, we may replace the unknown parameters in the theorems with their appropriate estimators.

Numerical Example
In order to illustrate our theoretical results, we now consider the data set on Total National Research and Development Expenditure as a Percent of Gross National Product, originally due to Gruber [18] and later considered by Akdeniz and Erol [19]. In this paper, we use the same data, which is presented in Table 1. First, we obtain the ordinary least squares estimator of $\beta$:
$$\hat\beta_{\mathrm{OLS}} = S^{-1}X'y = (0.6455, 0.0896, 0.1436, 0.1526)', \quad (32)$$
with $\mathrm{MSE}(\hat\beta_{\mathrm{OLS}}) = 0.0808$ and $\hat\sigma^2_{\mathrm{OLS}} = 0.0015$. Consider the following stochastic linear restrictions:
$$r = R\beta + e, \quad R = (1, -2, -2, -2), \quad e \sim (0, \hat\sigma^2_{\mathrm{OLS}}). \quad (33)$$
For the r-k class estimator (rk), the stochastic restricted ridge estimator (SRRE), and the stochastic restricted r-k class estimator (SRrk), the estimated mean squared error values are given in Table 2. For the r-d class estimator (rd), the stochastic restricted Liu estimator (SRLE), and the stochastic restricted r-d class estimator (SRrd), the estimated mean squared error values are given in Table 3. These estimated mean squared error values are obtained by replacing all unknown model parameters in the corresponding theoretical MSE expressions with their OLS estimators.
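Since Table 1 is not reproduced here, the following sketch illustrates the plug-in procedure behind Tables 2 and 3 on a synthetic stand-in design: all unknown parameters in the theoretical MSE expression are replaced by their OLS estimates, and the estimated scalar MSE is tabulated over a grid of $k$. The data, dimensions, and grid values are hypothetical.

```python
import numpy as np

def est_mse(A, S, beta_hat, sigma2_hat):
    """Plug-in scalar MSE of the linear estimator A @ X'y: trace of the
    dispersion matrix plus squared bias norm, with sigma^2 and beta
    replaced by their OLS estimates."""
    cov_trace = sigma2_hat * np.trace(A @ S @ A.T)
    bias = (A @ S - np.eye(S.shape[0])) @ beta_hat
    return cov_trace + bias @ bias

# Synthetic stand-in for the data (Table 1 is not reproduced here).
rng = np.random.default_rng(3)
n, p, r = 10, 4, 3
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 2] + 0.05 * rng.normal(size=n)
y = X @ np.array([0.6, 0.1, 0.15, 0.15]) + 0.04 * rng.normal(size=n)

S = X.T @ X
beta_hat = np.linalg.solve(S, X.T @ y)          # OLS estimate of beta
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)            # OLS estimate of sigma^2

lam, T = np.linalg.eigh(S)
idx = np.argsort(lam)[::-1]
Tr = T[:, idx][:, :r]                           # r leading eigenvectors

# Estimated MSE of the ridge and r-k class estimators over a grid of k,
# analogous to one column pair of Table 2.
for k in (0.0, 0.05, 0.1, 0.5):
    A_ridge = np.linalg.inv(S + k * np.eye(p))  # beta(k) = (S+kI)^{-1} X'y
    A_rk = Tr @ Tr.T @ A_ridge                  # r-k class: T_r T_r' beta(k)
    print(f"k={k}: ridge {est_mse(A_ridge, S, beta_hat, sigma2_hat):.4f}  "
          f"r-k {est_mse(A_rk, S, beta_hat, sigma2_hat):.4f}")
```

The same template gives the SRRE/SRrk and SRLE/SRrd columns once those estimators are written as linear maps of $(X'y, r)$.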
From Table 2, we can see that the stochastic restricted r-k class estimator is always better than the r-k class estimator. In fact, the stochastic restricted r-k class estimator incorporates more information about the unknown parameter, so it is better than the r-k class estimator. When $k$ is small, the stochastic restricted r-k class estimator is superior to the stochastic restricted ridge estimator; however, when $k$ becomes large, the stochastic restricted ridge estimator is superior to the stochastic restricted r-k class estimator.
From Table 3, we can find that the stochastic restricted r-d class estimator is always better than the r-d class estimator. When $d$ is large, the stochastic restricted r-d class estimator is superior to the stochastic restricted Liu estimator; however, when $d$ becomes small, the stochastic restricted Liu estimator is superior to the stochastic restricted r-d class estimator.