Modelling and Simulation in Engineering, Hindawi, 2019. doi:10.1155/2019/6342702

Research Article

A Modified New Two-Parameter Estimator in a Linear Regression Model

Adewale F. Lukman (Department of Physical Sciences, Landmark University, Omu-Aran, Nigeria), Kayode Ayinde (Department of Statistics, Federal University of Technology, Akure, Nigeria), Sek Siok Kun (School of Mathematical Sciences, Universiti Sains Malaysia, Malaysia), and Emmanuel T. Adewuyi (Department of Statistics, Federal University of Technology, Akure, Nigeria). Academic Editor: Zhiping Qiu.

Received 31 December 2018; Revised 25 February 2019; Accepted 5 March 2019.

Copyright © 2019 Adewale F. Lukman et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The literature has shown that the ordinary least squares estimator (OLSE) is not best when the explanatory variables are related, that is, when multicollinearity is present. The estimator then becomes unstable and can give misleading conclusions. In this study, a modified new two-parameter estimator based on prior information for the vector of parameters is proposed to circumvent the problem of multicollinearity. The new estimator includes as special cases the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge regression estimator (MRRE), and the modified Liu estimator (MLE). Furthermore, the superiority of the new estimator over OLSE, RRE, LE, MRRE, MLE, and the two-parameter estimator proposed by Ozkale and Kaciranlar (2007) is established using the mean squared error matrix criterion. Finally, a numerical example and a simulation study illustrate the theoretical results.

1. Introduction

The general linear regression model in matrix form is defined as

(1) y = Xβ + ε,

where y is an n×1 vector of the dependent variable, X is a known n×p full-rank matrix of explanatory variables, β is a p×1 vector of regression coefficients, and ε is an n×1 vector of disturbances such that E(ε) = 0 and Cov(ε) = σ²I. The ordinary least squares estimator (OLSE) of β in model (1) is defined as

(2) β̂_OLS = (X′X)⁻¹X′y.
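As a concrete illustration of equation (2), the following minimal sketch computes the OLS estimator from simulated data (the data and dimensions are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + 0.1 * rng.normal(size=n)

# OLS estimator of equation (2): beta_hat = (X'X)^{-1} X'y,
# computed by solving the normal equations rather than inverting X'X
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

Solving the normal equations directly (rather than forming the inverse) is the numerically preferred route when X has full column rank.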

According to the Gauss–Markov theorem, the OLS estimator is linear and unbiased and possesses minimum variance in the class of all linear unbiased estimators. However, different studies have shown that the OLS estimator is not best when the explanatory variables are related, that is, when multicollinearity is present. The estimator then becomes unstable and gives misleading conclusions. Many biased estimators have been proposed as alternatives to OLSE to circumvent this problem. These include the Stein estimator, the principal components estimator, the ridge regression estimator (RRE), the contraction estimator, the modified ridge regression estimator (MRRE), and the Liu estimator.

Hoerl and Kennard proposed the ridge regression estimator (RRE)

(3) β̂_RRE(k) = (X′X + kI)⁻¹X′y = T_k β̂_OLS, k > 0,

where T_k = (X′X + kI)⁻¹X′X. β̂_RRE(k) is obtained by augmenting the equation 0 = k^{1/2}β + ε′ to the original model (1) and then applying OLS. Mayer and Willke defined the contraction estimator

(4) β̂(ρ) = (1 + ρ)⁻¹β̂_OLS, ρ > 0.
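The equivalence between the direct form of equation (3), the T_k form, and the augmented-equation construction can be checked numerically; the data and the value of k below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)
k = 0.7

XtX, Xty = X.T @ X, X.T @ y
beta_ols = np.linalg.solve(XtX, Xty)

# Ridge directly: beta_RRE(k) = (X'X + kI)^{-1} X'y
beta_rre = np.linalg.solve(XtX + k * np.eye(3), Xty)

# Same estimator via T_k = (X'X + kI)^{-1} X'X applied to the OLS estimate
Tk = np.linalg.solve(XtX + k * np.eye(3), XtX)

# Same estimator again via the augmented system (0 = k^{1/2} beta + eps')
X_aug = np.vstack([X, np.sqrt(k) * np.eye(3)])
y_aug = np.concatenate([y, np.zeros(3)])
beta_aug = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]
```

All three routes give the same vector, and the ridge estimate has a strictly smaller norm than the OLS estimate for k > 0.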

Liu combined the Stein estimator with the ridge estimator to combat the problem of multicollinearity. β̂_LE(d) is obtained by augmenting the equation dβ̂_OLS = β + ε′ to the original model (1) and then applying OLS. It is defined as follows:

(5) β̂_LE(d) = (X′X + I)⁻¹(X′y + dβ̂_OLS) = T_d β̂_OLS, 0 < d < 1,

where T_d = (X′X + I)⁻¹(X′X + dI).
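A quick numerical check of equation (5) and of the equivalence with the T_d form (illustrative simulated data):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)
d = 0.4

XtX, Xty = X.T @ X, X.T @ y
beta_ols = np.linalg.solve(XtX, Xty)

# Liu estimator: beta_LE(d) = (X'X + I)^{-1}(X'y + d * beta_OLS)
beta_le = np.linalg.solve(XtX + np.eye(3), Xty + d * beta_ols)

# Equivalent form T_d beta_OLS with T_d = (X'X + I)^{-1}(X'X + dI)
Td = np.linalg.solve(XtX + np.eye(3), XtX + d * np.eye(3))

# At d = 1 the Liu estimator collapses back to OLS
beta_le_d1 = np.linalg.solve(XtX + np.eye(3), Xty + 1.0 * beta_ols)
```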

Swindel modified the ridge estimator by adding prior information. The modified ridge regression estimator (MRRE) is defined as follows:

(6) β̂_MRRE(k, b) = (X′X + kI)⁻¹(X′y + kb),

where b represents the prior information on β. MRRE tends to b as k tends to infinity, and it returns the OLS estimates when k = 0.

Based on prior information, Li and Yang proposed the modified Liu estimator (MLE):

(7) β̂_MLE(d, b) = (X′X + I)⁻¹[(X′X + dI)β̂_OLS + (1 − d)b].

MLE includes OLSE and LE as special cases. In recent times, different researchers have suggested two-parameter estimators to handle multicollinearity. Ozkale and Kaciranlar proposed the two-parameter estimator (TPE), which is defined as

(8) β̂_TPE(k, d) = (X′X + kI)⁻¹(X′y + kdβ̂_OLS) = (X′X + kI)⁻¹(X′X + kdI)β̂_OLS = T_{kd}β̂_OLS,

where k > 0 and 0 < d < 1. TPE includes OLSE, RRE, LE, and the contraction estimator as special cases.
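The special-case structure of the TPE in equation (8) can be verified numerically; the data and parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)
XtX, Xty = X.T @ X, X.T @ y
beta_ols = np.linalg.solve(XtX, Xty)

def tpe(k, d):
    # Two-parameter estimator of equation (8): (X'X + kI)^{-1}(X'y + k d beta_OLS)
    return np.linalg.solve(XtX + k * np.eye(3), Xty + k * d * beta_ols)

# Reference estimators for the special cases
beta_rre = np.linalg.solve(XtX + 0.5 * np.eye(3), Xty)               # ridge, k = 0.5
beta_le = np.linalg.solve(XtX + np.eye(3), Xty + 0.4 * beta_ols)     # Liu, d = 0.4
```

TPE(k, 0) reproduces the ridge estimator, TPE(1, d) the Liu estimator, and TPE(0, d) the OLS estimator.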

The primary focus of this study is to provide an alternative method in a linear regression model to circumvent the problem of multicollinearity. A modified two-parameter estimator (MTPE) is proposed based on prior information and is compared with OLSE, LE, RRE, MRRE, MLE, and TPE using the mean squared error matrix (MSEM) criterion. The article is structured as follows. Section 2 introduces the new estimator. Section 3 establishes its superiority. Section 4 discusses the selection of the biasing parameters. Section 5 presents a numerical example and a simulation study, and concluding remarks are provided in Section 6.

2. Modified Two-Parameter Estimator

Let T_k = (X′X + kI)⁻¹X′X = I − k(X′X + kI)⁻¹. Then MRRE in equation (6) can be re-expressed as

(9) β̂_MRRE(k, b) = (X′X + kI)⁻¹X′y + k(X′X + kI)⁻¹b = T_k β̂_OLS + (I − T_k)b.

Similarly, with T_d = (X′X + I)⁻¹(X′X + dI), the modified Liu estimator in equation (7) can be written as

(10) β̂_MLE(d, b) = T_d β̂_OLS + (I − T_d)b = (X′X + I)⁻¹[(X′X + dI)β̂_OLS + (1 − d)b].

MRRE and MLE are convex combinations of the prior information b and the OLS estimator. From equation (8), T_{kd} = (X′X + kI)⁻¹(X′X + kdI) = I − k(1 − d)(X′X + kI)⁻¹; therefore, the modified two-parameter estimator based on prior information can be defined as follows:

(11) β̂_MTPE(k, d, b) = T_{kd}β̂_OLS + (I − T_{kd})b
= (X′X + kI)⁻¹(X′X + kdI)β̂_OLS + [I − (X′X + kI)⁻¹(X′X + kdI)]b
= (X′X + kI)⁻¹[(X′X + kdI)β̂_OLS + k(1 − d)b].

Also, MTPE is a convex combination of the prior information and OLSE. It includes OLSE, RRE, MRRE, LE, and MLE as special cases. The following cases are possible:

β̂_MTPE(k, 1, b) = β̂_MTPE(0, d, b) = β̂_OLS: ordinary least squares estimator

β̂_MTPE(1, d, 0) = β̂_LE(d): Liu estimator

β̂_MTPE(k, 0, b) = β̂_MRRE(k, b): modified ridge regression estimator

β̂_MTPE(k, 0, 0) = β̂_RRE(k): ridge estimator

β̂_MTPE(1, d, b) = β̂_MLE(d, b): modified Liu estimator
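The special cases listed above can be checked numerically against equation (11); the data, the value of k, and the prior b = 0.95β̂_OLS (borrowed from the numerical example of Section 5) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)
XtX, Xty = X.T @ X, X.T @ y
beta_ols = np.linalg.solve(XtX, Xty)
b = 0.95 * beta_ols  # prior information on beta

def mtpe(k, d, b):
    # Equation (11): (X'X + kI)^{-1} [ (X'X + k d I) beta_OLS + k (1 - d) b ]
    return np.linalg.solve(XtX + k * np.eye(3),
                           (XtX + k * d * np.eye(3)) @ beta_ols + k * (1 - d) * b)

# Reference estimators for the special cases (k = 0.5 is arbitrary)
beta_rre = np.linalg.solve(XtX + 0.5 * np.eye(3), Xty)            # ridge
beta_mrre = np.linalg.solve(XtX + 0.5 * np.eye(3), Xty + 0.5 * b) # modified ridge
```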

Suppose there exists an orthogonal matrix Q such that Q′X′XQ = Λ = diag(λ_1, λ_2, …, λ_p), where λ_i is the ith eigenvalue of X′X; Λ and Q are the matrices of eigenvalues and eigenvectors of X′X, respectively. Substituting Z = XQ and α = Q′β into model (1), the equivalent canonical model can be written as

(12) y = Zα + ε.

The canonical representations of the estimators are as follows:

(13)
α̂_OLS = Λ⁻¹Z′y,
α̂_LE(d) = (Λ + I)⁻¹(Λ + dI)Λ⁻¹Z′y,
α̂_RRE(k) = (Λ + kI)⁻¹Z′y,
α̂_MRRE(k, b) = (Λ + kI)⁻¹(Z′y + kb),
α̂_MLE(d, b) = (Λ + I)⁻¹[(Λ + dI)Λ⁻¹Z′y + (1 − d)b],
α̂_TPE(k, d) = (Λ + kI)⁻¹(Λ + kdI)Λ⁻¹Z′y,
α̂_MTPE(k, d, b) = (Λ + kI)⁻¹[(Λ + kdI)Λ⁻¹Z′y + k(1 − d)b].
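The canonical transformation itself is a spectral decomposition of X′X; a minimal sketch with simulated data (illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 3))
y = rng.normal(size=40)

# Spectral decomposition of X'X: Q' X'X Q = Lambda (diagonal)
lam, Q = np.linalg.eigh(X.T @ X)

# Canonical regressors and canonical OLS estimate: Z'Z = Lambda, so
# Lambda^{-1} Z'y is an elementwise division
Z = X @ Q
alpha_ols = (Z.T @ y) / lam

# Back-transformation beta = Q alpha recovers the ordinary OLS estimate
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
```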

The following notation and lemmas are needed to prove the statistical properties of β̂_MTPE(k, d, b).

Lemma 1.

Let M be an n×n positive definite matrix, that is, M > 0, and let α be some vector. Then M − αα′ ≥ 0 if and only if α′M⁻¹α ≤ 1 (Farebrother).

Lemma 2.

Let β̂_i = A_i y, i = 1, 2, be two linear estimators of β. Suppose that D = A_1A_1′ − A_2A_2′ > 0, so that Cov(β̂_i) = σ²A_iA_i′, and let b_i = Bias(β̂_i) = (A_iX − I)β, i = 1, 2. Consequently,

(14) Δ(β̂_1, β̂_2) = MSEM(β̂_1) − MSEM(β̂_2) = σ²D + b_1b_1′ − b_2b_2′ > 0

if and only if b_2′(σ²D + b_1b_1′)⁻¹b_2 < 1, where MSEM(β̂_i) = Cov(β̂_i) + b_ib_i′ (Trenkler and Toutenburg).
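Lemma 2 can be illustrated numerically in the canonical model, taking OLS as the unbiased estimator β̂_1 and ridge as β̂_2; the eigenvalues, coefficients, σ², and k below are assumed values for illustration only:

```python
import numpy as np

# Illustrative canonical quantities (assumed, not from the paper)
lam = np.array([5.0, 1.0, 0.2])     # eigenvalues of X'X
alpha = np.array([0.8, 0.1, 0.6])   # true canonical coefficients
sigma2, k = 1.0, 0.5

# Estimator 1: OLS (A1 A1' = Lambda^{-1}, unbiased);
# estimator 2: ridge (A2 A2' = diag(lam / (lam + k)^2))
D = np.diag(1.0 / lam - lam / (lam + k) ** 2)
b1 = np.zeros(3)                    # OLS bias
b2 = -k * alpha / (lam + k)         # ridge bias, (T_k - I) alpha

# Lemma 2: MSEM(OLS) - MSEM(ridge) > 0 iff b2'(sigma2 D + b1 b1')^{-1} b2 < 1
lemma2_condition = b2 @ np.linalg.solve(sigma2 * D + np.outer(b1, b1), b2) < 1
```

For these values D is positive definite (each diagonal entry is positive for k > 0) and the quadratic-form condition evaluates well below 1, so ridge dominates OLS in the MSEM sense here.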

3. Establishing Superiority of Modified Two-Parameter Estimator Using MSEM Criterion

In this section, MTPE is compared with the following estimators: OLS, RRE, LE, MRRE, MLE, and TPE.

3.1. Comparison between the MTPE and OLS Using MSEM Criterion

From the representation α̂_MTPE(k, d, b) = (Λ + kI)⁻¹[(Λ + kdI)Λ⁻¹Z′y + k(1 − d)b], the expectation of MTPE is obtained as follows:

(15) E[α̂_MTPE(k, d, b)] = (Λ + kI)⁻¹[(Λ + kdI)Λ⁻¹E(Z′y) + k(1 − d)b] = (Λ + kI)⁻¹[(Λ + kdI)α + k(1 − d)b],

where E(y) = Zα, so that E(Z′y) = Z′Zα = Λα.

Recall that k(1 − d)I = (Λ + kI) − (Λ + kdI), and let B_{k,d} = (Λ + kI)⁻¹(Λ + kdI). Therefore,

(16)
E[α̂_MTPE(k, d, b)] = B_{k,d}α + (I − B_{k,d})b,
Bias[α̂_MTPE(k, d, b)] = B_{k,d}α + (I − B_{k,d})b − α = (B_{k,d} − I)(α − b),

(17) Cov[α̂_MTPE(k, d, b)] = σ²B_{k,d}Λ⁻¹B_{k,d}.

Hence,

(18) MSEM[α̂_MTPE(k, d, b)] = σ²B_{k,d}Λ⁻¹B_{k,d} + (B_{k,d} − I)(α − b)(α − b)′(B_{k,d} − I)′.

From the representation α̂_OLS = Λ⁻¹Z′y, the MSEM of OLS is given as

(19) MSEM(α̂_OLS) = σ²Λ⁻¹.

Comparing (18) and (19),

(20) MSEM(α̂_OLS) − MSEM[α̂_MTPE(k, d, b)] = σ²(Λ⁻¹ − B_{k,d}Λ⁻¹B_{k,d}) − (B_{k,d} − I)(α − b)(α − b)′(B_{k,d} − I)′.

Let k > 0 and 0 < d < 1. Thus, the following theorem holds.

Theorem 3.

Consider the two competing homogeneous linear estimators α̂_OLS and α̂_MTPE(k, d, b). If k > 0 and 0 < d < 1, the estimator α̂_MTPE(k, d, b) is superior to α̂_OLS in the MSEM sense, that is, MSEM(α̂_OLS) − MSEM[α̂_MTPE(k, d, b)] > 0, if and only if

(21) (α − b)′(B_{k,d} − I)′[σ²(Λ⁻¹ − B_{k,d}Λ⁻¹B_{k,d})]⁻¹(B_{k,d} − I)(α − b) < 1.

Proof.

Using (17) and (19), the following is obtained:

(22) Cov(α̂_OLS) − Cov[α̂_MTPE(k, d, b)] = σ²(Λ⁻¹ − B_{k,d}Λ⁻¹B_{k,d}) = σ² diag{1/λ_i − (λ_i + kd)²/(λ_i(λ_i + k)²)}, i = 1, …, p.

Λ⁻¹ − B_{k,d}Λ⁻¹B_{k,d} will be positive definite (pd) if and only if (λ_i + k)² − (λ_i + kd)² > 0, or equivalently (λ_i + k) − (λ_i + kd) > 0. Since (λ_i + k) − (λ_i + kd) = k(1 − d) > 0 for 0 < d < 1 and k > 0, Λ⁻¹ − B_{k,d}Λ⁻¹B_{k,d} is pd. By Lemma 2, the proof is completed.

3.2. Comparison between the MTPE and RRE Using MSEM Criterion

From the representation α̂_RRE(k) = (Λ + kI)⁻¹Z′y, the bias vector and covariance matrix of RRE are given as follows:

(23)
Bias[α̂_RRE(k)] = −k(Λ + kI)⁻¹α,
Cov[α̂_RRE(k)] = σ²(Λ + kI)⁻¹Λ(Λ + kI)⁻¹.

Hence,

(24) MSEM[α̂_RRE(k)] = σ²B_kΛB_k + k²B_kαα′B_k,

where B_k = (Λ + kI)⁻¹. The difference between α̂_RRE(k) and α̂_MTPE(k, d, b) in the MSEM sense is as follows:

(25) MSEM[α̂_RRE(k)] − MSEM[α̂_MTPE(k, d, b)] = σ²(B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d}) + k²B_kαα′B_k − (B_{k,d} − I)(α − b)(α − b)′(B_{k,d} − I)′.

Let k > 0 and 0 < d < 1. Thus, the following theorem holds.

Theorem 4.

Consider the two biased competing homogeneous linear estimators α̂_RRE(k) and α̂_MTPE(k, d, b). If k > 0 and 0 < d < 1, the estimator α̂_MTPE(k, d, b) is superior to α̂_RRE(k) in the MSEM sense, that is, MSEM[α̂_RRE(k)] − MSEM[α̂_MTPE(k, d, b)] > 0, if and only if

(26) (α − b)′(B_{k,d} − I)′[σ²(B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d}) + k²B_kαα′B_k]⁻¹(B_{k,d} − I)(α − b) < 1.

3.3. Comparison between the MTPE and LE Using MSEM Criterion

From the representation α̂_LE(d) = (Λ + I)⁻¹(Λ + dI)Λ⁻¹Z′y, the bias vector and covariance matrix of LE are given as follows:

(27) Bias[α̂_LE(d)] = (B_d − I)α,
(28) Cov[α̂_LE(d)] = σ²B_dΛ⁻¹B_d.

Hence,

(29) MSEM[α̂_LE(d)] = σ²B_dΛ⁻¹B_d + (B_d − I)αα′(B_d − I)′,

where B_d = (Λ + I)⁻¹(Λ + dI). Considering the difference between (18) and (29),

(30) MSEM[α̂_LE(d)] − MSEM[α̂_MTPE(k, d, b)] = σ²D + b_1b_1′ − b_2b_2′,

where D = B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d}, b_1 = (B_d − I)α, and b_2 = (B_{k,d} − I)(α − b).

Theorem 5.

Consider the two biased competing homogeneous linear estimators α̂_LE(d) and α̂_MTPE(k, d, b). If k > 1 and 0 < d < 1, the estimator α̂_MTPE(k, d, b) is superior to α̂_LE(d) in the MSEM sense, that is, MSEM[α̂_LE(d)] − MSEM[α̂_MTPE(k, d, b)] > 0, if and only if

(31) (α − b)′(B_{k,d} − I)′[σ²(B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d}) + (B_d − I)αα′(B_d − I)′]⁻¹(B_{k,d} − I)(α − b) < 1.

Proof.

Using (17) and (28), the following is obtained by computation:

(32) σ²D = σ²(B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d}) = σ² diag{(λ_i + d)²/(λ_i(λ_i + 1)²) − (λ_i + kd)²/(λ_i(λ_i + k)²)}, i = 1, …, p.

B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d} will be positive definite (pd) if and only if (λ_i + d)²(λ_i + k)² − (λ_i + 1)²(λ_i + kd)² > 0. For 0 < d < 1 and k > 1, (λ_i + d)(λ_i + k) − (λ_i + 1)(λ_i + kd) = λ_i(k − 1)(1 − d) > 0, so the inequality holds. Therefore, B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d} is pd. By Lemma 2, the proof is completed.

3.4. Comparison between the MTPE and MRRE Using MSEM Criterion

From the representation α̂_MRRE(k, b) = (Λ + kI)⁻¹(Z′y + kb), the bias vector and covariance matrix of MRRE are given as follows:

(33) Bias[α̂_MRRE(k, b)] = −k(Λ + kI)⁻¹(α − b),
(34) Cov[α̂_MRRE(k, b)] = σ²(Λ + kI)⁻¹Λ(Λ + kI)⁻¹.

Hence,

(35) MSEM[α̂_MRRE(k, b)] = σ²B_kΛB_k + k²B_k(α − b)(α − b)′B_k,

where B_k = (Λ + kI)⁻¹. Considering the difference between (18) and (35),

(36) MSEM[α̂_MRRE(k, b)] − MSEM[α̂_MTPE(k, d, b)] = σ²(B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d}) + k²B_k(α − b)(α − b)′B_k − (B_{k,d} − I)(α − b)(α − b)′(B_{k,d} − I)′.

Theorem 6.

Consider the two biased competing homogeneous linear estimators α̂_MRRE(k, b) and α̂_MTPE(k, d, b). If k > 0 and 0 < d < 1, the estimator α̂_MTPE(k, d, b) is superior to α̂_MRRE(k, b) in the MSEM sense, that is, MSEM[α̂_MRRE(k, b)] − MSEM[α̂_MTPE(k, d, b)] > 0, if and only if

(37) (α − b)′(B_{k,d} − I)′[σ²(B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d}) + k²B_k(α − b)(α − b)′B_k]⁻¹(B_{k,d} − I)(α − b) < 1.

Proof.

Using (17) and (34), the following is obtained:

(38) σ²(B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d}) = σ² diag{λ_i/(λ_i + k)² − (λ_i + kd)²/(λ_i(λ_i + k)²)}, i = 1, …, p.

Evidently, for 0 < d < 1 and k > 0, B_kΛB_k − B_{k,d}Λ⁻¹B_{k,d} will be positive definite (pd). By Lemma 2, the proof is completed.

3.5. Comparison between the MTPE and MLE Using MSEM Criterion

From the representation α̂_MLE(d, b) = (Λ + I)⁻¹[(Λ + dI)Λ⁻¹Z′y + (1 − d)b], the bias vector and covariance matrix of MLE are given as follows:

(39) Bias[α̂_MLE(d, b)] = (B_d − I)(α − b),
(40) Cov[α̂_MLE(d, b)] = σ²B_dΛ⁻¹B_d.

Hence,

(41) MSEM[α̂_MLE(d, b)] = σ²B_dΛ⁻¹B_d + (B_d − I)(α − b)(α − b)′(B_d − I)′,

where B_d = (Λ + I)⁻¹(Λ + dI). The mean square error difference between (18) and (41) is given as

(42) Δ_1 = MSEM[α̂_MLE(d, b)] − MSEM[α̂_MTPE(k, d, b)] = σ²D + b_1b_1′ − b_2b_2′,

where D = B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d}, b_1 = (B_d − I)(α − b), and b_2 = (B_{k,d} − I)(α − b).

Theorem 7.

Consider the two biased competing homogeneous linear estimators α̂_MLE(d, b) and α̂_MTPE(k, d, b). If k > 1 and 0 < d < 1, the estimator α̂_MTPE(k, d, b) is superior to α̂_MLE(d, b) in the MSEM sense, that is, MSEM[α̂_MLE(d, b)] − MSEM[α̂_MTPE(k, d, b)] > 0, if and only if b_2′(σ²D + b_1b_1′)⁻¹b_2 < 1, where D = B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d}, b_1 = (B_d − I)(α − b), and b_2 = (B_{k,d} − I)(α − b).

Proof.

Using (17) and (40), the following is obtained:

(43) D = B_dΛ⁻¹B_d − B_{k,d}Λ⁻¹B_{k,d} = diag(τ_1, …, τ_p),

where, by computation,

(44) σ²D = σ² diag{(λ_i + d)²/(λ_i(λ_i + 1)²) − (λ_i + kd)²/(λ_i(λ_i + k)²)}, i = 1, …, p.

σ²D is positive definite if and only if (λ_i + d)²(λ_i + k)² − (λ_i + kd)²(λ_i + 1)² > 0, which holds for 0 < d < 1 and k > 1, as in the proof of Theorem 5. By Lemma 2, the proof is completed.

3.6. Comparison between the MTPE and TPE Using MSEM Criterion

From the representation α̂_TPE(k, d) = (Λ + kI)⁻¹(Λ + kdI)Λ⁻¹Z′y, the bias vector and covariance matrix of TPE are given as follows:

(45) Bias[α̂_TPE(k, d)] = (B_{k,d} − I)α,
(46) Cov[α̂_TPE(k, d)] = σ²B_{k,d}Λ⁻¹B_{k,d}.

Hence,

(47) MSEM[α̂_TPE(k, d)] = σ²B_{k,d}Λ⁻¹B_{k,d} + (B_{k,d} − I)αα′(B_{k,d} − I)′.

Considering the matrix difference between (18) and (47),

(48) Δ_2 = MSEM[α̂_TPE(k, d)] − MSEM[α̂_MTPE(k, d, b)] = (B_{k,d} − I)[αα′ − (α − b)(α − b)′](B_{k,d} − I)′.

Obviously, Δ_2 ≥ 0 if and only if αα′ − (α − b)(α − b)′ ≥ 0; thus, the following result holds.

Theorem 8.

The modified two-parameter estimator α̂_MTPE(k, d, b) is superior to the two-parameter estimator α̂_TPE(k, d) in the MSEM sense if and only if αα′ − (α − b)(α − b)′ ≥ 0.

4. Selection of Bias Parameters

Selecting appropriate biasing parameters is crucial in this study. The performance of the ridge estimator largely depends on the ridge parameter k, and several methods for estimating it have been proposed, including those of Hoerl and Kennard, Kibria, Muniz and Kibria, Aslam, Dorugade, Kibria and Banik, Lukman and Ayinde, Lukman et al., and others. For practical application of the new estimator, the optimum values of k and d are obtained. To obtain the optimum value of k, the value of d is assumed fixed.

Recall from equation (18) that the scalar mean squared error of MTPE is

(49) Δ = MSE[α̂_MTPE(k, d, b)] = σ² Σ_{i=1}^p (λ_i + kd)²/(λ_i(λ_i + k)²) + k²(1 − d)² Σ_{i=1}^p (α_i − b_i)²/(λ_i + k)².

Differentiating equation (49) with respect to k gives the following result:

(50) ∂Δ/∂k = −2σ²(1 − d) Σ_{i=1}^p (λ_i + kd)/(λ_i + k)³ + 2k(1 − d)² Σ_{i=1}^p λ_i(α_i − b_i)²/(λ_i + k)³.

Setting ∂Δ/∂k = 0 and solving for k gives

(51) k_i = σ²λ_i / [λ_i(α_i − b_i)² − d(λ_i(α_i − b_i)² + σ²)],

where σ² and α_i are replaced by their unbiased estimators σ̂² and α̂_i. The harmonic mean version is defined as

(52) k̂_HM = p / Σ_{i=1}^p (1/k̂_i), where k̂_i = σ̂²λ_i / [λ_i(α̂_i − b_i)² − d̂(λ_i(α̂_i − b_i)² + σ̂²)].

Recall that α̂_MTPE(k, 0, 0) = α̂_RRE(k); considering this special case, k̂_i in equation (51) becomes

(53) k̂_i = σ̂²/α̂_i²,

which is the estimate of k introduced by Hoerl and Kennard. Hoerl et al. defined the harmonic-mean version of the ridge parameter k as follows:

(54) k̂_HKB = pσ̂² / Σ_{i=1}^p α̂_i².
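Equations (51), (52), and (54) can be sketched numerically; the eigenvalues, canonical coefficients, σ̂², prior value b, and fixed d below are illustrative assumptions chosen so that every per-component k_i is positive:

```python
import numpy as np

# Illustrative canonical quantities (assumed, not from the paper)
lam = np.array([5.0, 1.0, 0.2])        # eigenvalues of X'X
alpha_hat = np.array([0.8, 0.1, 0.6])  # estimated canonical coefficients
sigma2_hat = 0.25                       # estimated error variance
b, d = 0.0, 0.02                        # prior value and fixed d

# Per-component k from equation (51)
s = lam * (alpha_hat - b) ** 2
k_i = sigma2_hat * lam / (s - d * (s + sigma2_hat))

# Harmonic mean of equation (52)
k_hm = len(k_i) / np.sum(1.0 / k_i)

# Hoerl-Kennard-Baldwin harmonic ridge parameter of equation (54)
k_hkb = len(alpha_hat) * sigma2_hat / np.sum(alpha_hat ** 2)
```

Here d = 0.02 satisfies the bound of Theorem 9 for these values, so each k_i, and hence the harmonic mean, is positive.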

The optimum value of d is obtained by differentiating equation (49) with respect to d with k fixed. The result is as follows:

(55) ∂Δ/∂d = 2kσ² Σ_{i=1}^p (λ_i + kd)/(λ_i(λ_i + k)²) + 2k²(d − 1) Σ_{i=1}^p (α_i − b_i)²/(λ_i + k)².

Setting ∂Δ/∂d = 0 gives

(56) d_MTPE = Σ_{i=1}^p [kλ_i(α_i − b_i)² − σ²λ_i] / Σ_{i=1}^p k[σ² + λ_i(α_i − b_i)²],

where σ² and α_i are replaced by their unbiased estimators σ̂² and α̂_i. Recall that α̂_MTPE(1, d, 0) = α̂_LE(d); considering this special case, d in equation (56) becomes

(57) d_liu = Σ_{i=1}^p λ_i(α̂_i² − σ̂²) / Σ_{i=1}^p (σ̂² + λ_iα̂_i²).

Equation (57) corresponds to the optimum value of d proposed by Liu, which is defined as follows:

(58) d_opt = Σ_{i=1}^p [(α̂_i² − σ̂²)/(λ_i + 1)²] / Σ_{i=1}^p [(σ̂² + λ_iα̂_i²)/(λ_i(λ_i + 1)²)].

Theorem 9.

If

(59) d̂ < min_i { λ_i(α̂_i − b_i)² / [λ_i(α̂_i − b_i)² + σ̂²] },

then the k̂_i in (52) are always positive.

Proof.

The values of k_i in (51) are always positive if λ_i(α_i − b_i)² − d(λ_i(α_i − b_i)² + σ²) > 0 for all i, since σ²λ_i > 0. This requires d < λ_i(α_i − b_i)²/[λ_i(α_i − b_i)² + σ²] for all i. The bound depends on the unknown parameters σ² and α_i, which are replaced by their unbiased estimators σ̂² and α̂_i.

The estimators of the parameters d and k in α̂_MTPE(k, d, b) can be obtained iteratively as follows:

Step 1: calculate d^ from (59).

Step 2: estimate k^HKB by using d^ in step 1.

Step 3: estimate d^MTPE from (56) by using the estimator k^HKB in step 2.

Step 4: if d̂_MTPE is negative, use d̂_MTPE = d̂. (d̂_MTPE can take negative values, whereas d̂ lies between 0 and 1.)
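The four steps above can be sketched as follows; the eigenvalues, coefficients, σ̂², and prior value are illustrative assumptions, and d̂ in Step 1 is taken slightly below the bound of equation (59):

```python
import numpy as np

# Illustrative canonical quantities (assumed, not from the paper)
lam = np.array([5.0, 1.0, 0.2])
alpha_hat = np.array([0.8, 0.1, 0.6])
sigma2_hat = 0.25
b = 0.0

s = lam * (alpha_hat - b) ** 2

# Step 1: d_hat strictly below the positivity bound of equation (59)
d_hat = 0.9 * np.min(s / (s + sigma2_hat))

# Step 2: HKB-type harmonic ridge parameter, equation (54)
k_hkb = len(alpha_hat) * sigma2_hat / np.sum(alpha_hat ** 2)

# Step 3: d from equation (56) using k_hkb
k = k_hkb
d_mtpe = np.sum(k * s - sigma2_hat * lam) / np.sum(k * (sigma2_hat + s))

# Step 4: fall back to d_hat if d_mtpe came out negative
if d_mtpe < 0:
    d_mtpe = d_hat
```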

5. Numerical Example and Monte-Carlo Simulation

The Hussain dataset, originally analyzed by Eledum and Zahri, is used in this study to illustrate the performance of the new estimator; it was also adopted in the study of Lukman et al. The data are provided in Table 1. The regression model is defined as follows:

(60) y_i = β_0 + β_1X_1 + β_2X_2 + β_3X_3 + ε_i,

where y_i represents the product value in the manufacturing sector, X_1 the value of imported intermediate commodities, X_2 the value of imported capital commodities, and X_3 the value of imported raw materials. The variance inflation factors are VIF_1 = 128.29, VIF_2 = 103.43, and VIF_3 = 70.87; λ_4 = 105.419, and the condition number of X′X is approximately 5660049. The variance inflation factors and the condition number both indicate the presence of severe multicollinearity.

Table 1.

y X1 X2 X3
115.20 38.10 30.40 7.00
134.30 39.20 32.40 12.50
151.00 36.30 31.40 2.30
169.00 31.10 28.40 3.60
170.80 40.00 31.40 7.00
187.50 55.00 37.00 6.00
205.20 55.00 50.00 4.00
235.70 47.00 42.00 8.00
257.70 47.00 28.10 8.70
276.70 50.00 44.70 4.50
327.00 69.00 50.00 8.50
353.80 85.00 61.40 39.20
419.50 88.00 76.10 17.70
489.00 91.00 88.70 32.90
594.90 285.00 203.30 121.00
807.60 448.00 615.90 133.90
1014.00 324.00 562.10 82.50
1208.00 281.00 716.00 99.30
1380.00 349.00 771.30 103.90
1518.00 508.40 807.40 87.70
1763.00 533.20 1222.00 217.10
1914.00 592.80 1188.00 184.90
2338.00 726.40 1478.00 227.90
2275.00 706.30 1434.00 221.40
2562.00 796.70 1630.00 250.50
2750.00 856.00 1759.00 269.50
3000.00 934.90 1930.00 294.90
2859.00 890.30 1833.00 280.60
3794.00 1185.00 2472.00 375.20
4848.00 1696.00 3581.00 539.50
4048.00 1458.00 3065.00 463.00

The prior information b = 0.95β̂, as used in the study of Li and Yang, is adopted. The estimated regression coefficients and mean square error values of the estimators OLSE, RRE, LE, MRRE, MLE, TPE, and MTPE are provided in Table 2. The values of k and d were computed using the estimators proposed in this study: k and d in equations (52) and (56) are obtained as 1036.427 and 0.0043, respectively. From Table 2, OLSE has the poorest performance among all the estimators, and the modified estimators (MLE, MRRE, and MTPE) outperform their counterparts. The proposed estimator MTPE outperforms all the other estimators.

Estimated regression coefficients and mean square error of estimators.

Estimates Estimators
OLSE RRE LE MRRE MLE TPE MTPE
β ^ 0 208.87 207.12 191.96 198.54 208.87 207.13 195.81
β ^ 1 −1.314 −1.314 −1.314 −1.314 −1.314 −1.314 −1.314
β ^ 2 1.515 1.515 1.515 1.514 1.515 1.515 1.514
β ^ 3 −2.017 −2.017 −2.006 −2.006 −2.017 −2.017 −2.004
MSE 1850.48 1822.65 1849.39 109.08 1564.06 1822.76 108.37

Also, we conducted a Monte Carlo simulation study to examine the performances of the estimators further. The simulation procedure of Lukman and Ayinde was used to generate the explanatory variables:

(61) X_{ij} = (1 − γ²)^{1/2} z_{ij} + γz_{ip}, i = 1, 2, …, n, j = 1, 2, …, p,

where the z_{ij} are independent standard normal pseudo-random numbers and γ² is the correlation between any two explanatory variables. The values of γ (reported as ρ in the tables) were taken as 0.85, 0.9, and 0.99, and the number of explanatory variables (p) was taken to be four.

The dependent variable is generated as follows:

(62) y_i = β_1X_1 + β_2X_2 + β_3X_3 + β_4X_4 + ε_i,

where ε_i ~ N(0, σ²). The parameter values were chosen such that β′β = 1, which is a common restriction in simulation studies of this type. The values of β are taken to be β_1 = 0.8, β_2 = 0.1, and β_3 = 0.6. Sample sizes of 50 and 100 were used, with three different values of σ (0.01, 0.1, and 1). The experiment is replicated 5000 times, and the estimated MSE is calculated as

(63) MSE(β̂) = (1/5000) Σ_{j=1}^{5000} (β̂_j − β)′(β̂_j − β),

where β̂_j denotes the vector of parameter estimates in the jth replication and β is the vector of true parameter values. The estimated MSEs of the estimators for different values of n, σ, and γ are shown in Tables 3–6. The results show that the estimated MSE increases as the error variance increases and as the degree of multicollinearity increases. RRE, MRRE, LE, MLE, TPE, and MTPE all have smaller MSE than the OLS estimator, and the proposed estimator MTPE outperforms the other estimators, depending on the choice of prior information. The results of the simulation study support the real-life analysis in this paper.
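The design of equations (61)-(63) can be sketched as follows; the seed and the reduced replication count are illustrative, and β_4 = 0 is an assumption made here so that β′β is close to 1 with the three stated coefficients:

```python
import numpy as np

# Sketch of the simulation design in equations (61)-(63)
rng = np.random.default_rng(42)
n, p, gamma, sigma = 50, 4, 0.9, 0.1
beta = np.array([0.8, 0.1, 0.6, 0.0])  # beta_4 = 0 assumed, beta'beta ~ 1

def make_X():
    # Equation (61): p columns sharing one common standard-normal component,
    # giving pairwise correlation gamma^2 between regressors
    z = rng.normal(size=(n, p + 1))
    return np.sqrt(1 - gamma ** 2) * z[:, :p] + gamma * z[:, [p]]

reps = 200  # fewer than the paper's 5000 replications, for speed
mse = 0.0
for _ in range(reps):
    X = make_X()
    y = X @ beta + sigma * rng.normal(size=n)      # equation (62)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    mse += (b_ols - beta) @ (b_ols - beta)          # equation (63) summand
mse /= reps
```

Replacing the OLS line with any of the biased estimators (ridge, Liu, TPE, MTPE) reproduces the corresponding columns of Tables 3-6 under this design.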

Estimated MSE values of the OLSE, RRE, MRRE, LE, and MLE when n = 50.

Estimators ρ = 0.85
k = 0.01 k = 0.05 k = 0.1
σ  = 0.01 σ  = 0.1 σ  = 1 σ  = 0.01 σ  = 0.1 σ  = 1 σ  = 0.01 σ  = 0.1 σ  = 1
OLS 3.46E − 05 0.003874 0.333639 3.46E − 05 0.003874 0.333639 3.46E − 05 0.003874 0.333639
RRE 2.52E − 05 0.003185 0.311938 3.14E − 05 0.003145 0.319639 3.08E − 05 0.002959 0.297564
MRRE 1.57E − 05 1.98E − 03 1.87E − 01 1.95E − 05 1.95E − 03 1.99E − 01 1.91E − 05 1.84E − 03 2.37E − 01
ρ = 0.9
OLS 7.76E − 05 0.007438 0.719922 7.76E − 05 0.007438 0.719922 7.76E − 05 0.007438 0.719922
RRE 6.79E − 05 0.006505 0.629549 6.26E − 05 0.006321 0.629699 5.89E − 05 0.00608 0.579048
MRRE 4.22E − 05 4.04E − 03 3.78E − 01 3.89E − 05 3.93E − 03 3.91E − 01 3.66E − 05 3.78E − 03 4.61E − 01
ρ = 0.99
OLS 0.004043 0.408836 40.89173 0.004043 0.408836 40.89173 0.004043 0.408836 40.89173
RRE 0.002534 0.26822 24.79705 0.001099 0.109383 10.50059 0.00064 0.062891 6.359499
MRRE 0.001571 0.166296 15.37417 0.000681 0.067817 6.510364 0.000397 0.038992 3.942889

Estimators ρ = 0.85
d = 0.01 d = 0.05 d = 0.1
0.01 0.1 1 0.01 0.1 1 0.01 0.1 1

OLS 3.46E − 05 0.003874 0.333639 3.46E − 05 0.003874 0.333639 3.46E − 05 0.003874 0.333639
LE 3.15E − 05 2.02E − 03 1.92E − 01 2.97E − 05 2.00E − 03 2.11E − 01 3.03E − 05 2.18E − 03 1.96E − 01
MLE 1.96E − 05 1.25E − 03 1.20E − 01 1.85E − 05 1.25E − 03 1.31E − 01 1.88E − 05 1.35E − 03 1.22E − 01
ρ = 0.9
OLS 7.76E − 05 0.007438 0.719922 7.76E − 05 0.007438 0.719922 7.76E − 05 0.007438 0.719922
LE 4.75E − 05 3.31E − 03 3.30E − 01 4.66E − 05 3.43E − 03 3.23E − 01 4.25E − 05 3.39E − 03 3.43E − 01
MLE 2.95E − 05 2.05E − 03 1.98E − 01 2.89E − 05 2.13E − 03 2.01E − 01 2.64E − 05 2.11E − 03 2.73E − 01
ρ = 0.99
OLS 0.004043 0.408836 40.89173 0.004043 0.408836 40.89173 0.004043 0.408836 40.89173
LE 4.45E − 05 3.00E − 03 2.79E − 01 6.72E − 05 5.84E − 03 5.25E − 01 1.00E − 04 8.96E − 03 9.42E − 01
MLE 2.76E − 05 1.86E − 03 1.68E − 01 4.17E − 05 3.63E − 03 3.26E − 01 6.22E − 05 5.57E − 03 7.50E − 01

Estimated MSE values of the OLSE, TPE, and MTPE when n = 50.

Rho 0.85 0.9 0.99
d k Sigma OLSE TPE MTPE OLSE TPE MTPE OLSE TPE MTPE
0.01 0.01 0.01 3.46E − 05 2E − 05 9.13E − 06 7.76E − 05 4.14E − 05 1.9E − 05 0.004043 0.00212 0.000969
0.1 0.003874 0.001977 0.000181 0.007438 0.003827 0.00035 0.408836 0.22438 0.020514
1 0.333639 0.188943 0.001727 0.719922 0.382846 0.0035 40.89173 20.74408 0.189653
0.05 0.01 3.46E − 05 1.99E − 05 9.09E − 06 7.76E − 05 3.78E − 05 1.7E − 05 0.004043 0.000919 0.00042
0.1 0.003874 0.00189 0.000173 0.007438 0.003928 0.00036 0.408836 0.091505 0.008366
1 0.333639 0.190314 0.00174 0.719922 0.384937 0.00352 40.89173 8.784307 0.080311
0.1 0.01 3.46E − 05 1.86E − 05 8.52E − 06 7.76E − 05 3.73E − 05 1.7E − 05 0.004043 0.000535 0.000245
0.1 0.003874 0.002014 0.000184 0.007438 0.003826 0.00035 0.408836 0.052611 0.00481
1 0.333639 0.196492 0.001796 0.719922 0.367537 0.00336 40.89173 5.320063 0.048639

0.05 0.01 0.01 3.46E − 05 1.89E − 05 8.64E − 06 7.76E − 05 4.08E − 05 1.9E − 05 0.004043 0.002114 0.000966
0.1 0.003874 0.001943 0.000178 0.007438 0.003912 0.00036 0.408836 0.224209 0.020498
1 0.333639 0.205892 0.001882 0.719922 0.368826 0.00337 40.89173 20.64197 0.18872
0.05 0.01 3.46E − 05 1.98E − 05 9.03E − 06 7.76E − 05 3.78E − 05 1.7E − 05 0.004043 0.000961 0.000439
0.1 0.003874 0.001967 0.00018 0.007438 0.003769 0.00034 0.408836 0.09861 0.009015
1 0.333639 0.186682 0.001707 0.719922 0.397967 0.00364 40.89173 9.495418 0.086812
0.1 0.01 3.46E − 05 1.97E − 05 9.02E − 06 7.76E − 05 3.96E − 05 1.8E − 05 0.004043 0.000585 0.000267
0.1 0.003874 0.001951 0.000178 0.007438 0.003655 0.00033 0.408836 0.060326 0.005515
1 0.333639 0.205296 0.001877 0.719922 0.353004 0.00323 40.89173 5.839301 0.053386

0.1 0.01 0.01 3.46E − 05 1.84E − 05 8.4E − 06 7.76E − 05 3.69E − 05 1.7E − 05 0.004043 0.002045 0.000935
0.1 0.003874 0.001917 0.000175 0.007438 0.003819 0.00035 0.408836 0.209985 0.019198
1 0.333639 0.191645 0.001752 0.719922 0.385686 0.00353 40.89173 22.09732 0.202026
0.05 0.01 3.46E − 05 1.95E − 05 8.91E − 06 7.76E − 05 3.84E − 05 1.8E − 05 0.004043 0.001 0.000457
0.1 0.003874 0.001859 0.00017 0.007438 0.003834 0.00035 0.408836 0.102129 0.009337
1 0.333639 0.192989 0.001764 0.719922 0.365375 0.00334 40.89173 10.09752 0.092317
0.1 0.01 3.46E − 05 2.01E − 05 9.19E − 06 7.76E − 05 3.56E − 05 1.6E − 05 0.004043 0.000656 0.0003
0.1 0.003874 0.002062 0.000189 0.007438 0.003699 0.00034 0.408836 0.058421 0.005341
1 0.333639 0.18755 0.001715 0.719922 0.366358 0.00335 40.89173 6.172316 0.056431

Estimated MSE values of the OLSE, RRE, MRRE, LE, and MLE when n = 100.

ρ = 0.85
k = 0.01 k = 0.05 k = 0.1
σ  = 0.01 σ  = 0.1 σ  = 1 σ  = 0.01 σ  = 0.1 σ  = 1 σ  = 0.01 σ  = 0.1 σ  = 1
OLS 0.00578 0.54722 0.57255 0.00578 0.547223 0.57255 0.005784 0.547223 0.572553
RRE 1.77E − 05 0.00176 0.16786 1.77E − 05 0.001679 0.16908 1.66E − 05 0.001789 0.174567
MRRE 1.63E − 05 0.0017 0.07039 7.42E − 06 0.000254 0.07131 8.04E − 06 0.000848 0.066346
ρ = 0.9
OLS 9.19887 8.5034 8.50041 9.19887 8.5034 8.50041 9.198873 8.5034 8.500411
RRE 3.68E − 05 0.0034 0.34013 3.36E − 05 0.00349 0.34198 3.31E − 05 0.003399 0.326526
MRRE 2.76E − 05 2.55E − 03 2.55E − 01 2.52E − 05 2.62E − 03 2.56E − 01 2.48E − 05 2.55E − 03 2.45E − 01
ρ = 0.99
OLS 1.80265 171.595 169.891 1.80265 171.595 169.891 1.802645 171.595 169.8914
RRE 0.00188 0.19934 18.4294 0.00082 0.081294 7.80411 0.000476 0.046741 4.726424
MRRE 0.00117 0.12359 11.4262 0.00051 0.050402 4.83855 0.000295 0.028979 2.930383

Sigma ρ = 0.85
d = 0.01 d = 0.05 d = 0.1
0.01 0.1 1 0.01 0.1 1 0.01 0.1 1

OLSE 0.00578 0.54722 0.57255 0.00578 0.547223 0.57255 0.005784 0.547223 0.572553
LE 2.34E − 05 1.50E − 03 1.43E − 01 2.21E − 05 1.49E − 03 1.57E − 01 2.25E − 05 1.62E − 03 1.46E − 01
MLE 1.45E − 05 9.32E − 04 8.88E − 02 1.37E − 05 9.25E − 04 9.75E − 02 1.40E − 05 1.01E − 03 9.07E − 02
ρ = 0.9
OLSE 9.19887 8.5034 8.50041 9.19887 8.5034 8.50041 9.198873 8.5034 8.500411
LE 3.53E − 05 2.46E − 03 2.45E − 01 3.46E − 05 2.55E − 03 2.40E − 01 3.16E − 05 2.52E − 03 2.55E − 01
MLE 2.19E − 05 1.53E − 03 1.47E − 01 2.15E − 05 1.58E − 03 1.49E − 01 1.96E − 05 1.57E − 03 2.03E − 01
ρ = 0.99
OLSE 1.80265 171.595 169.891 1.80265 171.595 169.891 1.802645 171.595 169.8914
LE 3.31E − 05 2.23E − 03 2.08E − 01 4.99E − 05 4.34E − 03 3.90E − 01 7.44E − 05 6.66E − 03 7.00E − 01
MLE 2.05E − 05 1.39E − 03 1.25E − 01 3.10E − 05 2.70E − 03 2.42E − 01 4.62E − 05 4.14E − 03 5.57E − 01

Estimated MSE values of the OLSE, TPE, and MTPE when n = 100.

d k σ ρ = 0.85 ρ = 0.9 ρ = 0.99
OLSE TPE MTPE OLSE TPE MTPE OLSE TPE MTPE
0.01 0.01 0.01 5.78E − 03 1.77E − 05 8.11E − 06 9.20E + 00 3.68E − 05 1.68E − 05 1.80E + 00 1.88E − 03 8.61E − 04
0.1 5.47E − 01 1.76E − 03 1.61E − 04 8.50E + 00 3.40E − 03 3.11E − 04 1.72E + 02 1.99E − 01 1.82E − 02
1 5.73E − 01 1.68E − 01 1.54E − 03 8.50E + 00 3.40E − 01 3.11E − 03 1.70E + 02 1.84E + 01 1.68E − 01
0.05 0.01 5.78E − 03 1.77E − 05 8.08E − 06 9.20E + 00 3.36E − 05 1.54E − 05 1.80E + 00 8.16E − 04 3.73E − 04
0.1 5.47E − 01 1.68E − 03 1.53E − 04 8.50E + 00 3.49E − 03 3.19E − 04 1.72E + 02 8.13E − 02 7.43E − 03
1 5.73E − 01 1.69E − 01 1.55E − 03 8.50E + 00 3.42E − 01 3.13E − 03 1.70E + 02 7.80E + 00 7.13E − 02
0.1 0.01 5.78E − 03 1.66E − 05 7.57E − 06 9.20E + 00 3.31E − 05 1.51E − 05 1.80E + 00 4.76E − 04 2.17E − 04
0.1 5.47E − 01 1.79E − 03 1.64E − 04 8.50E + 00 3.40E − 03 3.11E − 04 1.72E + 02 4.67E − 02 4.27E − 03
1 5.73E − 01 1.75E − 01 1.60E − 03 8.50E + 00 3.27E − 01 2.99E − 03 1.70E + 02 4.73E + 00 4.32E − 02

0.05 0.01 0.01 5.78E − 03 1.68E − 05 7.68E − 06 9.20E + 00 3.63E − 05 1.66E − 05 1.80E + 00 1.88E − 03 8.58E − 04
0.1 5.47E − 01 1.73E − 03 1.58E − 04 8.50E + 00 3.48E − 03 3.18E − 04 1.72E + 02 1.99E − 01 1.82E − 02
1 5.73E − 01 1.83E − 01 1.67E − 03 8.50E + 00 3.28E − 01 3.00E − 03 1.70E + 02 1.83E + 01 1.68E − 01
0.05 0.01 5.78E − 03 1.76E − 05 8.03E − 06 9.20E + 00 3.36E − 05 1.54E − 05 1.80E + 00 8.54E − 04 3.90E − 04
0.1 5.47E − 01 1.75E − 03 1.60E − 04 8.50E + 00 3.35E − 03 3.06E − 04 1.72E + 02 8.76E − 02 8.01E − 03
1 5.73E − 01 1.66E − 01 1.52E − 03 8.50E + 00 3.54E − 01 3.23E − 03 1.70E + 02 8.44E + 00 7.71E − 02
0.1 0.01 5.78E − 03 1.75E − 05 8.01E − 06 9.20E + 00 3.52E − 05 1.61E − 05 1.80E + 00 5.20E − 04 2.38E − 04
0.1 5.47E − 01 1.73E − 03 1.58E − 04 8.50E + 00 3.25E − 03 2.97E − 04 1.72E + 02 5.36E − 02 4.90E − 03
1 5.73E − 01 1.82E − 01 1.67E − 03 8.50E + 00 3.14E − 01 2.87E − 03 1.70E + 02 5.19E + 00 4.74E − 02

0.1 0.01 0.01 5.78E − 03 1.63E − 05 7.46E − 06 9.20E + 00 3.28E − 05 1.50E − 05 1.80E + 00 1.82E − 03 8.30E − 04
0.1 5.47E − 01 1.70E − 03 1.56E − 04 8.50E + 00 3.39E − 03 3.10E − 04 1.72E + 02 1.87E − 01 1.71E − 02
1 5.73E − 01 1.70E − 01 1.56E − 03 8.50E + 00 3.43E − 01 3.13E − 03 1.70E + 02 1.96E + 01 1.79E − 01
0.05 0.01 5.78E − 03 1.73E − 05 7.91E − 06 9.20E + 00 3.41E − 05 1.56E − 05 1.80E + 00 8.88E − 04 4.06E − 04
0.1 5.47E − 01 1.65E − 03 1.51E − 04 8.50E + 00 3.41E − 03 3.11E − 04 1.72E + 02 9.07E − 02 8.30E − 03
1 5.73E − 01 1.71E − 01 1.57E − 03 8.50E + 00 3.25E − 01 2.97E − 03 1.70E + 02 8.97E + 00 8.20E − 02
0.1 0.01 5.78E − 03 1.79E − 05 8.16E − 06 9.20E + 00 3.16E − 05 1.44E − 05 1.80E + 00 5.83E − 04 2.66E − 04
0.1 5.47E − 01 1.83E − 03 1.68E − 04 8.50E + 00 3.29E − 03 3.00E − 04 1.72E + 02 5.19E − 02 4.75E − 03
1 5.73E − 01 1.67E − 01 1.52E − 03 8.50E + 00 3.25E − 01 2.98E − 03 1.70E + 02 5.48E + 00 5.01E − 02
6. Conclusions

In this article, we proposed a modified two-parameter estimator (MTPE) to overcome the multicollinearity problem in a linear regression model and established its superiority over other existing estimators in terms of the matrix mean squared error criterion. The new estimator includes the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge regression estimator (MRRE), and the modified Liu estimator (MLE) as special cases. Finally, a numerical example and a simulation study were conducted to illustrate the theoretical results; both show that the proposed estimator MTPE is superior to the competing estimators.

Data Availability

The data used to support the findings of this study are included in Table 1.

Disclosure

This manuscript is accepted for a poster session, available in the following link: http://www.isi2019.org/wp-content/uploads/2019/03/CPS-list-by-CPS-POSTER-by-CPS_Title-no-1-March-2019.pdf.

Conflicts of Interest

There are no conflicts of interest regarding the publication of this paper.

References

1. Hoerl A. E., Kennard R. W. Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 1970, 12(1), 55–67.
2. Stein C. Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 1956, Berkeley, CA, USA, 197–206.
3. Massy W. F. Principal components regression in exploratory statistical research. Journal of the American Statistical Association, 1965, 60(309), 234–266.
4. Mayer L. S., Willke T. A. On biased estimation in linear models. Technometrics, 1973, 15(3), 497–508.
5. Swindel B. F. Good ridge estimators based on prior information. Communications in Statistics-Theory and Methods, 1976, 5(11), 1065–1075.
6. Liu K. A new class of biased estimate in linear regression. Communications in Statistics-Theory and Methods, 1993, 22, 393–402.
7. Li Y., Yang H. A new Liu-type estimator in linear regression model. Statistical Papers, 2012, 53(2), 427–437.
8. Özkale M. R., Kaçiranlar S. The restricted and unrestricted two-parameter estimators. Communications in Statistics-Theory and Methods, 2007, 36(15), 2707–2725.
9. Farebrother R. W. Further results on the mean square error of ridge regression. Journal of the Royal Statistical Society: Series B (Methodological), 1976, 38(3), 248–250.
10. Trenkler G., Toutenburg H. Mean squared error matrix comparisons between biased estimators - an overview of recent results. Statistical Papers, 1990, 31(1), 165–179.
11. Kibria B. M. G. Performance of some new ridge regression estimators. Communications in Statistics - Simulation and Computation, 2003, 32(2), 419–435.
12. Muniz G., Kibria B. M. G. On some ridge regression estimators: an empirical comparison. Communications in Statistics - Simulation and Computation, 2009, 38(3), 621–630.
13. Aslam M. Performance of Kibria's method for the heteroscedastic ridge regression model: some Monte Carlo evidence. Communications in Statistics - Simulation and Computation, 2014, 43(4), 673–686.
14. Dorugade A. V. New ridge parameters for ridge regression. Journal of the Association of Arab Universities for Basic and Applied Sciences, 2014, 15(1), 94–99.
15. Kibria B. M. G., Banik S. Some ridge regression estimators and their performances. Journal of Modern Applied Statistical Methods, 2016, 15(1), 206–238.
16. Lukman A. F., Ayinde K. Review and classifications of the ridge parameter estimation techniques. Hacettepe Journal of Mathematics and Statistics, 2016, 46.
17. Lukman A. F., Ayinde K., Ajiboye A. S. Monte Carlo study of some classification-based ridge parameter estimators. Journal of Modern Applied Statistical Methods, 2017, 16(1), 428–451.
18. Hoerl A. E., Kennard R. W., Baldwin K. F. Ridge regression: some simulations. Communications in Statistics, 1975, 4(2), 105–123.
19. Eledum H., Zahri M. Relaxation method for two stages ridge regression estimator. International Journal of Pure and Applied Mathematics, 2013, 85(4), 653–667.
20. Lukman A. F., Osowole O. I., Ayinde K. Two stage robust ridge method in a linear regression model. Journal of Modern Applied Statistical Methods, 2015, 14(2), 53–67.