This paper presents a full rank factorization of a 2×2 block matrix without any restriction on its blocks. Applying this factorization, we obtain an explicit representation of the group inverse in terms of the four individual blocks of the partitioned matrix, again without restrictions. We also derive some important coincidence theorems, including expressions of the group inverse in Banachiewicz-Schur form.
1. Introduction
Let ℂm×n denote the set of all m×n complex matrices. We use R(A), N(A), and r(A) to denote the range, the null space, and the rank of a matrix A, respectively. The Moore-Penrose inverse of a matrix A∈ℂm×n is a matrix X∈ℂn×m which satisfies
(1)AXA=A,XAX=X,(AX)*=AX,(XA)*=XA.
The Moore-Penrose inverse of A is unique, and it is denoted by A†.
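These four conditions are easy to verify numerically. The following sketch (our own illustration, using NumPy's SVD-based `pinv`) checks them for a rank-deficient matrix:

```python
import numpy as np

# A rank-deficient rectangular matrix; the four Penrose conditions
# single out a unique X, computed here with numpy's pinv (SVD-based).
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])               # rank 1
X = np.linalg.pinv(A)

c1 = np.allclose(A @ X @ A, A)             # (1) AXA = A
c2 = np.allclose(X @ A @ X, X)             # (2) XAX = X
c3 = np.allclose((A @ X).conj().T, A @ X)  # (3) (AX)* = AX
c4 = np.allclose((X @ A).conj().T, X @ A)  # (4) (XA)* = XA
print(c1, c2, c3, c4)
```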
Recall that the group inverse of a square matrix A∈ℂn×n is the unique matrix X∈ℂn×n satisfying
(2)AXA=A,XAX=X,AX=XA.
The matrix X, when it exists, is called the group inverse of A and is denoted by A#.
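For a singular matrix of index 1, the group inverse can be computed from any full rank factorization A=FG as A#=F(GF)-2G, a representation used in Section 2. A minimal numerical sketch (the matrix is our own illustrative choice):

```python
import numpy as np

# A singular matrix with ind(A) = 1 (rank 1, and GF below is invertible).
A = np.array([[2., 2.],
              [1., 1.]])

# A full rank factorization A = F G.
F = np.array([[2.],
              [1.]])
G = np.array([[1., 1.]])

# For ind(A) = 1, the group inverse is X = F (GF)^{-2} G.
GF_inv = np.linalg.inv(G @ F)
X = F @ GF_inv @ GF_inv @ G

print(np.allclose(A @ X @ A, A))   # AXA = A
print(np.allclose(X @ A @ X, X))   # XAX = X
print(np.allclose(A @ X, X @ A))   # AX = XA
```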
Partitioned matrices are very useful in investigating various properties of generalized inverses and hence are widely used in matrix theory and its applications (see [1–4]). There are various useful ways to write a matrix as the product of two or three other matrices that have special properties. For example, linear algebra texts relate Gaussian elimination to the LU factorization and the Gram-Schmidt process to the QR factorization. In this paper, we consider a factorization based on the full rank factorization of a matrix. Our purpose is to provide an integrated theoretical development of, and setting for understanding, a number of topics in linear algebra, such as the Moore-Penrose inverse and the group inverse.
A full rank factorization of A is in the form
(3)A=FAGA,
where FA is of full column rank and GA is of full row rank. Although this factorization is not unique, any choice in (3) is acceptable throughout the paper.
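One concrete way to produce such a factorization numerically (an illustrative choice, not the only one) is via the compact singular value decomposition:

```python
import numpy as np

def full_rank_factorization(A, tol=1e-12):
    """One possible full rank factorization A = F G built from the
    compact SVD; any other valid choice (e.g. from row reduction)
    is equally acceptable."""
    U, s, Vh = np.linalg.svd(A)
    r = int(np.sum(s > tol))        # numerical rank
    F = U[:, :r] * s[:r]            # m x r, full column rank
    G = Vh[:r, :]                   # r x n, full row rank
    return F, G

A = np.array([[1., 2.], [2., 4.], [3., 6.]])   # rank 1
F, G = full_rank_factorization(A)
print(F.shape, G.shape)                        # (3, 1) (1, 2)
print(np.allclose(F @ G, A))                   # True
```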
For a complex matrix 𝒜 of the form
(4)𝒜=[ABCD]∈ℂ(m+s)×(n+t),
in the case when m=n and A is invertible, the Schur complement of A in 𝒜 is defined by S=D-CA-1B. Sometimes, we denote the Schur complement of A in 𝒜 by (𝒜/A). Similarly, if s=t and D is invertible, then the Schur complement of D in 𝒜 is defined by T=A-BD-1C.
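Two classical facts make the Schur complement useful here: det 𝒜 = det A · det(𝒜/A), and the (2,2) block of 𝒜-1 equals S-1. A small numerical sketch with illustrative blocks of our own choosing:

```python
import numpy as np

# Illustrative numeric blocks (not from the paper) with A invertible.
A = np.array([[2., 0.], [0., 1.]])
B = np.array([[1.], [0.]])
C = np.array([[1., 0.]])
D = np.array([[5.]])

S = D - C @ np.linalg.inv(A) @ B          # Schur complement of A
M = np.block([[A, B], [C, D]])

# Classical identities: det(M) = det(A) det(M/A), and the trailing
# block of M^{-1} equals S^{-1} (Banachiewicz-Schur).
print(np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S)))
print(np.isclose(np.linalg.inv(M)[-1, -1], 1.0 / S[0, 0]))
```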
In the case when A is not invertible, the generalized Schur complement of A in 𝒜 is defined by
(5)S=D-CA†B.
Similarly, the generalized Schur complement of D in 𝒜 is defined by
(6)T=A-BD†C.
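When A (or D) is singular, the formulas above simply replace the ordinary inverse by the Moore-Penrose inverse. A short sketch with illustrative numeric blocks:

```python
import numpy as np

# Generalized Schur complements: the ordinary inverse is replaced by
# the Moore-Penrose inverse (the numbers here are illustrative only).
A = np.array([[1., 0.], [0., 0.]])     # singular
B = np.array([[1.], [1.]])
C = np.array([[1., 1.]])
D = np.array([[3.]])

S = D - C @ np.linalg.pinv(A) @ B      # generalized Schur complement of A
T = A - B @ np.linalg.pinv(D) @ C      # generalized Schur complement of D
print(S)                               # [[2.]]
print(T)
```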
The Schur complement and the generalized Schur complement have important applications in matrix theory, statistics, numerical analysis, applied mathematics, and so forth.
There is a great deal of work [5–8] on representations of the generalized inverses of 𝒜. Various other generalized inverses have also been studied by many researchers, for example, Burns et al. [6], Marsaglia and Styan [8], Benítez and Thome [9], Cvetković-Ilić et al. [10], Miao [11], and Chen et al. [12], and the references therein. The concept of a group inverse has numerous applications in matrix theory, from the convergence of Markov chains to generalized inverses and matrix equations. Furthermore, the group inverse of a block matrix has many applications in singular differential equations, Markov chains, iterative methods, and so forth [13–17]. Some results for the group inverse of a 2×2 block matrix (operator) can be found in [18–30]. Most works in the literature concerning representations for the group inverses of partitioned matrices were carried out under certain restrictions on their blocks. Very recently, Yan [31] obtained an explicit representation of the Moore-Penrose inverse in terms of the four individual blocks of the partitioned matrix by using a full rank factorization without any restriction. This motivates us to investigate representations of the group inverse without such restrictions.
In this paper, we aim at a new method for representing the group inverse, since no representation of 𝒜# or 𝒜D is known for arbitrary A, B, C, and D. The outline of our paper is as follows. In Section 2, we first present a full rank factorization of 𝒜 using previous results by Marsaglia and Styan [8]. Inspired by this factorization, we extend the analysis to obtain an explicit representation of the group inverse of 𝒜 without any restriction. Furthermore, we discuss various special forms with the corresponding consequences, including Banachiewicz-Schur forms and some other extensions as well.
2. Representation of the Group Inverse: General Case
Yan [31] initially considered the representation of the Moore-Penrose inverse of the partitioned matrix by using the full rank factorization technique. The following result is borrowed from [31, Theorem 2.2].
For convenience, we first state some notations which will be helpful throughout the paper:
(7)Pα=I-αα-, Qα=I-α-α, where α-∈α{1},
(8)S=D-CA†B, E=PAB, W=CQA, R=PWSQE.
Let A, E, W, R have the full rank factorizations
(9)A=FAGA,E=FEGE,W=FWGW,R=FRGR,
respectively; then there is a full rank factorization of the block matrix 𝒜:
(10)𝒜=FG=[FA00FECGA†FRFWPWSGE†][GAFA†B0GRGWFW†S0GE].
Now, the Moore-Penrose inverse of 𝒜 can be expressed as 𝒜†=G†F†. In particular, when A is group invertible, let S=D-CA#B; then the full rank factorization of 𝒜 is
(11)𝒜=[FA00FECA#FAFRFWPWSGE†][GAGAA#B0GRGWFW†S0GE].
This motivates us to obtain some new results concerning the group inverse by using the full rank factorization related to the group inverse.
Recall that if a matrix A∈ℂn×n is group invertible (which is the case exactly when ind(A)=1), then A# can be expressed in terms of A{1}; that is,
(12)A#=A(A(1))3A.
In particular, we have
(13)A#=A(A†)3A.
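As a sanity check of (13), the following sketch uses a symmetric (hence EP) singular matrix, for which the group inverse exists and coincides with the Moore-Penrose inverse; the example matrix is our own choice:

```python
import numpy as np

# Formula (13) illustrated on a symmetric (hence EP) singular matrix,
# where the group and Moore-Penrose inverses coincide.
A = np.array([[1., 1.],
              [1., 1.]])               # rank 1, ind(A) = 1

Ap = np.linalg.pinv(A)
X = A @ Ap @ Ap @ Ap @ A               # candidate A# = A (A†)^3 A

print(np.allclose(A @ X @ A, A))       # AXA = A
print(np.allclose(X @ A @ X, X))       # XAX = X
print(np.allclose(A @ X, X @ A))       # AX = XA
print(np.allclose(X, Ap))              # for this EP matrix, A# = A†
```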
The following result follows by using [31, Theorem 3.6] and (13).
Theorem 1.
Let 𝒜 be defined by (4); then the group inverse of 𝒜 can be expressed as
(14)𝒜#=[ABCD]×[(V5+V5V3V2*V1V2-V4V1V2)V3×[W†00E†](U3U5+U2*U1U2U5-U2*U1U4)+(V4-V5V3V2*)V1[A†00R†]×U1(U4-U2U5)]3[ABCD],
where
(15)F1=[FA†00FR†],F2=[FW†00FE†],G1=[GA†*00GR†*],G2=[GW†*00GE†*],U1=[X3-1-X3-1HPWX2-1X4-X4*X2-1PWH*X3-1X4+X4*X2-1PWH*X3-1HPWX2-1X4],U2=[HHPWH1*X1-1IPWH1*X1-1],U3=[I00X1-1],U4=[IH0I],U5=[0WW†EE†H1PW],V1=[Y3-1-Y3-1KQEY2-1Y4-Y4Y2-1QEK*Y3-1Y4+Y4Y2-1QEK*Y3-1KQEY2-1Y4],V2=[KK1*KK1*0],V3=[Y1-1-Y1-1K1-K1*Y1-1I+K1*Y1-1K1],V4=[I0K*I],V5=[W†W0K1*E†E],
with
(16)H=A†*C*,H1=E†*S*,K=A†B,K1=W†S,X1=I+H1PWH1*,X2=I+PWH1*H1PW,X3=I+HPW(X2-1-X2-1X4X2-1)PWH*,X4=(RR†X2-1RR†)†,Y1=I+K1QEK1*,Y2=I+QEK1*K1QE,Y3=I+KQE(Y2-1-Y2-1Y4Y2-1)QEK*,Y4=(R†RY2-1R†R)†.
If the (1,1)-block A of 𝒜 is group invertible, we immediately have Theorem 2 by using the full rank factorization (11).
Theorem 2.
Let 𝒜 be defined by (4). Suppose A is group invertible; then the group inverse of 𝒜 can be expressed as
(17)𝒜#=[ABCD]×[(V5+V5V3V2*V1V2-V4V1V2)V3×[W†00E†](U3U5+U2*U1U2U5-U2*U1U4)+(V4-V5V3V2*)V1[A†00R†]×U1(U4-U2U5)]3[ABCD],
where H=A#*C*, K=A#B, and H1, K1, U1, U2, U3, U4, U5, V1, V2, V3, V4, V5, X1, X2, X3, X4, Y1, Y2, Y3, Y4 are the same as those in Theorem 1.
The two representations of F†, G† (which can be found in [31, Theorem 3.1]),
(18)F†=[F1U1U4-F1U1U2U5-F2U2*U1U4+F2(U3+U2*U1U2)U5],(19)G†=[V4V1G1*-V5V3V2*V1G1*,-V4V1V2V3G2*+V5(V3+V3V2*V1V2V3)G2*],
will be helpful in the proofs of the following results.
Theorem 3.
Let 𝒜 be defined by (4); then the following statements are true.
If E is of full column rank and W is of full row rank, then
(20)𝒜#=[ABCD]×([A†000]+[I-K-K10I][W†00E†]×[-H*II0])3[ABCD].
If E=0, W=0, then
(21)𝒜#=[ABCD]×([Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†]×[X~X~HPS-H*X~I-H*X~HPS])3×[ABCD],
where X~=(I+HPSH*)-1 and Y~=(I+KQSK*)-1.
Proof.
(a) If E is of full column rank, then QE=0, and hence R=0, X1=I, X2=I, X3=I, and X4=0. Thus, V1, V2, V3, V4, V5 defined in Theorem 1 can be simplified to
(22)V1=[I000],V2=[KK1*KK1*0],V3=[I-K1-K1*I+K1*K1],V4=[I0K*I],V5=[W†W0K1*I],
which imply
(23)V4V1=[I0K*0],V2V3=[0KK1*-K1*K],V1V2V3=[0K00],V5V3V2*V1=[00K*0],V4V1V2V3=[0K0K*K],V5V3=[W†W-K10I],V5V3V2*V1V2V3=[000K*K].
So, (19) is reduced to
(24)G†=[II-K-K100I][GA†000GW†000GE†].
When W is of full row rank, one gets PW=0, which implies R=0, X1=I, X2=I, X3=I, and X4=0. Thus,
(25)U1=[I00I],U2=[H0I0],U3=[I00I],U4=[IH0I],U5=[0IEE†0].
Simple computations show that
(26)U1U4=[IH0I],U1U2U5=[0H0I],U2*U1U4=[H*I+H*H00],U3U5=[0IEE†0],U2*U1U2U5=[0I+H*H00].
Now, F† possesses the following form according to (18):
(27)F†=[FA†000FW†000FE†][I0-H*II0].
Since
(28)𝒜†=G†F†=[A†000]+[I-K-K10I]×[W†00E†][-H*II0],
one gets the expression of 𝒜# by using (13):
(29)𝒜#=[ABCD]×([A†000]+[I-K-K10I][W†00E†]×[-H*II0])3[ABCD].
(b) If E=0, then H1=0, X1=I, and X2=I, so that X3=I+HPWPRPWH* and X4=RR†. Letting X=X3-1, we have
(30)U1=[X-XHPWRR†-RR†PWH*XRR†+RR†PWH*XHPWRR†],U2=[H0I0],U3=[I00I],U4=[IH0I],U5=[0WW†00].
By short computations, one gets
(31)U1U4=[XXH(I-PWRR†)-RR†PWH*XRR†-RR†PWH*XH(I-PWRR†)],U1U2U5=[0XH(I-PWRR†)WW†0RR†WW†-RR†PWH*XH(I-PWRR†)WW†],U2*U1U4=[(I-RR†PW)H*XRR†+(I-RR†PW)H*XH(I-PWRR†)00],U3U5=[0WW†00],U2*U1U2U5=[0RR†WW†+(I-RR†PW)H*XH(I-PWRR†)00].
Hence,
(32)F†=[FA†000FR†000FW†]×[XXH(I-PWRR†)PW-PWH*XPW-PWH*XH(I-PWRR†)PW(I-RR†PW)H*XI-RR†PW].
If W=0, then K1=0, Y1=I, and Y2=I, so that Y3=I+KQEQRQEK* and Y4=R†R. Letting Y=Y3-1, we have
(33)V1=[Y-YKQER†R-R†RQEK*YR†R+R†RQEK*YKQER†R],V2=[0K00],V3=[I00I],V4=[I0K*I],V5=[000E†E],
which imply
(34)V4V1=[Y-YKQER†RK*Y-R†RQEK*YR†R-(I-R†RQE)K*YKQER†R],V1V2V3=[0YK0-R†RQEK*YK],V5V3V2*V1=[00E†EK*Y-E†EK*YKQER†R],V4V1V2V3=[0YK0K*YK-R†RQEK*YK],V5V3V2*V1V2V3=[000K*YK-R†RQEK*YK].
So, (19) is reduced to
(35)G†=[Y-YKQE-YKQRQEK*YI-QRQEK*YKI]×[GA†000GR†000GE†].
Then,
(36)𝒜†=G†F†=[Y~-Y~KQSK*Y~I-QSK*Y~K]×[A†00S†][X~X~HPS-H*X~I-H*X~HPS].
Therefore, we have
(37)𝒜#=[ABCD]([Y~-Y~KQSK*Y~I-QSK*Y~K]×[A†00S†][X~X~HPS-H*X~I-H*X~HPS])3×[ABCD].
Theorem 4.
Let 𝒜 be defined by (4); then the following statements are true.
(a) If E=0, W=0, and R(C)⊂R(S), then
(38)𝒜#=[ABCD]×([Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†]×[I0-H*I])3[ABCD],
where Y~=(I+KQSK*)-1.
(b) If E=0, W=0, and R(B*)⊂R(S*), then
(39)𝒜#=[ABCD]([I-K0I][A†00S†]×[X~X~HPS-H*X~I-H*X~HPS])3[ABCD],
where X~=(I+HPSH*)-1.
(c) If E=0, W=0, and R(B)⊂R(A), R(C)⊂R(S), R(B*)⊂R(S*), R(C*)⊂R(A*), then
(40)𝒜#=[A(A†)3A+AA†KS†H*A†A-AA†K(S†)2S-S(S†)2H*A†AS(S†)3S].
Proof.
(a) Since E=0 and W=0, by Theorem 3(b), one gets
(41)𝒜†=[Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†]×[X~X~HPS-H*X~I-H*X~HPS].
Since R(C)⊂R(S), that is, PSC=0, we get X~=I, and the equality previously mentioned simplifies to
(42)𝒜†=[Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†][I0-H*I].
By using 𝒜#=𝒜(𝒜†)3𝒜, we have
(43)𝒜#=[ABCD]×([Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†][I0-H*I])3×[ABCD].
(b) The proof is similar to that of (a).
(c) Since E=0 and W=0, by Theorem 3(b), one gets
(44)𝒜†=[Y~-Y~KQSK*Y~I-QSK*Y~K][A†00S†]×[X~X~HPS-H*X~I-H*X~HPS].
Since R(B)⊂R(A), R(C)⊂R(S), R(B*)⊂R(S*), R(C*)⊂R(A*), that is, PAB=0, CQA=0, PSC=0, BQS=0, then the previous equality is simplified to
(45)𝒜†=[I-K0I][A†00S†][I0-H*I]=[A†+KS†H*-KS†-S†H*S†].
Moreover,
(46)𝒜𝒜†=[ABCD][A†+KS†H*-KS†-S†H*S†]=[AA†+AKS†H*-BS†H*-AKS†+BS†CA†+CKS†H*-DS†H*-CKS†+DS†]=[AA†00SS†],𝒜†𝒜=[A†+KS†H*-KS†-S†H*S†][ABCD]=[A†A+KS†H*A-KS†CK+KS†H*B-KS†D-S†H*A+S†C-S†H*B+S†D]=[A†A00S†S].
Therefore,
(47)𝒜#=𝒜𝒜†𝒜†𝒜†𝒜=[AA†00SS†]×[A†+KS†H*-KS†-S†H*S†][A†A00S†S]=[A(A†)3A+AA†KS†H*A†A-AA†K(S†)2S-S(S†)2H*A†AS(S†)3S].
Theorem 5.
Let 𝒜 be defined by (4); let S=D-CA#B be the Schur complement of A in 𝒜; then the following statements are true.
(a) If A and S are group invertible, PAB=0, CQA=0, and PSC=0, then
(48)𝒜#=[A#+A#BS#CA#A#(I+BS#CA#)A#BPS-A#BS#-S#CA#S#(I-C(A#)2BPS)].
(b) If A and S are group invertible, PAB=0, CQA=0, and BPS=0, then
(49)𝒜#=[A#+A#BS#CA#-A#BS#PSCA#(I+A#BS#C)A#-S#CA#(I-PSC(A#)2B)S#]
(c) Let A and S be group invertible; then PAB=0, CQA=0, PSC=0, and BPS=0 if and only if
(50)𝒜#=[A#+A#BS#CA#-A#BS#-S#CA#S#].
Proof.
(a) If PAB=0 and CQA=0, then E, W, R defined in (8) simplify to E=0, W=0, and R=S, and then there is a full rank factorization
(51)𝒜=[ABCD]=FG=[FA0CA#FAFS][GAGAA#B0GS]
according to (11). Thus,
(52)GF=[GAFA+GAKCA#FAGAKFSGSHFAGSFS],
where H=CA# and K=A#B. Denote by S′ the Schur complement of GSFS in the partitioned matrix GF. Then,
(53)S′=GAFA+GAKHFA-GAKFS(GSFS)-1GSHFA=GAFA+GAKHFA-GAKSS#HFA=GAFA+GAKPSHFA=GAFA+GAKPSCA#FA=GAFA.
Applying the Banachiewicz-Schur formula, we have
(54)(GAFA)-1=[(GAFA)-1-(GAFA)-1GAKFS(GSFS)-1-(GSFS)-1GSHFA(GAFA)-1(GSFS)-1(I+GSHFA(GAFA)-1GAKFS(GSFS)-1)]=[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS].
Simple computations give
(55)F(GF)-1=[FA0HFAFS]×[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]=[A#FA-KS#FS0S#FS],(GF)-1G=[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]×[GAGAK0GS]=[GAA#GAA#KPS-GSS#HGSS#-GSS#HKPS].
Then,
(56)𝒜#=F(GF)-2G=[A#FA-KS#FS0S#FS][GAA#GAA#KPS-GSS#HGSS#-GSS#HKPS]=[A#+A#BS#CA#A#(I+BS#CA#)A#BPS-A#BS#-S#CA#S#(I-C(A#)2BPS)].
(b) Since PAB=0 and CQA=0, similarly to (a), there is a full rank factorization of 𝒜 such that
(57)𝒜=FG=[FA0CA#FAFS][GAGAA#B0GS].
We also have
(58)GF=[GAFA+GAKCA#FAGAKFSGSHFAGSFS].
By using BPS=0, one gets the Schur complement of GSFS in GF:
(59)S′=GAFA+GAA#BPSHFA=GAFA.
Hence,
(60)(GF)-1=[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS].
Short computations show that
(61)F(GF)-1=[FA0HFAFS]×[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]=[A#FA-KS#FSPSHA#FA-PSHKS#FS+S#FS],(GF)-1G=[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]×[GAGAK0GS]=[GAA#0-GSS#HGSS#].
Therefore,
(62)𝒜#=F(GF)-2G=[A#FA-KS#FSPSHA#FA-PSHKS#FS+S#FS][GAA#0-GSS#HGSS#]=[A#+A#BS#CA#-A#BS#PSCA#(I+A#BS#C)A#-S#CA#(I-PSC(A#)2B)S#].
(c) (⇒:) Since PAB=0, CQA=0, PSC=0, and BPS=0, according to the proof of (a) and (b), we have
(63)F(GF)-1=[FA0HFAFS]×[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]=[A#FA-KS#FS0S#FS],(GF)-1G=[(GAFA)-1-GAA#KS#FS-GSS#HA#FA(GSFS)-1+GSS#HKS#FS]×[GAGAK0GS]=[GAA#0-GSS#HGSS#].
Hence,
(64)𝒜#=F(GF)-1(GF)-1G=[A#+A#BS#CA#-A#BS#-S#CA#S#].
(⇐:) By [9, Theorem 2].
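The form in (50) can be sanity-checked numerically in the special case where A and S are invertible: every group inverse then reduces to an ordinary inverse, and the right-hand side of (50) must coincide with 𝒜-1. A sketch with illustrative numeric blocks of our own choosing:

```python
import numpy as np

# Sanity check of the Banachiewicz-Schur form in (50) in the special
# case where A and S are invertible (then X# = X^{-1} for every block).
A = np.array([[2., 1.], [1., 1.]])
B = np.eye(2)
C = np.eye(2)
D = 3.0 * np.eye(2)

Ai = np.linalg.inv(A)
S = D - C @ Ai @ B                     # here S = D - C A^{-1} B
Si = np.linalg.inv(S)

M = np.block([[A, B], [C, D]])
BS = np.block([[Ai + Ai @ B @ Si @ C @ Ai, -Ai @ B @ Si],
               [-Si @ C @ Ai,              Si]])
print(np.allclose(BS, np.linalg.inv(M)))   # True
```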
Analogously to Theorem 5, if we define T=A-BD#C, the Schur complement of D in 𝒜, one can obtain the following results.
Theorem 6.
Let 𝒜 be defined by (4); let T=A-BD#C be the Schur complement of D in 𝒜; then the following statements are true.
(a) If D and T are group invertible, PDC=0, BQD=0, and PTB=0, then
(65)𝒜#=[T#(I-B(D#)2CQT)-T#BD#-D#CT#+D#(I+CT#BD#)D#CQTD#+D#CT#BD#].
(b) If D and T are group invertible, PDC=0, BQD=0, and CQT=0, then
(66)𝒜#=[(I-PTB(D#)2C)T#-T#BD#+PTBD#(I+D#CT#B)D#-D#CT#D#+D#CT#BD#].
(c) Let D and T be group invertible; then PDC=0, BQD=0, CQT=0, and PTB=0 if and only if
(67)𝒜#=[T#-T#BD#-D#CT#D#+D#CT#BD#].
Proof.
The proof is similar to the proof of Theorem 5.
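Analogously, the dual form (67) can be checked in the invertible special case, where it reduces to the classical Banachiewicz-Schur expression for 𝒜-1 in terms of D and T; the numeric blocks below are illustrative:

```python
import numpy as np

# Dual Banachiewicz-Schur check: with D and T = A - B D^{-1} C
# invertible, the matrix in (67), with # replaced by ordinary
# inversion, must equal the inverse of the whole block matrix.
A = np.array([[2., 1.], [1., 1.]])
B = np.eye(2)
C = np.eye(2)
D = 3.0 * np.eye(2)

Di = np.linalg.inv(D)
T = A - B @ Di @ C
Ti = np.linalg.inv(T)

M = np.block([[A, B], [C, D]])
dual = np.block([[Ti,           -Ti @ B @ Di],
                 [-Di @ C @ Ti, Di + Di @ C @ Ti @ B @ Di]])
print(np.allclose(dual, np.linalg.inv(M)))   # True
```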
Combining Theorems 5 and 6, we have the following results.
Theorem 7.
Let 𝒜 be defined by (4); let S=D-CA#B and T=A-BD#C be the Schur complements of A and D in 𝒜, respectively. Then the following statements are true.
(a) If A, S, D, and T are group invertible, PAB=0, CQA=0, PSC=0, PDC=0, BQD=0, and PTB=0, then
(68)𝒜#=[A#+A#BS#CA#A#(I+BS#CA#)A#BPS-A#BS#-S#CA#S#(I-C(A#)2BPS)]𝒜#=[T#(I-B(D#)2CQT)-T#BD#-D#CT#+D#(I+CT#BD#)D#CQTD#+D#CT#BD#].
(b) If A, S, D, and T are group invertible, PAB=0, CQA=0, PSC=0, PDC=0, BQD=0, and CQT=0, then
(69)𝒜#=[A#+A#BS#CA#A#(I+BS#CA#)A#BPS-A#BS#-S#CA#S#(I-C(A#)2BPS)]𝒜#=[(I-PTB(D#)2C)T#-T#BD#+PTBD#(I+D#CT#B)D#-D#CT#D#+D#CT#BD#].
(c) If A, S, D, and T are group invertible, PAB=0, CQA=0, BPS=0, PDC=0, BQD=0, and PTB=0, then
(70)𝒜#=[A#+A#BS#CA#-A#BS#PSCA#(I+A#BS#C)A#-S#CA#(I-PSC(A#)2B)S#]𝒜#=[T#(I-B(D#)2CQT)-T#BD#-D#CT#+D#(I+CT#BD#)D#CQTD#+D#CT#BD#].
(d) If A, S, D, and T are group invertible, PAB=0, CQA=0, BPS=0, PDC=0, BQD=0, and CQT=0, then
(71)𝒜#=[A#+A#BS#CA#-A#BS#PSCA#(I+A#BS#C)A#-S#CA#(I-PSC(A#)2B)S#]𝒜#=[(I-PTB(D#)2C)T#-T#BD#+PTBD#(I+D#CT#B)D#-D#CT#D#+D#CT#BD#].
Theorem 8.
Let 𝒜 be defined by (4); let S=D-CA#B and T=A-BD#C be the Schur complements of A and D in 𝒜, respectively. Then
(72)𝒜#=[A#+A#BS#CA#-A#BS#-S#CA#S#]=[T#-T#BD#-D#CT#D#+D#CT#BD#]
if and only if one of the following conditions holds
(73)(a)PAB=0,PDC=0,PSC=0,CQA=0,BQD=0,BQS=0,(74)(b)PAB=0,PDC=0,PTB=0,CQA=0,BQD=0,CQT=0.
Proof.
(a) Using Theorems 5(c) and 6(c), we conclude that
(75)PAB=0,CQA=0,PSC=0,BQS=0,PDC=0,BQD=0,CQT=0,PTB=0,
if and only if
(76)𝒜#=[A#+A#BS#CA#-A#BS#-S#CA#S#]=[T#-T#BD#-D#CT#D#+D#CT#BD#].
Now, we only need to prove that (73) is equivalent to (75). Denote T′=A#+A#BS#CA#. Then
(77)TT′=(A-BD#C)(A#+A#BS#CA#)=AA#+AA#BS#CA#-BD#CA#-BD#CA#BS#CA#=AA#+BS#CA#-BD#CA#-BD#(D-S)S#CA#=AA#,T′T=(A#+A#BS#CA#)(A-BD#C)=A#A-A#BD#C-A#BS#CA#A-A#BS#CA#BD#C=A#A-A#BD#C-A#BS#C-A#BS#(D-S)D#C=A#A.
Moreover, we have
(78)TT′T=AA#(A-BD#C)=A-BD#C=T,T′TT′=A#A(A#+A#BS#CA#)=A#+A#BS#CA#=T′.
Thus, T′=T#. Hence, T#T=A#A and TT#=AA#. Now, we get PAB=PTB=0 and CQA=CQT=0, which means that (73) implies (75). Obviously, (75) implies (73). So (73) is equivalent to (75).
(b) The proof is similar to (a).
3. Applications to the Solution of a Linear System
In this section, we give an application of the previous results to the solution of a linear system. Using the generalized Schur complement, we can split a large system into two smaller linear systems by the following steps.
Let
(79)𝒜x=y
be a linear system. Applying the block Gaussian elimination to the system, we have
(80)[AB0D-CA†B][x1x2]=[y1y2-CA†y1].
Hence, we get
(81)Ax1+Bx2=y1;Sx2=y2-CA†y1.
That is,
(82)Ax1=y1-Bx2,Sx2=y2-CA†y1.
Now, the solution of system (79) can be obtained from the two smaller linear systems previously mentioned. In that case, the computation can be significantly simplified. Note also that the Moore-Penrose inverse of A can be replaced by other generalized inverses, such as the group inverse, the Drazin inverse, or even the ordinary inverse A-1.
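In the special case where A is invertible, the two-stage procedure in (82) can be carried out with ordinary solves; the following sketch (with illustrative blocks and right-hand side of our own choosing) confirms that it reproduces a solution of the full system:

```python
import numpy as np

# Splitting M x = y into two smaller solves via the Schur complement,
# in the special case where A is invertible.
A = np.array([[2., 1.], [1., 1.]])
B = np.eye(2)
C = np.eye(2)
D = 3.0 * np.eye(2)
y1 = np.array([1., 0.])
y2 = np.array([0., 1.])

S = D - C @ np.linalg.solve(A, B)                        # Schur complement of A
x2 = np.linalg.solve(S, y2 - C @ np.linalg.solve(A, y1)) # S x2 = y2 - C A^{-1} y1
x1 = np.linalg.solve(A, y1 - B @ x2)                     # A x1 = y1 - B x2

M = np.block([[A, B], [C, D]])
x = np.concatenate([x1, x2])
print(np.allclose(M @ x, np.concatenate([y1, y2])))      # True
```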
In the following, we will give the group inverse solutions of the linear system.
Theorem 9.
Let
(83)𝒜x=y
be a linear system. Suppose 𝒜 satisfies all the conditions of Theorem 5(c), and partition x and y as
(84)x=[x1x2],y=[y1y2]
conformally with 𝒜. If y∈R(𝒜), then the solution x=𝒜#y of the linear system (83) can be expressed as
(85)x1=A#(y1-Bx2),x2=S#(y2-CA#y1),
where S=D-CA#B.
Proof.
Since y∈R(𝒜), we conclude that x=𝒜#y is the solution of linear system (79). By Theorem 5 (c), we can get the following:
(86)x=𝒜#y=[A#+A#BS#CA#-A#BS#-S#CA#S#][y1y2]=[A#y1+A#BS#CA#y1-A#BS#y2S#(y2-CA#y1)]=[x1x2].
Now, it is easy to see that the solution x=𝒜#y can be expressed as
(87)x1=A#(y1-Bx2),x2=S#(y2-CA#y1),
which are also the group inverse solutions of the two small linear systems of (82), respectively.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (11061005) and the Guangxi Key Laboratory of Hybrid Computational and IC Design Analysis Open Fund (HCIC201103).
References
[1] F. J. Hall, "Generalized inverses of a bordered matrix of operators," 1975, vol. 29, pp. 152–163. doi:10.1137/0129015
[2] F. J. Hall, "The Moore-Penrose inverse of particular bordered matrices," 1979, vol. 27, no. 4, pp. 467–478. doi:10.1017/S144678870001346X
[3] F. J. Hall and R. E. Hartwig, "Further results on generalized inverses of partitioned matrices," 1976, vol. 30, no. 4, pp. 617–624. doi:10.1137/0130056
[4] S. K. Mitra, "Properties of the fundamental bordered matrix used in linear estimation," in G. Kallianpur (ed.), North Holland, New York, NY, USA, 1982, pp. 504–509.
[5] J. K. Baksalary and G. P. H. Styan, "Generalized inverses of partitioned matrices in Banachiewicz-Schur form," 2002, vol. 354, pp. 41–47. doi:10.1016/S0024-3795(02)00334-8
[6] F. Burns, D. Carlson, E. Haynsworth, and T. Markham, "Generalized inverse formulas using the Schur complement," 1974, vol. 26, pp. 254–259. doi:10.1137/0126022
[7] D. S. Cvetković-Ilić, "A note on the representation for the Drazin inverse of 2×2 block matrices," 2008, vol. 429, no. 1, pp. 242–248. doi:10.1016/j.laa.2008.02.019
[8] G. Marsaglia and G. P. H. Styan, "Rank conditions for generalized inverses of partitioned matrices," 1974, vol. 36, no. 4, pp. 437–442.
[9] J. Benítez and N. Thome, "The generalized Schur complement in group inverses and (k+1)-potent matrices," 2006, vol. 54, no. 6, pp. 405–413. doi:10.1080/03081080500348709
[10] D. S. Cvetković-Ilić, J. Chen, and Z. Xu, "Explicit representations of the Drazin inverse of block matrix and modified matrix," 2009, vol. 57, no. 4, pp. 355–364. doi:10.1080/03081080701772830
[11] J. M. Miao, "General expressions for the Moore-Penrose inverse of a 2×2 block matrix," 1991, vol. 151, pp. 1–15. doi:10.1016/0024-3795(91)90351-V
[12] J. Chen, Z. Xu, and Y. Wei, "Representations for the Drazin inverse of the sum P+Q+R+S and its applications," 2009, vol. 430, pp. 438–454. doi:10.1016/j.laa.2008.08.007
[13] S. L. Campbell, C. D. Meyer Jr., and N. J. Rose, "Applications of the Drazin inverse to linear systems of differential equations with singular constant coefficients," 1976, vol. 31, no. 3, pp. 411–425. doi:10.1137/0131035
[14] R. Hartwig, X. Li, and Y. Wei, "Representations for the Drazin inverse of a 2×2 block matrix," 2005, vol. 27, no. 3, pp. 757–771. doi:10.1137/040606685
[15] A. da Silva Soares and G. Latouche, "The group inverse of finite homogeneous QBD processes," 2002, vol. 18, no. 1, pp. 159–171. doi:10.1081/STM-120002779
[16] Y. Wei and H. Diao, "On group inverse of singular Toeplitz matrices," 2005, vol. 399, pp. 109–123. doi:10.1016/j.laa.2004.08.021
[17] Y. Wei, "On the perturbation of the group inverse and oblique projection," 1999, vol. 98, no. 1, pp. 29–42. doi:10.1016/S0096-3003(97)10151-5
[18] J. Benítez, X. Liu, and T. Zhu, "Additive results for the group inverse in an algebra with applications to block operators," 2011, vol. 59, no. 3, pp. 279–289. doi:10.1080/03081080903410262
[19] C. Bu, M. Li, K. Zhang, and L. Zheng, "Group inverse for the block matrices with an invertible subblock," 2009, vol. 215, no. 1, pp. 132–139. doi:10.1016/j.amc.2009.04.054
[20] C. Bu, K. Zhang, and J. Zhao, "Some results on the group inverse of the block matrix with a sub-block of linear combination or product combination of matrices over skew fields," 2010, vol. 58, no. 7-8, pp. 957–966. doi:10.1080/03081080903092243
[21] C. Bu, J. Zhao, and K. Zhang, "Some results on group inverses of block matrices over skew fields," 2009, vol. 18, pp. 117–125.
[22] C. Bu, J. Zhao, and J. Zheng, "Group inverse for a class of 2×2 block matrices over skew fields," 2008, vol. 204, no. 1, pp. 45–49. doi:10.1016/j.amc.2008.05.145
[23] C. G. Cao, "Some results of group inverses for partitioned matrices over skew fields," 2001, vol. 18, no. 3, pp. 5–7.
[24] X. Chen and R. E. Hartwig, "The group inverse of a triangular matrix," 1996, vol. 237/238, pp. 97–108. doi:10.1016/0024-3795(95)00561-7
[25] C. Cao and J. Li, "Group inverses for matrices over a Bezout domain," 2009, vol. 18, pp. 600–612.
[26] C. Cao and J. Li, "A note on the group inverse of some 2×2 block matrices over skew fields," 2011, vol. 217, no. 24, pp. 10271–10277. doi:10.1016/j.amc.2011.05.027
[27] C. Cao and X. Tang, "Representations of the group inverse of some 2×2 block matrices," 2006, vol. 31, pp. 1511–1517.
[28] M. Catral, D. D. Olesky, and P. van den Driessche, "Graphical description of group inverses of certain bipartite matrices," 2010, vol. 432, no. 1, pp. 36–52. doi:10.1016/j.laa.2009.07.025
[29] P. Patrício and R. E. Hartwig, "The (2,2,0) group inverse problem," 2010, vol. 217, no. 2, pp. 516–520. doi:10.1016/j.amc.2010.05.084
[30] J. Zhou, C. Bu, and Y. Wei, "Group inverse for block matrices and some related sign analysis," 2012, vol. 60, no. 6, pp. 669–681. doi:10.1080/03081087.2011.625498
[31] Z. Yan, "New representations of the Moore-Penrose inverse of 2×2 block matrices," 2013. doi:10.1016/j.laa.2012.08.014