New Preconditioners for Nonsymmetric Saddle Point Systems with Singular (1, 1) Block

We investigate the solution of large linear systems of saddle point type with a singular (1, 1) block by preconditioned iterative methods. We consider two parameterized block triangular preconditioners for use with Krylov subspace methods; both have the attractive property that the eigenvalue clustering of the preconditioned matrix improves as the (1, 1) block of the saddle point matrix becomes more ill-conditioned, and both involve the choice of a parameter. We analyze the spectral characteristics of the two preconditioners and identify parameter values that work well in practice. Numerical experiments that validate the analysis are presented.


Introduction
We study preconditioners for general nonsingular linear systems of the type

𝒜x = b,   𝒜 = [A, Bᵀ; −B, 0],   A ∈ ℝⁿˣⁿ, B ∈ ℝᵐˣⁿ, n ≥ m.   (1)

Such systems arise in a large number of applications, for example, the (linearized) Navier-Stokes equations and other physical problems with conservation laws, as well as constrained optimization problems [1][2][3][4][5].
We start with augmentation block triangular preconditioners for the general system (1); see Section 2 for our assumptions. When A is nonsingular, results for the general system have been obtained before; for example, Murphy et al. [23, 24] propose the block diagonal Schur complement preconditioner and the block triangular Schur complement preconditioner

𝒫_d = [A, 0; 0, B A⁻¹ Bᵀ],   𝒫_T = [A, Bᵀ; 0, B A⁻¹ Bᵀ].

If these are defined, it has been shown (cf. [24]) that the preconditioned matrices are diagonalizable and have only three distinct eigenvalues 1, (1 ± √5)/2 and two distinct eigenvalues 1, −1, respectively. However, when A is singular, it cannot be inverted and the Schur complement B A⁻¹ Bᵀ does not exist. For symmetric saddle point systems, that is, systems with (2, 1) block B instead of −B and symmetric A, one possible way of dealing with the singularity is augmentation, that is, replacing A with A + Bᵀ W⁻¹ B, where W is an m × m symmetric positive definite weight matrix [4, 7, 9, 14, 17, 18, 25-27]. Recently, for saddle point systems with a (1, 1) block of high nullity, Greif and Schötzau [25, 26] studied the application of the block diagonal preconditioner

ℳ = [A + Bᵀ W⁻¹ B, 0; 0, W],

used with the MINRES solver in the symmetric case. They have shown that if the nullity of A is m, which is the highest possible nullity, then the preconditioned matrix ℳ⁻¹𝒜 has only two distinct eigenvalues, 1 and −1. Thus, a preconditioned minimal residual Krylov subspace iterative method such as MINRES converges within two iterations.
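The two-eigenvalue result for the block triangular Schur complement preconditioner can be checked on a small synthetic example. This is only an illustrative sketch with random matrices (the dimensions n = 6, m = 3 are arbitrary, and the symmetric form of the saddle point system is used, with A nonsingular):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 3
# synthetic test data: symmetric positive definite (1, 1) block, full-row-rank B
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)
B = rng.standard_normal((m, n))
S = B @ np.linalg.inv(A) @ B.T                        # Schur complement B A^{-1} B^T

saddle = np.block([[A, B.T], [B, np.zeros((m, m))]])  # symmetric saddle point matrix
P_T = np.block([[A, B.T], [np.zeros((m, n)), S]])     # block triangular preconditioner
X = np.linalg.solve(P_T, saddle)                      # preconditioned matrix

eigs = np.linalg.eigvals(X)
num_minus = int(np.sum(np.abs(eigs + 1) < 1e-6))
# X satisfies X^2 = I, so its only eigenvalues are 1 and -1
ok = np.allclose(X @ X, np.eye(n + m))
print(ok, num_minus)
```

The identity X² = I shows that the minimal polynomial of the preconditioned matrix has degree two, which is exactly why a Krylov method converges in two steps here.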
In this paper, we propose two new block triangular preconditioners,

𝒫_t = [A + Bᵀ W⁻¹ B, t Bᵀ; 0, W],   𝒫_α = [A + Bᵀ W⁻¹ B, α Bᵀ; 0, (α − 1) W],

where t and α are scalar parameters and W is an m × m symmetric positive definite weight matrix. The remainder of this paper is organized as follows. In Section 2, we discuss the two block triangular preconditioners for solving nonsymmetric saddle point systems and study their algebraic properties. In Section 3, numerical experiments are provided to validate the analysis of Section 2. In Section 4, we draw some conclusions.

Block Triangular Preconditioners
We will adopt the general notation

𝒜 = [A, Bᵀ; −B, 0]

to represent the nonsymmetric saddle point matrix of (1). We assume that A is symmetric and positive semidefinite with nullity r and that B is of size m × n and has full row rank. Note that the assumption that 𝒜 is nonsingular implies that rank(B) = rank(Bᵀ) = m, which we use in our analysis below.
Lemma 1 (see [2, 27]). The nonsymmetric saddle point matrix 𝒜 is nonsingular if and only if B has full row rank and N(A) ∩ N(B) = {0}, where N(A) and N(B) denote the null spaces of A and B, respectively. Obviously, the nonsingularity of 𝒜 couples the null spaces of A and B.

Lemma 2 (see [2, 27]). If the saddle point-type matrix 𝒜 is nonsingular, then the rank of A is at least n − m, and hence its nullity is at most m.
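Both lemmas can be checked numerically on a small synthetic example (a sketch with hypothetical dimensions n = 6, m = 3 and nullity r = 2, not part of the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, r = 6, 3, 2
# A symmetric positive semidefinite with nullity r; B with full row rank
C = rng.standard_normal((n - r, n))
A = C.T @ C                                   # rank n - r, nullity r
B = rng.standard_normal((m, n))

saddle = np.block([[A, B.T], [-B, np.zeros((m, m))]])

# Lemma 1: nonsingularity of the saddle matrix goes with N(A) ∩ N(B) = {0},
# i.e. the stacked matrix [A; B] having full column rank n
full_intersection = np.linalg.matrix_rank(np.vstack([A, B])) == n
nonsingular = np.linalg.matrix_rank(saddle) == n + m

# Lemma 2: nonsingularity forces rank(A) >= n - m, i.e. nullity(A) <= m
rank_A = np.linalg.matrix_rank(A)
print(full_intersection, nonsingular, rank_A >= n - m)
```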

Block Triangular Preconditioner 𝒫_t. We first consider the preconditioner

𝒫_t = [A + Bᵀ W⁻¹ B, t Bᵀ; 0, W],

where t is a scalar parameter and W is an m × m symmetric positive definite weight matrix.
It is easy to see that the eigenvalues of the preconditioned matrix 𝒫_t⁻¹𝒜 satisfy the generalized eigenvalue problem

A u + Bᵀ v = λ(A + Bᵀ W⁻¹ B) u + λ t Bᵀ v,
−B u = λ W v.

The second block row gives v = −(1/λ) W⁻¹ B u, and substituting it into the first block equation gives

λ(1 − λ) A u = (λ² − tλ + 1) Bᵀ W⁻¹ B u.

Regardless of the choice of t, we see that λ = 1 is an eigenvalue with algebraic multiplicity n − m, with eigenvectors (uᵀ, 0)ᵀ for u ∈ N(B). From the nullity of A it follows that there are r linearly independent null vectors of A. For each such null vector, we can find two λ values satisfying λ² − tλ + 1 = 0. Thus, we have

λ± = (t ± √(t² − 4))/2,

each with algebraic multiplicity r. The remaining 2(m − r) eigenvalues satisfy the generalized eigenvalue problem

Bᵀ W⁻¹ B u = μ A u,   where μ = −(λ² − λ)/(λ² − tλ + 1).

Since A + Bᵀ W⁻¹ B is assumed to be nonsingular, the matrix pencil (Bᵀ W⁻¹ B, A) is regular (cf. [27]). Solving the relation above for λ gives

λ = ((tμ + 1) ± √((tμ + 1)² − 4μ(μ + 1)))/(2(μ + 1)).

This expression gives an explicit formula for the eigenvalues in terms of the generalized eigenvalues μ and can be used to identify the intervals in which the eigenvalues lie.
To illustrate this, we consider the case t = 1, which corresponds to setting the (1, 2) block of the preconditioner to Bᵀ. The preconditioned matrix 𝒫₁⁻¹𝒜 has λ = 1 with multiplicity n − m, and (1 ± i√3)/2, each with multiplicity r. By the explicit formula above we have

λ = (1 ± √(1 − 4μ/(μ + 1)))/2.

It is worth noting that since A is typically highly singular, many of the generalized eigenvalues μ are large, in which case the corresponding eigenvalues approach λ± = (1 ± i√3)/2 and are bounded away from zero. Since the 2r eigenvalues λ± = (t ± √(t² − 4))/2 are unbounded as t goes to ∞, we conclude that t should be of moderate size.

Block Triangular Preconditioner 𝒫_α
We next consider the preconditioner

𝒫_α = [A + Bᵀ W⁻¹ B, α Bᵀ; 0, (α − 1) W],

where α ≠ 1 is a scalar, W is an m × m symmetric positive definite weight matrix, and A + Bᵀ W⁻¹ B is nonsingular.
The following theorem describes the spectrum of the preconditioned matrix 𝒫_α⁻¹𝒜.

Theorem 3.
Let 𝒜 be nonsingular and let its (1, 1) block A be singular with nullity r. Then λ = 1 is an eigenvalue of 𝒫_α⁻¹𝒜 of geometric multiplicity n, and λ = 1/(α − 1) is an eigenvalue of geometric multiplicity r. The remaining m − r eigenvalues satisfy the relation

λ = μ/((α − 1)(1 + μ)),

where μ are the corresponding m − r finite nonzero generalized eigenvalues of the following generalized eigenvalue problem:

Bᵀ W⁻¹ B u = μ A u.

Proof. Suppose that λ is an eigenvalue of 𝒫_α⁻¹𝒜 with eigenvector (uᵀ, vᵀ)ᵀ. Then it satisfies the generalized eigenvalue problem 𝒜(uᵀ, vᵀ)ᵀ = λ𝒫_α(uᵀ, vᵀ)ᵀ, or

A u + Bᵀ v = λ(A + Bᵀ W⁻¹ B) u + λ α Bᵀ v,
−B u = λ(α − 1) W v.

As 𝒫_α is nonsingular, λ ≠ 0. The second equality gives v = (1/(λ(1 − α))) W⁻¹ B u, and substituting it into the first equality gives

λ(1 − λ)(α − 1) A u = (λ²(α − 1) − αλ + 1) Bᵀ W⁻¹ B u.

Both sides vanish identically when λ = 1, so any vector u ∈ ℝⁿ yields an eigenpair with λ = 1, and thus λ = 1 is an eigenvalue of 𝒫_α⁻¹𝒜. We claim that the eigenvalue λ = 1 has geometric multiplicity n.
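Theorem 3 can be illustrated numerically for the reconstructed preconditioner 𝒫_α = [A + BᵀW⁻¹B, αBᵀ; 0, (α − 1)W]. A sketch with synthetic matrices, W = I, and the arbitrary choices n = 8, m = 4, r = 2, α = 3 (so 1/(α − 1) = 1/2):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, r, alpha = 8, 4, 2, 3.0
C = rng.standard_normal((n - r, n))
A = C.T @ C                                    # symmetric PSD with nullity r
B = rng.standard_normal((m, n))

saddle = np.block([[A, B.T], [-B, np.zeros((m, m))]])
P_a = np.block([[A + B.T @ B, alpha * B.T],    # W = I
                [np.zeros((m, n)), (alpha - 1) * np.eye(m)]])

eigs = np.linalg.eigvals(np.linalg.solve(P_a, saddle))
count_one = int(np.sum(np.abs(eigs - 1) < 1e-6))                  # expect n
count_half = int(np.sum(np.abs(eigs - 1 / (alpha - 1)) < 1e-6))   # expect r
print(count_one, count_half)
```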
If α = 0, then we have the following corollary.

Corollary 4.
Let 𝒜 be nonsingular and let its (1, 1) block A be singular with nullity r. Then λ = 1 is an eigenvalue of 𝒫₀⁻¹𝒜 of geometric multiplicity n, and λ = −1 is an eigenvalue of geometric multiplicity r. The remaining m − r eigenvalues satisfy the relation

λ = −μ/(1 + μ),

where μ are the corresponding m − r finite nonzero generalized eigenvalues of the following generalized eigenvalue problem:

Bᵀ W⁻¹ B u = μ A u.

From Theorem 3 and Corollary 4 we know that the higher the nullity of A is, the more strongly clustered the eigenvalues of 𝒫₀⁻¹𝒜 are. When the nullity of A is m, its largest possible value (cf. Lemma 2), we have the following result.

Corollary 5.
Let 𝒜 be nonsingular and let its (1, 1) block A be singular with nullity m. Then the preconditioned matrix 𝒫₀⁻¹𝒜 is diagonalizable and has precisely two eigenvalues: λ = 1 of geometric multiplicity n and λ = −1 of geometric multiplicity m.

ISRN Computational Mathematics
If α = 2, we have the following corollary.

Corollary 6. Let 𝒜 be nonsingular and let its (1, 1) block A be singular with nullity r. Then λ = 1 is an eigenvalue of 𝒫₂⁻¹𝒜 of algebraic multiplicity n + r. The remaining m − r eigenvalues satisfy the relation

λ = μ/(1 + μ),

where μ are the corresponding m − r finite nonzero generalized eigenvalues of the following generalized eigenvalue problem:

Bᵀ W⁻¹ B u = μ A u.

From Theorem 3 and Corollary 6, if the nullity of A is m, we have the following corollary.
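Corollary 6 can be checked in the same synthetic setting (W = I, n = 8, m = 4, r = 2, and the reconstructed 𝒫₂ = [A + BᵀB, 2Bᵀ; 0, I]); a looser tolerance is used because the eigenvalue 1 need not be semisimple when α = 2:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, r = 8, 4, 2
C = rng.standard_normal((n - r, n))
A = C.T @ C                                    # symmetric PSD with nullity r
B = rng.standard_normal((m, n))

saddle = np.block([[A, B.T], [-B, np.zeros((m, m))]])
P_2 = np.block([[A + B.T @ B, 2 * B.T],        # alpha = 2, W = I
                [np.zeros((m, n)), np.eye(m)]])

eigs = np.linalg.eigvals(np.linalg.solve(P_2, saddle))
count_one = int(np.sum(np.abs(eigs - 1) < 1e-4))
print(count_one)                               # expect n + r
```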

Corollary 7.
Let 𝒜 be nonsingular and let its (1, 1) block A be singular with nullity m. Then the preconditioned matrix 𝒫₂⁻¹𝒜 has precisely one eigenvalue, λ = 1, of algebraic multiplicity n + m.

Remark 8. From Corollaries 5 and 7, if the nullity of A is m, then it is readily seen that the preconditioned matrix 𝒫₀⁻¹𝒜 has precisely two distinct eigenvalues and that the preconditioned matrix 𝒫₂⁻¹𝒜 has precisely one eigenvalue. In both cases the minimal polynomial of the preconditioned matrix has degree at most two, so any preconditioned Krylov subspace method such as GMRES terminates in at most two steps if roundoff errors are ignored.
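Remark 8 rests on the minimal polynomial having degree two. For α = 0 and nullity m, the preconditioned matrix X = 𝒫₀⁻¹𝒜 is diagonalizable with eigenvalues ±1, so (X − I)(X + I) = 0. A numerical sketch with synthetic matrices, W = I, and the reconstructed 𝒫₀ = [A + BᵀB, 0; 0, −I]:

```python
import numpy as np

rng = np.random.default_rng(6)
n, m = 6, 3
C = rng.standard_normal((n - m, n))
A = C.T @ C                                    # nullity m, the maximum (Lemma 2)
B = rng.standard_normal((m, n))

saddle = np.block([[A, B.T], [-B, np.zeros((m, m))]])
P_0 = np.block([[A + B.T @ B, np.zeros((n, m))],
                [np.zeros((m, n)), -np.eye(m)]])   # alpha = 0, W = I

X = np.linalg.solve(P_0, saddle)
I = np.eye(n + m)
residual = np.linalg.norm((X - I) @ (X + I))   # should vanish: degree-two minimal polynomial
print(residual)
```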

Numerical Experiments
In this section we present numerical experiments to illustrate the performance of the two preconditioners when they are implemented exactly or approximately.
From these figures we can clearly see that the higher the nullity of the (1, 1) block is, the more strongly the eigenvalues of the preconditioned matrices are clustered. Furthermore, Figures 1 and 2 show that λ = 1 has very high multiplicity when the nullity of the (1, 1) block A is m.
Next, we study the iteration counts and iteration times for Ã3 and Ã4. Table 1 shows results for GMRES applied with the block triangular preconditioner 𝒫_t. From Table 1, we can see that preconditioned GMRES with 𝒫_t⁻¹𝒜 is most efficient when t = 2, and that the iteration counts change only slightly with the parameter t when the nullity of A is m (see Tables 1 and 2).
We next use preconditioned GMRES to solve four saddle point-type test systems, where the right-hand side b is taken such that the exact solution x is equal to (1, 1, . . . , 1)ᵀ. The stopping criterion is ‖r⁽ᵏ⁾‖/‖b‖ ≤ 10⁻⁶, where r⁽ᵏ⁾ is the residual vector after the kth iteration. In this section we set t = 2 and give comparison results of the iteration counts and iteration times for five preconditioners (see Tables 2 and 3). From Table 2, when the nullity of A is 3m/4, we see that preconditioned GMRES with the preconditioner 𝒫_t is more efficient than with the preconditioner T; but from Table 3, when the nullity of A is m, preconditioned GMRES with 𝒫_t behaves essentially like T in both iteration counts and iteration times.
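The experimental setup above can be sketched with a minimal textbook GMRES. The matrices below are our own synthetic stand-ins (not the paper's test problems), with nullity m and the reconstructed preconditioner 𝒫₀ = [A + BᵀB, 0; 0, −I], so convergence in two steps is expected:

```python
import numpy as np

def gmres(matvec, b, tol=1e-6, maxit=50):
    """Minimal unrestarted GMRES with x0 = 0; returns (x, iteration count)."""
    beta = np.linalg.norm(b)
    Q = np.zeros((b.size, maxit + 1))
    H = np.zeros((maxit + 1, maxit))
    Q[:, 0] = b / beta
    for k in range(maxit):
        w = matvec(Q[:, k])
        for j in range(k + 1):                 # Arnoldi, modified Gram-Schmidt
            H[j, k] = Q[:, j] @ w
            w -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)[0]
        if np.linalg.norm(e1 - H[:k + 2, :k + 1] @ y) <= tol * beta:
            return Q[:, :k + 1] @ y, k + 1
        if H[k + 1, k] > 0:
            Q[:, k + 1] = w / H[k + 1, k]
    return Q[:, :maxit] @ y, maxit

rng = np.random.default_rng(7)
n, m = 60, 20
C = rng.standard_normal((n - m, n))
A = C.T @ C                                    # nullity m
B = rng.standard_normal((m, n))
saddle = np.block([[A, B.T], [-B, np.zeros((m, m))]])
P_0 = np.block([[A + B.T @ B, np.zeros((n, m))],
                [np.zeros((m, n)), -np.eye(m)]])

x_exact = np.ones(n + m)                       # right-hand side built from the exact solution
b = np.linalg.solve(P_0, saddle @ x_exact)     # left-preconditioned system
x, its = gmres(lambda z: np.linalg.solve(P_0, saddle @ z), b)
err = np.linalg.norm(x - x_exact) / np.linalg.norm(x_exact)
print(its, err)
```

With only two distinct eigenvalues, the preconditioned residual drops to machine precision at the second Arnoldi step, matching Remark 8.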
All the numerical experiments were performed with MATLAB 7.0. The machine we used is a PC-AMD with a T7400 2.2 GHz CPU.

Conclusions
In this paper, we have analyzed the spectral properties as well as the computational performance of two types of block triangular augmentation preconditioners for saddle point-type matrices with a highly singular (1, 1) block. A complete theoretical analysis shows that all eigenvalues of the preconditioned matrices are strongly clustered. A good parameter choice may substantially reduce the iteration counts and iteration times. In particular, we have shown that in cases where the (1, 1) block has the highest possible nullity, convergence of each of the two preconditioned GMRES iterations is guaranteed to be almost immediate. Numerical experiments illustrating the efficiency of the presented preconditioners are also reported. From the viewpoints of both theory and applications, the presented preconditioners improve on previous results.

Table 1 :
Iteration times and iteration counts of GMRES with the block triangular preconditioner 𝒫_t for different t, with n + m = 1200.

Table 2 :
Iteration times and iteration counts of GMRES for the preconditioned matrices T⁻¹𝒜 and 𝒫_t⁻¹𝒜 with t = 2 and nullity 3m/4.

Table 3 :
Iteration times and iteration counts of GMRES for the preconditioned matrices T⁻¹𝒜 and 𝒫_t⁻¹𝒜 with t = 2 and nullity m.