
## Computational & Applied Mathematics

*On-line version* ISSN 1807-0302

### Comput. Appl. Math. vol.31 no.2 São Carlos 2012

#### http://dx.doi.org/10.1590/S1807-03022012000200007

**An alternating LHSS preconditioner for saddle point problems**

**Liu Qingbing^{I,II*}**

^{I}Department of Mathematics, Zhejiang Wanli University, Ningbo, Zhejiang, 315100, P.R. China

^{II}Department of Mathematics, East China Normal University, Shanghai, 200241, P.R. China. E-mail: lqb_2008@hotmail.com

**ABSTRACT**

In this paper, we present a new alternating local Hermitian and skew-Hermitian splitting (LHSS) preconditioner for solving saddle point problems. The spectral properties of the preconditioned matrices are studied in detail. Theoretical results show that all eigenvalues of the preconditioned matrices gather into two tight clusters, one near (0, 0) and the other near (2, 0), as the iteration parameter tends to zero from above. Numerical experiments are given to validate the performance of the preconditioner.

**Mathematical subject classification:** Primary: 65F10; Secondary: 65F50.

**Key words:** saddle point problems, matrix splitting, preconditioner, eigenvalue distribution.

**1 Introduction**

We study a new preconditioner for saddle point systems of the type

$$\mathcal{A}u \equiv \begin{pmatrix} A & B^{T} \\ -B & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix}=\begin{pmatrix} f \\ g \end{pmatrix} \equiv b, \qquad (1)$$

where *A* ∈ *R*^{n×n} is a positive real matrix, that is, the matrix *H* = (*A* + *A*^{T})/2, the symmetric part of *A*, is positive definite, and *B* ∈ *R*^{m×n} with *m* < *n* has full row rank. Such linear systems arise in a large number of scientific and engineering applications (see for instance [6, 13, 17]). As such systems are typically large and sparse, many iterative solution methods can be found in the literature, such as Uzawa-type schemes [6], splitting methods [2, 3, 15], iterative projection methods [17], and iterative null space methods [6, 17]. To improve the convergence rate of iterative methods, preconditioning techniques have been studied and many effective preconditioners have been employed for solving linear systems of the form (1) [2-8, 11-14, 16, 18].
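As a concrete illustration (not taken from the paper), the structural assumptions on (1) — a positive real (1,1) block and a full-row-rank *B* — can be checked numerically on a small random instance; the sizes and the diagonal shift below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 3

# Positive real A: a random nonsymmetric matrix shifted so that its
# symmetric part H = (A + A^T)/2 is positive definite.
A = rng.standard_normal((n, n)) + n * np.eye(n)
H = (A + A.T) / 2

# B with full row rank (m < n).
B = rng.standard_normal((m, n))

# Assemble the saddle point matrix [[A, B^T], [-B, 0]].
K = np.block([[A, B.T], [-B, np.zeros((m, m))]])

# Verify the assumptions: H positive definite, B of full row rank.
assert np.all(np.linalg.eigvalsh(H) > 0)
assert np.linalg.matrix_rank(B) == m
print(K.shape)  # (11, 11)
```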

Recently, based on the Hermitian and skew-Hermitian splitting (HSS) of the saddle point matrix, a general alternating preconditioner for generalized saddle point problems was analyzed in [5]. Bai et al. [1] further generalized HSS to the positive-definite and skew-Hermitian splitting (PSS) and the normal and skew-Hermitian splitting (NSS), and considered preconditioners based on these splittings. Pan et al. [14] proposed two preconditioners for the saddle point problem (1), using the HS splitting and PS splitting of the (1,1) block *A*, rather than using the coefficient matrix itself as a preconditioner for Krylov subspace methods. Peng and Li [15] considered a kind of alternating-direction iterative method based on a block triangular splitting of the coefficient matrix, and its preconditioned version was established in [16]. In [16], the alternating preconditioner was further studied as a preconditioner for Krylov subspace methods applied to saddle point problems.

In this paper, we propose a new alternating local Hermitian and skew-Hermitian splitting preconditioner for the saddle point problem (1) based on the HS splitting of the (1,1) block *A*. We mainly focus on the case that *A* is a positive real matrix, i.e., its symmetric part is positive definite. In Section 2 we first establish a new alternating-direction iterative method for the saddle point problem (1), then derive the new alternating local Hermitian and skew-Hermitian splitting preconditioner, and discuss the spectral properties of the preconditioned matrix in detail. Numerical experiments are presented in Section 3. In the final section, we draw some conclusions.

**2 The new preconditioner and its spectral properties**

From now on, we will adopt the general notation

$$\mathcal{A}=\begin{pmatrix} A & B^{T} \\ -B & 0 \end{pmatrix}$$

to represent the nonsymmetric saddle point matrix of equation (1). We assume that *A* is positive real, and that *B* is of size *m* × *n* and has full row rank.

Let 𝒜 = ℋ + 𝒮, where

$$\mathcal{H}=\begin{pmatrix} H & 0 \\ 0 & 0 \end{pmatrix},\qquad \mathcal{S}=\begin{pmatrix} S & B^{T} \\ -B & 0 \end{pmatrix},$$

with

$$H=\frac{1}{2}\left(A+A^{T}\right),\qquad S=\frac{1}{2}\left(A-A^{T}\right).$$

*2.1 The preconditioner*

Analogously to the classical ADI method [19], we consider the following two splittings of 𝒜:

$$\mathcal{A}=(\alpha I+\mathcal{H})-(\alpha I-\mathcal{S})=(\alpha I+\mathcal{S})-(\alpha I-\mathcal{H}),$$

where α > 0 is a parameter and *I* is the identity matrix. By iterating alternately between these two splittings, we obtain a new algorithm as follows:

$$\begin{cases}(\alpha I+\mathcal{H})u^{k+\frac{1}{2}}=(\alpha I-\mathcal{S})u^{k}+b,\\ (\alpha I+\mathcal{S})u^{k+1}=(\alpha I-\mathcal{H})u^{k+\frac{1}{2}}+b,\end{cases}$$

where *u*^{0} is an initial guess. By eliminating the intermediate vector *u*^{k+1/2}, we have the iteration in fixed point form as

$$u^{k+1}=\mathcal{T}_{\alpha}u^{k}+c,$$

where

$$\mathcal{T}_{\alpha}=(\alpha I+\mathcal{S})^{-1}(\alpha I-\mathcal{H})(\alpha I+\mathcal{H})^{-1}(\alpha I-\mathcal{S})$$

and

$$c=(\alpha I+\mathcal{S})^{-1}\left[I+(\alpha I-\mathcal{H})(\alpha I+\mathcal{H})^{-1}\right]b.$$

It is easy to see that there is a unique splitting 𝒜 = ℳ_{α} − 𝒩_{α}, with ℳ_{α} nonsingular, which induces the iteration matrix 𝒯_{α}, i.e., 𝒯_{α} = ℳ_{α}^{−1}𝒩_{α}, where

$$\mathcal{M}_{\alpha}=\frac{1}{2\alpha}(\alpha I+\mathcal{H})(\alpha I+\mathcal{S}),\qquad \mathcal{N}_{\alpha}=\frac{1}{2\alpha}(\alpha I-\mathcal{H})(\alpha I-\mathcal{S}).$$

Hence, the linear system 𝒜*u* = **b** is equivalent to the preconditioned linear system

$$\mathcal{M}_{\alpha}^{-1}\mathcal{A}u=\mathcal{M}_{\alpha}^{-1}b.$$

Recently, for generalized linear systems, estimates for the spectral radius of the iteration matrix with the HSS preconditioner ℳ_{α} with *H* = *A* have been studied in [16]. If we use a Krylov subspace method such as GMRES or its restarted variant to approximate the solution of this system of linear equations, ℳ_{α} can be regarded as a new preconditioner for the saddle point problem (1). Under the assumption that *A* is positive real, analogously to the proof in [11] we can show that all eigenvalues of the preconditioned matrices gather into two tight clusters, one near (0, 0) and the other near (2, 0), as the iteration parameter tends to zero from above.

*2.2 Spectral properties*

It is well known that characterizing the rate of convergence of nonsymmetric preconditioned iterations can be a difficult task. In particular, eigenvalue information alone may not be sufficient to give meaningful estimates of the convergence rate of a method like preconditioned GMRES [6, 17]. Nevertheless, experience shows that for many linear systems arising in practice, a well-clustered spectrum (away from zero) usually results in rapid convergence of the preconditioned iteration. Now we consider the eigenvalue problem associated with the preconditioned matrix ℳ_{α}^{−1}𝒜, i.e.,

$$\mathcal{A}x=\lambda\mathcal{M}_{\alpha}x,$$

where (*λ*, *x*) is any eigenpair of ℳ_{α}^{−1}𝒜. It is easy to see that λ ≠ 0, since 𝒜 is nonsingular. Writing *x* = (*u*^{T}, *ν*^{T})^{T}, we have

$$2\alpha\left(Au+B^{T}\nu\right)=\lambda\left[\left(\alpha^{2}I+\alpha A+HS\right)u+(\alpha I+H)B^{T}\nu\right],\qquad(5)$$

$$2Bu=\lambda\left(Bu-\alpha\nu\right),\qquad(6)$$

which leads to

$$\nu=\frac{\lambda-2}{\lambda\alpha}\,Bu.$$

First of all, it must hold that *u* ≠ 0. Otherwise, it follows from (6) that either *λ* = 0 or *ν* = 0 holds. In fact, neither of these can be true.

If *ν* = 0, then from (5) we have 2*αAu* = *λ*(*α*^{2}*u* + *αAu* + *HSu*). Multiplying both sides of this equality from the left by *u**, we have

$$\lambda=\frac{2\alpha\,u^{*}Au}{\alpha^{2}u^{*}u+\alpha\,u^{*}Au+u^{*}HSu}.\qquad(7)$$

If *u***HSu* = 0, it is easy to see from (7) that *λ* → 2 as *α* → 0_{+}. If *u***HSu* ≠ 0, from (7) we have that *λ* → 0 as *α* → 0_{+}.

We now assume *ν* ≠ 0. Without loss of generality, we further assume ||*ν*|| = 1. Substituting (6) into (5), we obtain

Multiplying the above equality from the left by *ν**, we obtain

Let

Then (8) can be rewritten as

For simplicity, we denote *δ* = *αb*_{1} + *b*_{2} + *α*^{3}. Subsequently, we mainly discuss two cases, namely *δ* = 0 and *δ* ≠ 0.

**Case I. ** *δ* = 0.

It is not difficult to see that 4*αb*_{1} + 2*b*_{2} ≠ 0, by the existence of eigenvalues of the preconditioned matrix. From (9), we have

Since *δ* = *αb*_{1 }+ *b*_{2} + *α*^{3} = 0, we have

Substituting (11) into (10), we have

If *b*_{1} = 0, from (12) it is easy to see that *λ* = 0. If *b*_{1} ≠ 0, we have that *λ* → 2 as *α* → 0_{+}.

**Case II. ** *δ* ≠ 0. Note that

then the two roots of the quadratic equation (9) are

We will mainly discuss the following two cases:

1) If *b*_{2} = 0, then:

1.1) If *b*_{1} = 0, then *λ*_{±} = 0.

1.2) If *b*_{1} ≠ 0, then *λ* → 2 as *α* → 0_{+}.

2) If *b*_{2} ≠ 0, then we have

Up to now, we have shown that the eigenvalues of the preconditioned matrix converge to either the origin or the point (2, 0) as *α* → 0_{+}. This means that if *α* is small enough, then the eigenvalues of the preconditioned matrix gather into two clusters, one near (0, 0) and the other near (2, 0).

From the above discussion, we have the following theorem.

**Theorem 2.1.** *Let* 𝒜 *be positive real and let B have full row rank. If x = *(*u*^{T}, *ν*^{T})^{T} *is an eigenvector of* ℳ_{α}^{−1}𝒜*, then for sufficiently small α > 0 the eigenvalues of* ℳ_{α}^{−1}𝒜 *fall into the following cases:*

(I) *If ν = 0, then the eigenvalues gather near* (0, 0) *or* (2, 0) *as α → 0*_{+}*.*

(II) *If ν ≠ 0, then the eigenvalues gather into two clusters, one near* (0, 0) *and the other near* (2, 0)*, as α → 0*_{+}*.*
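The clustering predicted by Theorem 2.1 can also be observed numerically. The following sketch is an illustration under stated assumptions: a random positive real instance (not the test matrix of Section 3), with the preconditioner read as ℳ_{α} = (1/(2α))(αI + ℋ)(αI + 𝒮). It computes the spectrum of ℳ_{α}^{−1}𝒜 for decreasing α and measures how far each eigenvalue is from {0, 2}:

```python
import numpy as np

def preconditioned_spectrum(alpha, n=8, m=3, seed=2):
    """Eigenvalues of M_alpha^{-1} A for a random positive-real instance."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n)) + n * np.eye(n)   # positive real (1,1) block
    B = rng.standard_normal((m, n))                    # full row rank (generically)
    H, S = (A + A.T) / 2, (A - A.T) / 2
    K = np.block([[A, B.T], [-B, np.zeros((m, m))]])
    Hc = np.block([[H, np.zeros((n, m))], [np.zeros((m, n)), np.zeros((m, m))]])
    Sc = np.block([[S, B.T], [-B, np.zeros((m, m))]])
    I = np.eye(n + m)
    M = (alpha * I + Hc) @ (alpha * I + Sc) / (2 * alpha)
    return np.linalg.eigvals(np.linalg.solve(M, K))    # spectrum of M^{-1} K

# As alpha shrinks, every eigenvalue should approach either 0 or 2.
for alpha in (1.0, 0.1, 0.01):
    lam = preconditioned_spectrum(alpha)
    dist = np.minimum(np.abs(lam), np.abs(lam - 2)).max()
    print(f"alpha={alpha:5.2f}  max distance to {{0, 2}} = {dist:.3e}")
```

On such small dense instances the maximal distance to {0, 2} shrinks with α, in line with the two-cluster statement of the theorem.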

**3 Numerical experiments**

In this section, we present our numerical experiments to illustrate the eigenvalue distribution of the preconditioned matrix and to assess our statement that the eigenvalues of ℳ_{α}^{−1}𝒜 gather into two clusters as *α* becomes small.

We consider the saddle point-type matrix of the following form:

where the sub-matrices are *A* = *νA*_{1} + *N*, with *N* having only two nonzero diagonals, which start from the 2nd and the *n*th columns, i.e.,

and *A*_{1}, *B* are taken from [2], i.e.,

and

with ⊗ denoting the Kronecker product and *h* the discretization mesh size. The right-hand side vectors are defined as

For this example, the matrix *A* is nonsymmetric and positive real.

Theorem 2.1 shows that the eigenvalues of ℳ_{α}^{−1}𝒜 gather into two clusters, one near (0, 0) and the other near (2, 0), as α → 0_{+}. We plot the spectra of the coefficient matrix with *ν* = 1 and *ν* = 0.01 (left and right columns, respectively) in Figure 1, and the spectra of the preconditioned matrices corresponding to *ν* = 1 and different values of α in Figures 2-4, where the left and right columns correspond to the two preconditioners under comparison. From Figures 2-4 we can see that the eigenvalues of both kinds of preconditioned matrices become more and more clustered as *α* becomes smaller.

All the numerical experiments were performed with MATLAB 7.0 on a PC with a 2.2 GHz T7400 CPU. The GMRES method is used to solve the above test problem. The initial guess is taken to be *x*^{(0)} = 0 and the stopping criterion is chosen as

In Tables 1-2, we list the iteration numbers of GMRES and the preconditioned GMRES when they are applied to solve the test problem, where the numbers outside (inside) of the brackets denote outer iteration numbers (inner iteration numbers) of GMRES method, respectively.

Here we test the performance of two preconditioners: one is the alternating local Hermitian and skew-Hermitian splitting preconditioner ℳ_{α}, and the other is the preconditioner of [16], which is defined as follows:

with

From Tables 1-2, we can see that both preconditioners improve the convergence of the GMRES iteration efficiently, one especially so when *ν* is large and the other when *ν* is small. We can also see that whichever preconditioner requires more outer iterations of the preconditioned GMRES requires fewer inner iterations, and vice versa.

**Remark.** We have shown that for small *α*, all the eigenvalues fall into two clusters, one near 0 and the other near 2. Indeed, our analysis suggests that the 'best' value of *α* should be small enough that the spectrum is clustered, but not so small that the preconditioned matrix is close to being singular. This instability has been observed in [20], for example in Figure 6 there. It is interesting to observe that for any choice of *α* the LHSS method shows a significant residual reduction at some specific iteration but tends to stagnate before and after that iteration.

**4 Conclusions**

In this paper, we presented a new alternating local Hermitian and skew-Hermitian splitting preconditioner for solving saddle point problems. The spectral properties of the preconditioned matrices were studied in detail. Theoretical results show that all eigenvalues of the preconditioned matrices gather into two tight clusters, one near (0, 0) and the other near (2, 0), as the iteration parameter tends to zero from above. Numerical experiments were given to validate the performance of the preconditioner. However, each step of an outer iteration for solving the preconditioned linear system requires the solution of an inner linear system whose coefficient matrix is ℳ_{α}. Therefore, convergence of the outer iteration is fast if the eigenvalues of the preconditioned matrix are clustered, but careful attention must be paid to the conditioning and eigenvalue distribution of ℳ_{α} itself, which determine the speed of convergence of the inner iteration [10]. How to reduce both the outer and inner iteration counts for such problems remains a topic for further study.

**Acknowledgments. ** The author is grateful to the anonymous referees for their helpful suggestions which improve this paper.

**References**

[1] Z.Z. Bai, G.H. Golub and M.K. Ng, *Hermitian and skew-Hermitian splitting methods for non-Hermitian positive definite linear systems*. SIAM J. Matrix Anal. Appl., **24** (2003), 603-626. [ Links ]

[2] Z.Z. Bai, G.H. Golub and J.Y. Pan, *Preconditioned Hermitian and skew-Hermitian splitting methods for non-Hermitian positive semidefinite linear systems*. Numer. Math., **98** (2004), 1-32. [ Links ]

[3] Z.Z. Bai, G.H. Golub, L.L. Zhang and J.F. Yin, *Block triangular and skew-Hermitian splitting methods for positive-definite linear systems*. SIAM J. Sci. Comput., **26** (2005), 844-863. [ Links ]

[4] Z.Z. Bai, M.K. Ng and Z.Q.Wang, *Constraint preconditioners for symmetric indefinite matrices*. SIAM J. Matrix Anal. Appl., **31** (2009), 410-433. [ Links ]

[5] M. Benzi and G.H. Golub, *A preconditioner for generalized saddle point problems*. SIAM J. Matrix Anal. Appl., **26** (2004), 20-41. [ Links ]

[6] M. Benzi, G.H. Golub and J. Liesen, *Numerical solution of saddle point problems*. Acta Numerica., **14** (2005), 1-137. [ Links ]

[7] M. Benzi and J. Liu, *Block preconditioning for saddle point systems with indefinite (1,1) block*. Int. J. Comput. Math., **84** (2007), 1117-1129. [ Links ]

[8] Z.H. Cao, *Augmentation block preconditioners for saddle point-type matrices with singular (1,1) blocks*. Numer. Linear Algebra Appl., **15** (2008), 515-533. [ Links ]

[9] G.H. Golub and C. Greif, *On solving block-structured indefinite linear systems*. SIAM J. Sci. Comput., **24** (2003), 2076-2092. [ Links ]

[10] C. Greif and M.L. Overton, *An Analysis of Low-Rank Modifications of Preconditioners for Saddle Point Systems*. Electron. Trans. Numer. Anal., **37** (2010), 307-320. [ Links ]

[11] T.Z. Huang, S.L. Wu and C.X. Li, *The spectral properties of the Hermitian and skew-Hermitian splitting preconditioner for generalized saddle point problems*. J. Comput. Appl. Math., **229** (2009), 37-46. [ Links ]

[12] H.S.Dollar and A.J. Wathen, *Approximate factorization constraint preconditioners for saddle point matrices*. SIAM J. Sci. Comput., **27** (2006), 1555-1572. [ Links ]

[13] H.C. Elman, D.J. Silvester and A.J. Wathen, *Performance and analysis of saddle point preconditioners for the discrete steady-state Navier-Stokes equations*. Numer. Math., **90** (2002), 665-688. [ Links ]

[14] J.Y. Pan, M.K. Ng and Z.Z. Bai, *New preconditioners for saddle point problems*. Appl. Math. Comput., **172** (2006), 762-771. [ Links ]

[15] X.F. Peng and W. Li, *The alternating-direction iterative method for saddle point problems*. Appl. Math. Comput., **216** (2010), 1845-1858. [ Links ]

[16] X.F. Peng and W. Li, *An alternating preconditioner for saddle point problems*. Appl. Math. Comput., **234** (2010), 3411-3423. [ Links ]

[17] Y. Saad, *Iterative Methods for Sparse Linear Systems*. SIAM, Philadelphia, PA (2003). [ Links ]

[18] V. Simoncini, *Block triangular preconditioners for symmetric saddle-point problems*. Appl. Numer. Math., **49** (2004), 63-80. [ Links ]

[19] D.W. Peaceman and H.H. Rachford Jr., *The numerical solution of parabolic and elliptic differential equations*. Journal of the Society for Industrial and Applied Mathematics, **3** (1955), 28-41. [ Links ]

[20] L.M. Hernández, *Alternating oblique projections for coupled linear systems*. Numer. Algorithms, **38** (2005), 285-303. [ Links ]

Received: 30/V/11.

Accepted: 26/VII/11.

#CAM-376/11.

*Author supported by National Natural Science Foundation of China (No. 11071079) and Ningbo Natural Science Foundation (2010A610097, 2012A610037).