Volume 46, pp. 424-446, 2017.

Sharp Ritz value estimates for restarted Krylov subspace iterations

Ming Zhou and Klaus Neymeyr

Abstract

Gradient iterations for the Rayleigh quotient are elementary methods for computing the smallest eigenvalues of a pair of symmetric and positive definite matrices. A considerable convergence acceleration can be achieved by preconditioning and by computing Rayleigh-Ritz approximations from subspaces of increasing dimensions. An example of the resulting Krylov subspace eigensolvers is the generalized Davidson method. Krylov subspace iterations can be restarted in order to limit their computer storage requirements. For restarted Krylov subspace eigensolvers, a Chebyshev-type convergence estimate was presented by Knyazev in [Soviet J. Numer. Anal. Math. Modelling, 2 (1987), pp. 371–396]. This estimate has been generalized to arbitrary eigenvalue intervals in [SIAM J. Matrix Anal. Appl., 37 (2016), pp. 955–975]. The generalized Ritz value estimate is not sharp, as it depends on only three eigenvalues. In the present paper we extend the latter analysis by generalizing the geometric approach from [SIAM J. Matrix Anal. Appl., 32 (2011), pp. 443–456] in order to derive a sharp Ritz value estimate for restarted Krylov subspace iterations.
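To make the restarted subspace construction mentioned in the abstract concrete, the following minimal Python/NumPy/SciPy sketch illustrates one unpreconditioned restarted Krylov subspace iteration with Rayleigh-Ritz extraction for the smallest eigenvalue of a pencil (A, M) of symmetric positive definite matrices. The function name, the fixed subspace dimension k, and the restart count are illustrative assumptions; this is not the scheme analyzed in the paper, and the preconditioning used by the generalized Davidson method is omitted for brevity.

import numpy as np
from scipy.linalg import eigh

def restarted_krylov_smallest_eig(A, M, x0, k=5, restarts=50):
    # Hypothetical sketch: restarted Krylov subspace iteration with
    # Rayleigh-Ritz extraction for the smallest eigenvalue of A x = lambda M x,
    # where A and M are symmetric positive definite.
    x = x0 / np.linalg.norm(x0)
    for _ in range(restarts):
        # Build a basis of the Krylov subspace
        # span{x, (M^{-1}A) x, ..., (M^{-1}A)^{k-1} x}.
        V = [x]
        for _ in range(k - 1):
            V.append(np.linalg.solve(M, A @ V[-1]))
        Q, _ = np.linalg.qr(np.column_stack(V))  # orthonormalize the basis
        # Rayleigh-Ritz procedure: project the pencil (A, M) onto the subspace.
        theta, Y = eigh(Q.T @ A @ Q, Q.T @ M @ Q)
        # Restart with the Ritz vector belonging to the smallest Ritz value.
        x = Q @ Y[:, 0]
        x /= np.linalg.norm(x)
    return theta[0], x

# Illustrative use on a small diagonal test pencil (assumed example):
# A = np.diag(np.arange(1.0, 101.0)); M = np.eye(100)
# lam, x = restarted_krylov_smallest_eig(A, M, np.random.rand(100))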


Key words

Krylov subspace, Rayleigh quotient, Rayleigh-Ritz procedure, polynomial interpolation, multigrid, elliptic eigenvalue problem.

AMS subject classifications

65F15, 65N25

Links to the cited ETNA articles

[9] Vol. 7 (1998), pp. 104-123. Andrew V. Knyazev: Preconditioned eigensolvers - an oxymoron?
[11] Vol. 15 (2003), pp. 38-55. Andrew V. Knyazev and Klaus Neymeyr: Efficient solution of symmetric eigenvalue problems using multigrid preconditioners in the locally optimal block conjugate gradient method
[17] Vol. 41 (2014), pp. 93-108. Klaus Neymeyr and Ming Zhou: The block preconditioned steepest descent iteration for elliptic operator eigenvalue problems
