Vespucci, M. T. (2001). On the convergence of Krylov linear equation solvers [journal article]. Optimization Methods & Software. Retrieved from http://hdl.handle.net/10446/138871
On the convergence of Krylov linear equation solvers
Vespucci, Maria Teresa
2001-01-01
Abstract
In this paper we show that the reduction in residual norm at each iteration of CG and GMRES is related to the first column of the inverse of an upper Hessenberg matrix that is obtained from the original coefficient matrix by way of an orthogonal transformation. The orthogonal transformation itself is uniquely defined by the coefficient matrix of the equations and the initial vector of residuals. We then apply this analysis to MINRES and show that, under certain circumstances, this algorithm can exhibit an unusual (and very slow) type of convergence that we refer to as oscillatory convergence.
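The upper Hessenberg matrix mentioned in the abstract is the one produced by orthogonally reducing the coefficient matrix with respect to the Krylov basis generated from the initial residual; in the standard setting this is the Arnoldi process (the Lanczos process when the matrix is symmetric). The NumPy sketch below is illustrative only and is not taken from the paper: it builds that Hessenberg matrix H from A and r0 and checks the textbook identity that the Galerkin (FOM/CG-type) residual norm at step k equals h_{k+1,k} times the magnitude of the last entry of beta*H_k^{-1}e_1, a quantity read directly from the first column of H_k^{-1}. The test matrix, sizes, and the function name arnoldi are assumptions made for the demonstration.

import numpy as np

def arnoldi(A, r0, m):
    # Arnoldi reduction: returns V with orthonormal columns and an upper
    # Hessenberg matrix H such that A @ V[:, :-1] = V @ H.
    n = len(r0)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = r0 / np.linalg.norm(r0)
    for k in range(m):
        w = A @ V[:, k]
        for j in range(k + 1):                 # modified Gram-Schmidt
            H[j, k] = V[:, j] @ w
            w -= H[j, k] * V[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        if H[k + 1, k] < 1e-14:                # lucky breakdown: Krylov space is invariant
            return V[:, :k + 2], H[:k + 2, :k + 1]
        V[:, k + 1] = w / H[k + 1, k]
    return V, H

rng = np.random.default_rng(0)
n, m = 50, 10
A = rng.standard_normal((n, n)) + n * np.eye(n)    # illustrative, well-conditioned test matrix
b = rng.standard_normal(n)
x0 = np.zeros(n)
r0 = b - A @ x0
beta = np.linalg.norm(r0)

V, H = arnoldi(A, r0, m)
for k in range(1, H.shape[1] + 1):
    Hk = H[:k, :k]                       # leading square Hessenberg block
    e1 = np.zeros(k)
    e1[0] = beta
    yk = np.linalg.solve(Hk, e1)         # beta times the first column of Hk^{-1}
    xk = x0 + V[:, :k] @ yk              # Galerkin (FOM-type) iterate
    true_res = np.linalg.norm(b - A @ xk)
    formula = H[k, k - 1] * abs(yk[-1])  # h_{k+1,k} * beta * |last entry of first column of Hk^{-1}|
    print(k, true_res, formula)          # the two values agree to rounding error

For the same problem, the GMRES residual norm at step k is the least-squares residual min_y ||beta e_1 - H[:k+1, :k] y||, so it too is determined by the same Hessenberg matrix generated from A and r0.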