Abstract: When a system of linear equations is large, it can be solved iteratively instead of by a direct method. Each iteration can be treated as an inverse problem by defining an objective function and updating the solution estimate along the gradient of that function. This chapter presents several gradient-based methods: the steepest descent method, the conjugate gradient method, the biconjugate gradient method, and the subspace gradient method. The steepest descent method chooses the residual vector as the update direction. The conjugate gradient method is applicable to weakly nonlinear problems, in which the objective function is not quadratic but is approximately quadratic near the minimum. The biconjugate gradient method extends easily to complex-valued systems. The subspace gradient method partitions the steepest ascent vector into several independent subvectors and chooses an optimal step length for each of them.
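As a minimal illustrative sketch (not taken from the chapter), the steepest descent idea described above can be written for a symmetric positive-definite system Ax = b: the residual r = b - Ax is the negative gradient of the quadratic objective f(x) = ½xᵀAx - bᵀx, and an exact line search gives the step length. Function and parameter names here are hypothetical.

```python
def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Iteratively minimize f(x) = 0.5 x^T A x - b^T x for SPD A.

    A is a list of rows, b and x0 are lists; a sketch under the
    assumption that the residual is used as the update direction.
    """
    n = len(b)
    x = list(x0)
    for _ in range(max_iter):
        # Residual r = b - A x (the negative gradient, i.e. the
        # steepest descent direction at the current estimate).
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        rr = sum(ri * ri for ri in r)
        if rr ** 0.5 < tol:
            break
        # Exact line-search step length: alpha = (r^T r) / (r^T A r).
        Ar = [sum(A[i][j] * r[j] for j in range(n)) for i in range(n)]
        alpha = rr / sum(r[i] * Ar[i] for i in range(n))
        # Update the solution estimate along the residual direction.
        x = [x[i] + alpha * r[i] for i in range(n)]
    return x
```

For example, solving the 2×2 system with A = [[4, 1], [1, 3]] and b = [1, 2] from x0 = [0, 0] converges to x = (1/11, 7/11). The conjugate and biconjugate gradient methods differ chiefly in how the update direction is chosen at each step.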
Publication Year: 2016
Publication Date: 2016-11-14
Language: en
Type: other
Indexed In: ['crossref']
Cited By Count: 4