Additive Schwarz methods for convex optimization as gradient methods

J Park - SIAM Journal on Numerical Analysis, 2020 - SIAM
This paper gives a unified convergence analysis of additive Schwarz methods for general convex optimization problems. Just as additive Schwarz methods for linear problems are preconditioned Richardson methods, we prove that additive Schwarz methods for general convex optimization are in fact gradient methods. An abstract framework for the convergence analysis of additive Schwarz methods is then proposed; applied to linear elliptic problems, it agrees with the classical theory. We present applications of the proposed framework to various convex optimization problems, such as nonlinear elliptic problems, nonsmooth problems, and nonsharp problems.
Society for Industrial and Applied Mathematics
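The preconditioned-Richardson viewpoint summarized in the abstract can be made concrete on a model problem. The following is a minimal sketch (not the paper's code, and all sizes, the subdomain split, and the damping parameter tau are illustrative assumptions): a damped additive Schwarz iteration for a 1D Poisson discretization, written explicitly as a gradient step on the quadratic energy F(u) = (1/2) u^T A u - f^T u with preconditioner sum_i R_i^T A_i^{-1} R_i, so that the residual update f - A u is literally -grad F(u).

```python
import numpy as np

# 1D Poisson model problem: A u = f with A = tridiag(-1, 2, -1) / h^2
# (homogeneous Dirichlet boundary conditions; n interior grid points).
n = 31                       # assumed problem size, for illustration only
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
f = np.ones(n)               # right-hand side f(x) = 1

# Two overlapping subdomains, given as index sets with a small overlap.
subdomains = [np.arange(0, 20), np.arange(12, n)]

def additive_schwarz_step(u, tau=0.5):
    """One damped additive Schwarz iteration, written as a preconditioned
    gradient step:
        u <- u - tau * sum_i R_i^T A_i^{-1} R_i grad F(u),
    where F(u) = 0.5 u^T A u - f^T u and grad F(u) = A u - f."""
    grad = A @ u - f
    correction = np.zeros_like(u)
    for idx in subdomains:
        A_i = A[np.ix_(idx, idx)]                            # local stiffness matrix
        correction[idx] += np.linalg.solve(A_i, grad[idx])   # local subdomain solve
    return u - tau * correction

u = np.zeros(n)
for k in range(200):
    u = additive_schwarz_step(u)
print("residual norm:", np.linalg.norm(A @ u - f))
```

With two subdomains the damping tau = 1/2 is a safe conventional choice; the point of the sketch is only that the local solves combine into a single gradient-type update, which is the structure the paper's abstract framework analyzes for general (possibly nonlinear and nonsmooth) convex energies.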