Accelerated additive Schwarz methods for convex optimization with adaptive restart

J Park - Journal of Scientific Computing, 2021 - Springer
Abstract
Based on the observation that additive Schwarz methods for general convex optimization can be interpreted as gradient methods, we propose an acceleration scheme for additive Schwarz methods. By adopting acceleration techniques developed for gradient methods, such as momentum and adaptive restarting, the scheme greatly improves the convergence rate of additive Schwarz methods. The proposed acceleration scheme does not require any a priori information on the levels of smoothness and sharpness of the target energy functional, so it can be applied to a wide range of convex optimization problems. Numerical results for linear elliptic, nonlinear elliptic, nonsmooth, and nonsharp problems are provided to highlight the superiority and broad applicability of the proposed scheme.
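To make the structure of the scheme concrete, here is a minimal sketch of momentum acceleration with adaptive (function-value) restart in the style of O'Donoghue and Candès. The paper's additive Schwarz step is abstracted as a `schwarz_step` oracle; for a self-contained example we substitute a plain gradient step on a toy quadratic, which is only a stand-in with the same interface, not the paper's actual subdomain solver.

```python
import numpy as np

# Toy ill-conditioned quadratic energy: f(x) = 0.5 x'Ax - b'x.
A = np.diag(np.linspace(0.01, 1.0, 50))
b = np.zeros(50)

def energy(x):
    return 0.5 * x @ (A @ x) - b @ x

def schwarz_step(y, tau=1.0):
    # Placeholder oracle: in the paper this would be one additive
    # Schwarz sweep, interpreted as a gradient-like update; here it
    # is an ordinary gradient step (an assumption for illustration).
    return y - tau * (A @ y - b)

def accelerated_schwarz(x0, iters=500):
    x, y, t = x0.copy(), x0.copy(), 1.0
    f_prev = energy(x)
    for _ in range(iters):
        x_new = schwarz_step(y)                        # base step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum
        f_new = energy(x_new)
        if f_new > f_prev:
            # Adaptive restart: the energy increased, so discard the
            # accumulated momentum and reset the extrapolation weight.
            y, t_new = x_new.copy(), 1.0
        x, t, f_prev = x_new, t_new, f_new
    return x

x = accelerated_schwarz(np.ones(50))
print(energy(x))
```

Note the key property claimed in the abstract: the restart test compares only successive energy values, so no smoothness or sharpness constants of the functional enter the algorithm.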