Additive Schwarz methods for convex optimization with backtracking

J Park - Computers & Mathematics with Applications, 2022 - Elsevier
Abstract
This paper presents a novel backtracking strategy, serving as an acceleration scheme, for additive Schwarz methods applied to general convex optimization problems. The proposed backtracking strategy is independent of the local solvers, so it can be applied to any algorithm that fits the abstract framework of additive Schwarz methods. By allowing the step size to increase and decrease adaptively along the iterations, the strategy improves the convergence rate of an algorithm, and the improved convergence rate is analyzed rigorously. In addition, combining the proposed backtracking strategy with a momentum acceleration technique, we propose a further accelerated additive Schwarz method. Numerical results for various convex optimization problems such as nonlinear elliptic problems, nonsmooth problems, and nonsharp problems are presented in order to support our theory.
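To make the step-size idea concrete, here is a minimal sketch (not the paper's implementation) of an additive Schwarz iteration with the kind of solver-independent backtracking the abstract describes: the step size tau is increased optimistically at each outer iteration and halved until a sufficient-decrease (Armijo-type) test holds. The model problem, subdomain splitting, and constants below are illustrative assumptions.

```python
# Model problem (an assumption for illustration): minimize the convex
# quadratic f(x) = 0.5 x^T A x - b^T x, where A is the 4x4 1D Laplacian
# and b = 1, split into two nonoverlapping subdomains {0,1} and {2,3}.
A = [[2, -1, 0, 0], [-1, 2, -1, 0], [0, -1, 2, -1], [0, 0, -1, 2]]
b = [1.0, 1.0, 1.0, 1.0]
SUBDOMAINS = [(0, 1), (2, 3)]

def f(x):
    ax = [sum(A[i][j] * x[j] for j in range(4)) for i in range(4)]
    return 0.5 * sum(x[i] * ax[i] for i in range(4)) - sum(b[i] * x[i] for i in range(4))

def residual(x):
    return [b[i] - sum(A[i][j] * x[j] for j in range(4)) for i in range(4)]

def schwarz_direction(x):
    # Sum of local corrections: each subdomain solves its 2x2 block
    # [[2,-1],[-1,2]] exactly (its inverse is (1/3)[[2,1],[1,2]]).
    # The backtracking below never looks inside this local solver.
    r = residual(x)
    d = [0.0] * 4
    for (i, j) in SUBDOMAINS:
        d[i] = (2 * r[i] + r[j]) / 3.0
        d[j] = (r[i] + 2 * r[j]) / 3.0
    return d

def solve(x, iters=300, tau=1.0, c=1e-4):
    for _ in range(iters):
        d = schwarz_direction(x)
        r = residual(x)
        # Directional derivative g.d with gradient g = A x - b = -r.
        slope = -sum(r[i] * d[i] for i in range(4))
        tau *= 2.0  # adaptive increase: tau may grow as well as shrink
        trial = [x[i] + tau * d[i] for i in range(4)]
        while f(trial) > f(x) + c * tau * slope and tau > 1e-12:
            tau *= 0.5  # backtrack until sufficient decrease holds
            trial = [x[i] + tau * d[i] for i in range(4)]
        x = trial
    return x

x = solve([0.0] * 4)
```

For this problem the exact minimizer is x = (2, 3, 3, 2), and the sketch converges to it; the doubling-then-halving schedule lets the effective step size track the local geometry instead of being fixed in advance, which is the mechanism behind the improved convergence rate.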