Fast gradient methods for uniformly convex and weakly smooth problems

J. Park - Advances in Computational Mathematics, 2022 - Springer
Abstract
In this paper, acceleration of gradient methods for convex optimization problems with weak levels of convexity and smoothness is considered. Starting from the universal fast gradient method, which was designed to be an optimal method for weakly smooth problems whose gradients are Hölder continuous, its momentum is modified so that it can also accommodate uniformly convex and weakly smooth problems. Unlike existing works, the fast gradient methods proposed in this paper do not use the restarting technique; instead, they use momentum terms that are suitably designed to reflect both the uniform convexity and the weak smoothness of the target energy function. Theoretical and numerical results supporting the superiority of the proposed methods are presented.
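
For context, the standard definitions behind the abstract's terms: a gradient is Hölder continuous with exponent ν ∈ (0, 1] if ‖∇f(x) − ∇f(y)‖ ≤ M‖x − y‖^ν for all x, y (ν = 1 recovers ordinary Lipschitz smoothness), and f is uniformly convex of order p ≥ 2 with modulus μ if f(y) ≥ f(x) + ⟨∇f(x), y − x⟩ + (μ/p)‖y − x‖^p (p = 2 recovers strong convexity). The sketch below is not the paper's modified-momentum algorithm; it is a minimal reference implementation of the classical Nesterov fast gradient method for an L-smooth convex objective (the ν = 1, unmodified-momentum special case), showing the momentum mechanism the abstract refers to. The function name and parameters are illustrative.

```python
import numpy as np

def nesterov_fgm(grad, x0, L, n_iter=100):
    """Classical Nesterov fast gradient method for an L-smooth convex f.

    A generic accelerated scheme (not Park's modified-momentum method):
    `grad` is the gradient oracle and `L` a Lipschitz constant of grad f.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_next = y - grad(y) / L                     # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_next                    # momentum coefficient
        y = x_next + beta * (x_next - x)             # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

# Example: minimize the quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.norm(A.T @ A, 2)                       # spectral norm = Lipschitz constant of the gradient
x_star = nesterov_fgm(lambda x: A.T @ (A @ x - b), np.zeros(2), L)
```

The paper's contribution, per the abstract, is to replace the fixed momentum rule above with momentum terms tuned to the Hölder exponent ν and the uniform convexity parameters (p, μ), avoiding the restarting technique that earlier methods used to exploit uniform convexity.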