[Book] Variational analysis and applications

BS Mordukhovich - 2018 - Springer
Springer Monographs in Mathematics.

Error bounds, quadratic growth, and linear convergence of proximal methods

D Drusvyatskiy, AS Lewis - Mathematics of Operations Research, 2018 - pubsonline.informs.org
The proximal gradient algorithm for minimizing the sum of a smooth and nonsmooth convex
function often converges linearly even without strong convexity. One common reason is that …
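
The algorithm in question has a compact template: a gradient step on the smooth part followed by a proximal step on the nonsmooth part. Below is a minimal sketch for the lasso-type instance 0.5*||Ax - b||^2 + lam*||x||_1, where the prox is soft-thresholding; the step-size rule and the problem instance are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau * ||.||_1: componentwise shrinkage toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward steps."""
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2      # step 1/L, L = ||A'A|| = sigma_max(A)^2
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)  # prox step on the l1 part
    return x
```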

Fast convergence to non-isolated minima: four equivalent conditions for C² functions

Q Rebjock, N Boumal - Mathematical Programming, 2024 - Springer
Optimization algorithms can see their local convergence rates deteriorate when the Hessian
at the optimum is singular. These singularities are inescapable when the optima are non …
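
Two of the four conditions have familiar shapes; as a hedged sketch (the paper's exact statements and equivalence hypotheses are not reproduced here), quadratic growth away from the solution set X* and the Polyak-Łojasiewicz inequality for a smooth f with minimal value f* read:

```latex
f(x) \;\ge\; f^\star + \frac{\mu}{2}\,\operatorname{dist}(x, X^\star)^2
\qquad \text{(quadratic growth)},
\qquad
\frac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^\star\bigr)
\qquad \text{(PL inequality)}.
```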

RSG: Beating subgradient method without smoothness and strong convexity

T Yang, Q Lin - Journal of Machine Learning Research, 2018 - jmlr.org
In this paper, we study the efficiency of a Restarted SubGradient (RSG) method that
periodically restarts the standard subgradient method (SG). We show that, when applied to a …
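
The restart pattern itself is simple to sketch: run the plain subgradient method for a fixed number of steps, restart from the average iterate, and shrink the step geometrically. The schedule below is illustrative; in the paper the stage lengths and steps are tied to an error-bound condition rather than fixed constants.

```python
import numpy as np

def restarted_subgradient(subgrad, x0, stage_len=200, stages=8, eta0=1.0):
    """Restarted subgradient sketch: each stage runs plain SG with a
    constant step, then restarts from the stage average with the step
    halved.  (Normalized steps and this schedule are illustrative.)"""
    x = np.asarray(x0, dtype=float)
    eta = eta0
    for _ in range(stages):
        avg = np.zeros_like(x)
        for _ in range(stage_len):
            g = subgrad(x)
            x = x - eta * g / max(np.linalg.norm(g), 1e-12)  # normalized step
            avg += x
        x = avg / stage_len   # restart point: average iterate of the stage
        eta *= 0.5            # geometrically decreasing stage step size
    return x

# Usage on f(x) = ||x||_1, whose subgradient is sign(x):
x_hat = restarted_subgradient(np.sign, np.ones(10))
```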

A unified analysis of descent sequences in weakly convex optimization, including convergence rates for bundle methods

F Atenas, C Sagastizábal, PJS Silva, M Solodov - SIAM Journal on Optimization, 2023 - SIAM
We present a framework for analyzing convergence and local rates of convergence of a
class of descent algorithms, assuming the objective function is weakly convex. The …
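
For orientation, the standing assumption has a one-line form: f is ρ-weakly convex when adding a quadratic restores convexity,

```latex
x \;\mapsto\; f(x) + \frac{\rho}{2}\|x\|^2 \quad\text{is convex},
\qquad\text{equivalently}\qquad
f(y) \;\ge\; f(x) + \langle v, y - x\rangle - \frac{\rho}{2}\|y - x\|^2
\quad \text{for all } v \in \partial f(x).
```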

A globally convergent proximal Newton-type method in nonsmooth convex optimization

BS Mordukhovich, X Yuan, S Zeng, J Zhang - Mathematical Programming, 2023 - Springer
The paper proposes and justifies a new algorithm of the proximal Newton type to solve a
broad class of nonsmooth composite convex optimization problems without strong convexity …
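
One way to see what a step "of the proximal Newton type" means: replace the smooth part by a second-order model and solve a scaled prox subproblem. The sketch below uses a diagonal Hessian surrogate for f + lam*||.||_1 so the subproblem splits into coordinatewise soft-thresholding; the paper's actual method (its subproblem, metric, and globalization) is not reproduced here.

```python
import numpy as np

def prox_newton_step_l1(x, grad, hess_diag, lam):
    """One proximal-Newton-type step for f(x) + lam*||x||_1 with a
    diagonal surrogate H = diag(hess_diag), i.e. the subproblem
        min_y  grad'(y-x) + 0.5*(y-x)' H (y-x) + lam*||y||_1,
    which solves coordinatewise by soft-thresholding."""
    h = np.maximum(hess_diag, 1e-8)   # keep the surrogate positive definite
    z = x - grad / h                  # unconstrained scaled-Newton target
    return np.sign(z) * np.maximum(np.abs(z) - lam / h, 0.0)
```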

The proximal point method revisited

D Drusvyatskiy - arXiv preprint arXiv:1712.06038, 2017 - arxiv.org
In this short survey, I revisit the role of the proximal point method in large scale optimization. I
focus on three recent examples: a proximally guided subgradient method for weakly convex …
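
The iteration being revisited is x_{k+1} = argmin_y { f(y) + (1/(2λ))||y - x_k||^2 }. As a minimal runnable sketch, for a convex quadratic f(x) = 0.5*x'Ax - b'x the prox step reduces to a linear solve; the survey's applications, of course, concern far less structured f.

```python
import numpy as np

def prox_point_quadratic(A, b, x0, lam=1.0, iters=50):
    """Proximal point iterations for f(x) = 0.5*x'Ax - b'x (A symmetric PSD):
    x_{k+1} = argmin_y f(y) + (1/(2*lam))*||y - x_k||^2
            = (I + lam*A)^{-1} (x_k + lam*b)."""
    M = np.eye(len(b)) + lam * A
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = np.linalg.solve(M, x + lam * b)   # closed-form prox of lam*f
    return x
```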

Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization

PD Khanh, BS Mordukhovich, VT Phat… - Mathematical Programming, 2024 - Springer
This paper proposes and justifies two globally convergent Newton-type methods to solve
unconstrained and constrained problems of nonsmooth optimization by using tools of …
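
Schematically, and only as a hedged sketch of the pattern such methods follow (for f of class C^{1,1}, with ∂²f = D*∇f the coderivative of the gradient map; the paper's precise constructions and globalization are more involved), the Newton direction d_k solves a generalized equation:

```latex
-\nabla f(x_k) \;\in\; \partial^2 f(x_k)(d_k), \qquad x_{k+1} = x_k + \tau_k d_k .
```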

Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria

D Drusvyatskiy, AD Ioffe, AS Lewis - Mathematical Programming, 2021 - Springer
We consider optimization algorithms that successively minimize simple Taylor-like models of
the objective function. Methods of Gauss–Newton type for minimizing the composition of a …
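
A representative instance of such a Taylor-like model is the prox-linear step for a composite objective h(c(x)) with h convex and c smooth; as a sketch of the template (not the paper's full framework, which covers a broader class of models):

```latex
x_{k+1} \;\in\; \operatorname*{argmin}_{x}\;
h\bigl(c(x_k) + \nabla c(x_k)(x - x_k)\bigr) + \frac{1}{2t}\,\|x - x_k\|^2 .
```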

Variational convexity of functions and variational sufficiency in optimization

PD Khanh, BS Mordukhovich, VT Phat - SIAM Journal on Optimization, 2023 - SIAM
The paper is devoted to the study, characterizations, and applications of variational
convexity of functions, the property that has been recently introduced by Rockafellar together …