Block Broyden's methods for solving nonlinear equations

C Liu, C Chen, L Luo, J Lui - Advances in Neural …, 2023 - proceedings.neurips.cc
This paper studies quasi-Newton methods for solving nonlinear equations. We propose
block variants of both good and bad Broyden's methods, which enjoy explicit local …
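
As a point of reference, below is a minimal sketch of the classical rank-1 good and bad Broyden updates that block variants of this kind generalize. The solver interface, variable names, tolerance, and starting-point choices are illustrative assumptions, not the paper's implementation.

import numpy as np

def broyden_solve(F, x0, J0, variant="good", tol=1e-10, max_iter=100):
    # Classical (rank-1) Broyden iteration for F(x) = 0.
    #   "good": maintain a Jacobian estimate J with
    #           J_{k+1} = J_k + (y_k - J_k s_k) s_k^T / (s_k^T s_k)
    #   "bad":  maintain an inverse estimate H with
    #           H_{k+1} = H_k + (s_k - H_k y_k) y_k^T / (y_k^T y_k)
    # where s_k = x_{k+1} - x_k and y_k = F(x_{k+1}) - F(x_k).
    x = np.asarray(x0, dtype=float)
    J = np.array(J0, dtype=float)           # Jacobian approximation (good Broyden)
    H = np.linalg.inv(J)                    # inverse approximation (bad Broyden)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = -np.linalg.solve(J, Fx) if variant == "good" else -(H @ Fx)
        x = x + s
        F_new = F(x)
        y = F_new - Fx
        if variant == "good":
            J = J + np.outer(y - J @ s, s) / (s @ s)
        else:
            H = H + np.outer(s - H @ y, y) / (y @ y)
        Fx = F_new
    return x

For example, broyden_solve(lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]]), x0=np.array([1.0, 0.5]), J0=np.eye(2)) should converge to the root (1/sqrt(2), 1/sqrt(2)) from this nearby start.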

Multiple greedy quasi-Newton methods for saddle point problems

M Xiao, S Bo, Z Wu - … on Data-driven Optimization of Complex …, 2024 - ieeexplore.ieee.org
This paper introduces the Multiple Greedy Quasi-Newton (MGSR1-SP) method, a novel
approach to solving strongly-convex-strongly-concave (SCSC) saddle point problems. Our …

Online learning guided curvature approximation: A quasi-Newton method with global non-asymptotic superlinear convergence

R Jiang, Q Jin, A Mokhtari - The Thirty Sixth Annual …, 2023 - proceedings.mlr.press
Quasi-Newton algorithms are among the most popular iterative methods for solving
unconstrained minimization problems, largely due to their favorable superlinear …

Accelerated quasi-Newton proximal extragradient: Faster rate for smooth convex optimization

R Jiang, A Mokhtari - Advances in Neural Information …, 2024 - proceedings.neurips.cc
In this paper, we propose an accelerated quasi-Newton proximal extragradient method for
solving unconstrained smooth convex optimization problems. With access only to the …

Quasi-Newton methods for saddle point problems

C Liu, L Luo - Advances in Neural Information Processing …, 2022 - proceedings.neurips.cc
This paper studies quasi-Newton methods for strongly-convex-strongly-concave saddle
point problems. We propose random Broyden family updates, which have explicit local …
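
As background (not taken from the snippet itself), the classical restricted Broyden family from which such updates are drawn interpolates between BFGS and DFP:

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k} + \phi_k \,(s_k^\top B_k s_k)\, v_k v_k^\top,
\qquad
v_k = \frac{y_k}{y_k^\top s_k} - \frac{B_k s_k}{s_k^\top B_k s_k},
\]

where $s_k = x_{k+1} - x_k$, $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$; $\phi_k = 0$ recovers BFGS and $\phi_k = 1$ recovers DFP. The paper's specific randomized scheme for saddle point problems is not reproduced here.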

Non-asymptotic Global Convergence Rates of BFGS with Exact Line Search

Q Jin, R Jiang, A Mokhtari - arXiv preprint arXiv:2404.01267, 2024 - arxiv.org
In this paper, we explore the non-asymptotic global convergence rates of the Broyden-
Fletcher-Goldfarb-Shanno (BFGS) method implemented with exact line search. Notably, due …
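
A minimal sketch of BFGS with exact line search, shown on a quadratic objective f(x) = 0.5 x^T A x - b^T x where the exact step size has a closed form; the paper's analysis covers general smooth strongly convex objectives, and the test problem, function name, and tolerance below are illustrative assumptions.

import numpy as np

def bfgs_exact_ls_quadratic(A, b, x0, tol=1e-10, max_iter=200):
    # BFGS with exact line search on f(x) = 0.5 x^T A x - b^T x (A symmetric PD).
    # Along direction d at point x with gradient g = A x - b, the exact
    # minimizer of f(x + alpha d) is alpha = -(g^T d) / (d^T A d).
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                           # inverse-Hessian approximation
    g = A @ x - b
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # quasi-Newton search direction
        alpha = -(g @ d) / (d @ A @ d)      # exact line search step
        s = alpha * d
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        rho = 1.0 / (y @ s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS inverse update
        g = g_new
    return x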

Incremental Quasi-Newton Methods with Faster Superlinear Convergence Rates

Z Liu, L Luo, BKH Low - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
We consider the finite-sum optimization problem, where each component function is strongly
convex and has Lipschitz continuous gradient and Hessian. The recently proposed …

Symmetric Rank-$k$ Methods

C Liu, C Chen, L Luo - arXiv preprint arXiv:2303.16188, 2023 - arxiv.org
This paper proposes a novel class of block quasi-Newton methods for convex optimization
which we call symmetric rank-$k$ (SR-$k$) methods. Each iteration of SR-$k$ …
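
For orientation, the classical symmetric rank-one (SR1) update is the $k = 1$ member that block SR-$k$ methods generalize; a minimal sketch follows, with the usual skip safeguard. The function name and tolerance are illustrative assumptions, and this is not the paper's block construction.

import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    # Classical SR1 update of a Hessian approximation B:
    #   B_{k+1} = B_k + (y - B s)(y - B s)^T / ((y - B s)^T s),
    # skipped when the denominator is too small (the standard safeguard).
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B                            # skip update to avoid blow-up
    return B + np.outer(r, r) / denom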

Explicit superlinear convergence rates of Broyden's methods in nonlinear equations

D Lin, H Ye, Z Zhang - arXiv preprint arXiv:2109.01974, 2021 - arxiv.org
In this paper, we study the explicit superlinear convergence rates of quasi-Newton methods.
We particularly focus on the classical Broyden's method for solving nonlinear equations. We …

Partial-quasi-Newton methods: efficient algorithms for minimax optimization problems with unbalanced dimensionality

C Liu, S Bi, L Luo, JCS Lui - Proceedings of the 28th ACM SIGKDD …, 2022 - dl.acm.org
This paper studies the strongly-convex-strongly-concave minimax optimization with
unbalanced dimensionality. Such problems contain several popular applications in data …