We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don't increase the stepsize too fast and 2) don't overstep the local curvature. No …
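A minimal sketch of how the two rules could be combined in practice, assuming rule 1 is enforced by letting the stepsize grow only by a bounded factor per iteration and rule 2 by capping it with a local curvature estimate built from the last two iterates and gradients; the function names, initial stepsize, and growth factor below are illustrative assumptions, not quoted from the paper.

```python
import numpy as np

def adaptive_gd(grad, x0, lam0=1e-6, iters=1000):
    # One plain gradient step first, so that two iterates and gradients exist.
    x_prev, g_prev = x0, grad(x0)
    x = x_prev - lam0 * g_prev
    lam_prev, theta = lam0, 0.0
    for _ in range(iters):
        g = grad(x)
        # Rule 1: do not increase the stepsize too fast.
        growth_cap = np.sqrt(1.0 + theta) * lam_prev
        # Rule 2: do not overstep the local curvature, estimated
        # from the last two gradients (a local inverse-Lipschitz bound).
        diff = np.linalg.norm(g - g_prev)
        curv_cap = np.linalg.norm(x - x_prev) / (2.0 * diff) if diff > 0 else np.inf
        lam = min(growth_cap, curv_cap)
        x_prev, g_prev = x, g
        x = x - lam * g
        theta, lam_prev = lam / lam_prev, lam
    return x
```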
A De Marchi, A Themelis - Journal of Optimization Theory and Applications, 2022 - Springer
Composite optimization offers a powerful modeling tool for a variety of applications and is often numerically solved by means of proximal gradient methods. In this paper, we consider …
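As a concrete instance of the proximal gradient iteration this entry refers to, here is a minimal sketch for the composite problem min_x f(x) + g(x) with f smooth and g prox-friendly; the fixed stepsize and the LASSO example are illustrative assumptions, not the scheme analysed in the paper.

```python
import numpy as np

def prox_gradient(grad_f, prox_g, x0, step, iters=500):
    # Forward (gradient) step on the smooth part f,
    # backward (proximal) step on the nonsmooth part g.
    x = x0
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Illustrative composite problem: LASSO with f(x) = 0.5*||Ax - b||^2 and
# g(x) = mu*||x||_1, whose prox is soft-thresholding.
mu = 0.1
A, b = np.random.randn(20, 50), np.random.randn(20)
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * mu, 0.0)
x_hat = prox_gradient(grad_f, prox_g, np.zeros(50),
                      step=1.0 / np.linalg.norm(A, 2) ** 2)
```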
Tseng's algorithm finds a zero of the sum of a maximally monotone operator and a monotone continuous operator by evaluating the latter twice per iteration. In this paper, we …
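Sketched below is the usual reading of Tseng's forward-backward-forward scheme for the inclusion 0 ∈ A(x) + B(x): a resolvent step on the maximally monotone operator A sandwiched between two evaluations of the continuous operator B. The function names and the fixed stepsize are illustrative assumptions.

```python
def tseng_fbf(resolvent_A, B, x0, lam, iters=500):
    """resolvent_A should implement J_{lam A} for the chosen stepsize lam."""
    x = x0
    for _ in range(iters):
        Bx = B(x)                       # first evaluation of B
        z = resolvent_A(x - lam * Bx)   # backward step with A
        x = z - lam * (B(z) - Bx)       # second evaluation of B: correction step
    return x
```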
In this paper, we explore two fundamental first-order algorithms in convex optimization, namely, gradient descent (GD) and proximal gradient method (ProxGD). Our focus is on …
In this paper, our interest is in investigating monotone inclusion problems in the framework of real Hilbert spaces. To solve this problem, we propose a new modified forward …
K Kankam, N Pholasa… - Mathematical Methods in …, 2019 - Wiley Online Library
In optimization theory, convex minimization problems have been intensively investigated in the current literature due to their wide range of applications. A major and effective tool for …
The forward–backward algorithm is a splitting method for solving convex minimization problems of the sum of two objective functions. It has received great attention in optimization due to …
S Salzo - SIAM Journal on Optimization, 2017 - SIAM
We study the variable metric forward-backward splitting algorithm for convex minimization problems without the standard assumption of the Lipschitz continuity of the gradient. In this …
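One standard way to run the forward-backward step when no global Lipschitz constant of the gradient is available is a backtracking linesearch on the stepsize, sketched below; this generic sufficient-decrease test is an illustrative assumption and not the specific variable-metric rule analysed in the paper.

```python
import numpy as np

def fb_backtracking(f, grad_f, prox_g, x0, step0=1.0, shrink=0.5, iters=200):
    x, step = x0, step0
    for _ in range(iters):
        g = grad_f(x)
        while True:
            z = prox_g(x - step * g, step)
            d = z - x
            # Accept when f lies below its quadratic model at x for this stepsize.
            if f(z) <= f(x) + np.dot(g, d) + np.dot(d, d) / (2.0 * step):
                break
            step *= shrink  # local curvature too large: shrink the stepsize
        x = z
    return x
```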
Y Malitsky - Optimization Methods and Software, 2018 - Taylor & Francis
The paper is concerned with novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information of …
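For context, the classical extragradient step for a monotone variational inequality over a feasible set C is sketched below with a fixed stepsize; a linesearch of the kind described in this entry would replace the fixed tau with one chosen from local information about F. The names here are illustrative.

```python
def extragradient(F, project_C, x0, tau, iters=500):
    # Find x* in C with <F(x*), x - x*> >= 0 for all x in C (monotone VI).
    x = x0
    for _ in range(iters):
        y = project_C(x - tau * F(x))   # prediction step
        x = project_C(x - tau * F(y))   # correction step with F evaluated at y
    return x
```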