A Review of multilayer extreme learning machine neural networks

JA Vásquez-Coronel, M Mora, K Vilches - Artificial Intelligence Review, 2023 - Springer
Abstract The Extreme Learning Machine is a single-hidden-layer feedforward learning
algorithm, which has been successfully applied in regression and classification problems in …

Adaptive gradient descent without descent

Y Malitsky, K Mishchenko - arXiv preprint arXiv:1910.09529, 2019 - arxiv.org
We present a strikingly simple proof that two rules are sufficient to automate gradient
descent: 1) don't increase the stepsize too fast and 2) don't overstep the local curvature. No …
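The two rules can be turned into a concrete stepsize schedule for plain gradient descent. The sketch below follows the adaptive rule described in the paper (stepsize capped both by a growth factor and by a local curvature estimate); the quadratic test problem and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def adgd(grad, x0, lam0=1e-6, iters=200):
    """Gradient descent with an adaptive stepsize in the spirit of
    Malitsky & Mishchenko (2019).

    Rule 1 (don't grow too fast):      lam <= sqrt(1 + theta) * lam_prev
    Rule 2 (respect local curvature):  lam <= ||x - x_prev|| / (2 ||g - g_prev||)
    """
    x_prev, g_prev = x0, grad(x0)
    lam_prev, theta = lam0, np.inf
    x = x_prev - lam_prev * g_prev          # one plain step to initialize the history
    for _ in range(iters):
        g = grad(x)
        denom = 2.0 * np.linalg.norm(g - g_prev)
        local = np.linalg.norm(x - x_prev) / denom if denom > 0 else np.inf
        lam = min(np.sqrt(1.0 + theta) * lam_prev, local)
        theta = lam / lam_prev
        x_prev, g_prev, lam_prev = x, g, lam
        x = x - lam * g                     # no linesearch, no global Lipschitz constant
    return x

# Illustrative use on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
# (an assumed toy problem; the minimizer solves A x = b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = adgd(lambda x: A @ x - b, np.zeros(2))
```

Note that no knowledge of the global Lipschitz constant is required: the curvature bound is re-estimated from the last two iterates at every step.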

Proximal gradient algorithms under local Lipschitz gradient continuity: A convergence and robustness analysis of PANOC

A De Marchi, A Themelis - Journal of Optimization Theory and Applications, 2022 - Springer
Composite optimization offers a powerful modeling tool for a variety of applications and is
often numerically solved by means of proximal gradient methods. In this paper, we consider …

Forward-backward-half forward algorithm for solving monotone inclusions

LM Briceño-Arias, D Davis - SIAM Journal on Optimization, 2018 - SIAM

Tseng's algorithm finds a zero of the sum of a maximally monotone operator and a
monotone continuous operator by evaluating the latter twice per iteration. In this paper, we …

Adaptive proximal gradient method for convex optimization

Y Malitsky, K Mishchenko - arXiv preprint arXiv:2308.02261, 2023 - arxiv.org
In this paper, we explore two fundamental first-order algorithms in convex optimization,
namely, gradient descent (GD) and proximal gradient method (ProxGD). Our focus is on …

Strong convergence of a forward–backward splitting method with a new step size for solving monotone inclusions

DV Thong, P Cholamjiak - Computational and Applied Mathematics, 2019 - Springer
In this paper, our interest is in investigating monotone inclusion problems in the
framework of real Hilbert spaces. To solve these problems, we propose a new modified forward …

On convergence and complexity of the modified forward‐backward method involving new linesearches for convex minimization

K Kankam, N Pholasa… - Mathematical Methods in …, 2019 - Wiley Online Library
In optimization theory, convex minimization problems have been intensively investigated in
the current literature due to their wide range of applications. A major and effective tool for …

Novel forward–backward algorithms for optimization and applications to compressive sensing and image inpainting

S Suantai, MA Noor, K Kankam… - Advances in Difference …, 2021 - Springer
The forward–backward algorithm is a splitting method for solving convex minimization
problems whose objective is the sum of two functions. It has attracted great attention in optimization due to …
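For the sparse-recovery setting this entry mentions, the baseline forward–backward iteration is the classical proximal gradient method (ISTA): a gradient step on the smooth term followed by the proximal map of the nonsmooth term. The sketch below is that classical baseline, not the paper's novel variants, and the toy compressive-sensing data are assumed for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (the 'backward' step)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, mu, iters=2000):
    """Classical forward-backward splitting (ISTA) for
        min_x  0.5 * ||A x - b||^2 + mu * ||x||_1,
    the model problem behind compressive sensing and image inpainting.
    Uses a constant stepsize 1/L; the cited papers replace this with
    accelerated or linesearch-based stepsize choices.
    """
    L = np.linalg.norm(A.T @ A, 2)                # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # forward (gradient) step on the smooth part
        x = soft_threshold(x - grad / L, mu / L)  # backward (proximal) step on mu*||.||_1
    return x

# Illustrative recovery of a 3-sparse vector from 40 random measurements
# (assumed toy data, noiseless).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 30, 77]] = [1.0, -2.0, 1.5]
x_hat = forward_backward(A, A @ x_true, mu=0.1)
```

Despite having far fewer measurements than unknowns, the iteration identifies the support of the sparse signal; the shrinkage bias on the recovered magnitudes is the usual price of the l1 penalty.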

The variable metric forward-backward splitting algorithm under mild differentiability assumptions

S Salzo - SIAM Journal on Optimization, 2017 - SIAM
We study the variable metric forward-backward splitting algorithm for convex minimization
problems without the standard assumption of the Lipschitz continuity of the gradient. In this …

Proximal extrapolated gradient methods for variational inequalities

Y Malitsky - Optimization Methods and Software, 2018 - Taylor & Francis
The paper concerns novel first-order methods for monotone variational inequalities.
They use a very simple linesearch procedure that takes into account local information of …
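As background for this entry, the classical extragradient method of Korpelevich is the standard baseline that the proximal extrapolated gradient methods refine: it stabilizes the iteration on monotone operators by correcting with the operator evaluated at an extrapolated point. The sketch below is that classical baseline with a fixed stepsize, not the paper's linesearch variant; the rotation operator is an assumed toy example.

```python
import numpy as np

def extragradient(F, proj, x0, lam=0.5, iters=200):
    """Classical (Korpelevich) extragradient method for the variational
    inequality: find x* in C with <F(x*), x - x*> >= 0 for all x in C.
    Fixed stepsize lam < 1/L for L-Lipschitz monotone F; the cited paper's
    methods instead pick the stepsize by a simple linesearch using local
    information about F.
    """
    x = x0
    for _ in range(iters):
        y = proj(x - lam * F(x))   # extrapolation (predictor) step
        x = proj(x - lam * F(y))   # correction step at the extrapolated point
    return x

# Illustrative monotone operator with no potential: a rotation F(x) = M x.
# Plain projected gradient merely circles the solution x* = 0 here, while
# the extragradient correction contracts toward it (assumed toy example,
# with C = R^2, so the projection is the identity).
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
x_star = extragradient(lambda x: M @ x, lambda x: x, np.array([1.0, 1.0]))
```

The double evaluation of F per iteration is exactly the cost that linesearch- and extrapolation-based methods such as those in this paper aim to reduce.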