Modified conjugate gradient method for solving sparse recovery problem with nonconvex penalty

Z Aminifard, A Hosseini, S Babaie-Kafaki - Signal Processing, 2022 - Elsevier
Sparse recovery is a strategy for effectively reconstructing a signal by obtaining sparse
solutions of underdetermined linear systems. As an important feature of a signal, sparsity is …
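
For context, the problem behind this entry is recovering a sparse vector x from an underdetermined system Ax = b by minimizing a data-fidelity term plus a sparsity-inducing penalty. Below is a minimal numpy sketch of that setup; the entry's nonconvex penalty and modified conjugate gradient scheme are not reproduced here, the ℓ_1 penalty is only the standard convex stand-in, and all sizes and names are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, k = 20, 80, 5              # underdetermined: m < n, with k nonzeros in the truth
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true                   # observations

    lam = 0.1
    def objective(x):
        # penalized least squares: data fidelity + sparsity-inducing penalty (l1 here)
        return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

    print(objective(np.zeros(n)), objective(x_true))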

A new generalized shrinkage conjugate gradient method for sparse recovery

H Esmaeili, S Shabani, M Kimiaei - Calcolo, 2019 - Springer
In this paper, a new procedure, called generalized shrinkage conjugate gradient (GSCG), is
presented to solve the ℓ_1-regularized convex minimization problem. In GSCG, we …
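
The "shrinkage" in GSCG refers to a soft-thresholding-type operator; the generalized operator and the conjugate gradient direction used in the paper are not spelled out in the snippet, so the sketch below only shows the classical ℓ_1 shrinkage step it builds on, embedded in one plain proximal-gradient (ISTA) iteration with illustrative data.

    import numpy as np

    def shrink(z, t):
        # classical soft-thresholding: proximal operator of t * ||.||_1
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista_step(x, A, b, lam, step):
        # one proximal-gradient iteration for 0.5*||Ax - b||^2 + lam*||x||_1
        grad = A.T @ (A @ x - b)
        return shrink(x - step * grad, step * lam)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((10, 30))
    b = rng.standard_normal(10)
    x = np.zeros(30)
    for _ in range(100):
        x = ista_step(x, A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)
    print(np.count_nonzero(np.abs(x) > 1e-8))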

Exact Penalty Function for ℓ_{2,1} Norm Minimization over the Stiefel Manifold

N Xiao, X Liu, Y Yuan - SIAM Journal on Optimization, 2021 - SIAM
ℓ_{2,1} norm minimization with orthogonality constraints, which comprise a feasible region
called the Stiefel manifold, has wide applications in statistics and data science. The state-of …
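
As a reference for the notation, the ℓ_{2,1} norm of a matrix is the sum of the Euclidean norms of its rows, and the Stiefel manifold is the set of matrices with orthonormal columns. The sketch below computes the norm and a projection onto the manifold via the polar factor of the SVD; these are generic building blocks, not the exact-penalty method of the paper.

    import numpy as np

    def l21_norm(X):
        # sum of the Euclidean norms of the rows of X
        return np.sum(np.linalg.norm(X, axis=1))

    def project_stiefel(X):
        # nearest matrix with orthonormal columns: polar factor U @ Vt from the SVD
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ Vt

    X = np.random.default_rng(2).standard_normal((6, 3))
    Q = project_stiefel(X)
    print(l21_norm(X), np.allclose(Q.T @ Q, np.eye(3)))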

A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions

Z Aminifard, S Babaie-Kafaki - Computational and Applied Mathematics, 2023 - Springer
Based on the memoryless BFGS (Broyden–Fletcher–Goldfarb–Shanno) updating formula of
a recent well-structured diagonal approximation of the Hessian, we propose an improved …
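
The paper's memoryless BFGS-based diagonal approximation is not visible in the snippet; the sketch below only illustrates the generic step such a method takes for f(x) + λ||x||_1, where the proximal subproblem with a diagonal metric D = diag(d) separates coordinate-wise into soft-thresholding. The diagonal d here is a placeholder, not the authors' choice.

    import numpy as np

    def diag_scaled_prox_step(x, grad, d, lam):
        # argmin_u  grad.T (u - x) + 0.5*(u - x).T D (u - x) + lam*||u||_1,
        # which separates into coordinate-wise soft-thresholding with thresholds lam/d.
        z = x - grad / d
        return np.sign(z) * np.maximum(np.abs(z) - lam / d, 0.0)

    x = np.array([1.0, -2.0, 0.5])
    grad = np.array([0.3, -0.1, 0.8])
    d = np.array([2.0, 1.5, 4.0])        # placeholder positive diagonal approximation
    print(diag_scaled_prox_step(x, grad, d, lam=0.2))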

A novel low-rank matrix approximation algorithm for face denoising and background/foreground separation

J Zhao - Computational and Applied Mathematics, 2022 - Springer
Low-rank matrix recovery from an observation data matrix has received considerable
attention in recent years, which has a wide range of applications in pattern recognition and …
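
A recurring building block in low-rank matrix recovery is the proximal operator of the nuclear norm, i.e. singular value thresholding. The sketch below implements that generic operator only; it is not the specific low-rank approximation algorithm of this entry.

    import numpy as np

    def svt(M, tau):
        # singular value thresholding: prox of tau * nuclear norm at M
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    M = np.random.default_rng(3).standard_normal((8, 6))
    L = svt(M, tau=1.0)
    print(np.linalg.matrix_rank(L) <= np.linalg.matrix_rank(M))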

An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions

Z Aminifard, S Babaie-Kafaki - Optimization Methods and Software, 2023 - Taylor & Francis
Founded upon the scaled memoryless symmetric rank-one updating formula, we propose an
approximation of the Newton-type proximal strategy for minimizing the nonsmooth …
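
For reference, the (unscaled) memoryless symmetric rank-one update applies the SR1 formula to the identity using the most recent step s and gradient difference y. The sketch below computes that matrix with a customary skip safeguard; the scaling and the proximal machinery of the paper are not reproduced, and the tolerance is illustrative.

    import numpy as np

    def memoryless_sr1(s, y, tol=1e-8):
        # SR1 update of the identity: B = I + (y - s)(y - s)^T / ((y - s)^T s),
        # skipped (B = I) when the denominator is too small.
        n = s.size
        r = y - s                      # residual y - B0 s with B0 = I
        denom = r @ s
        if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
            return np.eye(n)
        return np.eye(n) + np.outer(r, r) / denom

    s = np.array([1.0, 0.0, -0.5])
    y = np.array([0.8, 0.2, -0.3])
    print(np.allclose(memoryless_sr1(s, y), memoryless_sr1(s, y).T))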

Low-rank and sparse matrices fitting algorithm for low-rank representation

J Zhao, L Zhao - Computers & Mathematics with Applications, 2020 - Elsevier
In the real world, especially in the field of pattern recognition, a matrix formed from images,
videos, speech sounds, and the like under certain conditions is usually subject to a low-rank …

Alternating direction and Taylor expansion minimization algorithms for unconstrained nuclear norm optimization

J Zhao, Q Feng, L Zhao - Numerical Algorithms, 2019 - Springer
In the past decade, robust principal component analysis (RPCA) and low-rank matrix
completion (LRMC), as two important optimization problems aimed at recovering …
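
A common unconstrained nuclear-norm model in this setting splits an observed matrix M into a low-rank part L and a sparse part S by minimizing ||L||_* + λ||S||_1 + (μ/2)||M - L - S||_F^2. The sketch below performs the two closed-form alternating updates for that generic model; it is not the paper's alternating direction / Taylor-expansion algorithm, and λ, μ are placeholder values.

    import numpy as np

    def svt(M, tau):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def shrink(Z, t):
        return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

    def alternating_pass(M, L, S, lam=0.1, mu=1.0):
        # minimize ||L||_* + lam*||S||_1 + (mu/2)*||M - L - S||_F^2 one block at a time;
        # both subproblems have closed-form proximal solutions.
        L = svt(M - S, 1.0 / mu)       # nuclear-norm prox
        S = shrink(M - L, lam / mu)    # l1 prox
        return L, S

    M = np.random.default_rng(4).standard_normal((10, 8))
    L, S = np.zeros_like(M), np.zeros_like(M)
    for _ in range(20):
        L, S = alternating_pass(M, L, S)
    print(np.linalg.norm(M - L - S))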

Adaptive stochastic gradient descent for large-scale learning problems

Z Yang, L Ma - 2022 - researchsquare.com
As an effective strategy to enhance stochastic optimization, determining an appropriate step-size
sequence for these algorithms has attracted considerable attention in solving …
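
The entry's specific adaptive rule is not visible in the snippet; the sketch below shows a generic per-coordinate adaptive step size of the AdaGrad type on a stochastic least-squares example, simply to illustrate what choosing a step-size sequence means here. Data, learning rate, and iteration count are illustrative.

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((1000, 20))
    x_true = rng.standard_normal(20)
    b = A @ x_true + 0.01 * rng.standard_normal(1000)

    x = np.zeros(20)
    accum = np.zeros(20)                 # running sum of squared gradients
    eta, eps = 0.5, 1e-8
    for t in range(2000):
        i = rng.integers(1000)           # sample one data point
        g = (A[i] @ x - b[i]) * A[i]     # stochastic gradient of 0.5*(a_i^T x - b_i)^2
        accum += g * g
        x -= eta / np.sqrt(accum + eps) * g   # AdaGrad-style per-coordinate step size
    print(np.linalg.norm(x - x_true))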

Nonmonotone variable metric Barzilai-Borwein method for composite minimization problem

X Guo, C Xu, Z Zhu, B Zhang - AIMS Mathematics, 2024 - aimspress.com
In this study, we develop a nonmonotone variable metric Barzilai-Borwein method for
minimizing the sum of a smooth function and a convex, possibly nondifferentiable, function …
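
For reference, the Barzilai-Borwein step size is computed from the last step s = x_k - x_{k-1} and gradient difference y = ∇f(x_k) - ∇f(x_{k-1}). The sketch below pairs the safeguarded BB1 step with a proximal step for an ℓ_1 term, one common instance of the composite setting the entry describes; the paper's nonmonotone line search and variable metric are not reproduced, and all problem data are illustrative.

    import numpy as np

    def bb1_step(s, y, lo=1e-6, hi=1e6):
        # Barzilai-Borwein step size alpha = s^T s / s^T y, safeguarded to [lo, hi]
        sy = s @ y
        if sy <= 0:
            return lo
        return float(np.clip((s @ s) / sy, lo, hi))

    def prox_l1(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    # minimize 0.5*||Ax - b||^2 + lam*||x||_1 with BB step sizes (monotonicity not enforced)
    rng = np.random.default_rng(6)
    A, b, lam = rng.standard_normal((15, 40)), rng.standard_normal(15), 0.1
    grad = lambda x: A.T @ (A @ x - b)
    x_old = np.zeros(40)
    x = prox_l1(x_old - 0.01 * grad(x_old), 0.01 * lam)   # small first step to get s != 0
    for _ in range(200):
        s, y = x - x_old, grad(x) - grad(x_old)
        alpha = bb1_step(s, y)
        x_old, x = x, prox_l1(x - alpha * grad(x), alpha * lam)
    print(0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x)))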