A survey of intelligent reflecting surfaces (IRSs): Towards 6G wireless communication networks

J Zhao - arXiv preprint arXiv:1907.04789, 2019 - arxiv.org
Intelligent reflecting surfaces (IRSs) tune wireless environments to increase spectrum and
energy efficiencies. In view of much recent attention to the IRS concept as a promising …

Distributed nonconvex constrained optimization over time-varying digraphs

G Scutari, Y Sun - Mathematical Programming, 2019 - Springer
This paper considers nonconvex distributed constrained optimization over networks,
modeled as directed (possibly time-varying) graphs. We introduce the first algorithmic …
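The directed-graph setting in this entry rules out the doubly stochastic averaging used on undirected networks; a standard building block in this literature (not the paper's full algorithm) is push-sum consensus, where each node tracks a value and a weight and their ratio converges to the network average. A minimal sketch on a hypothetical fixed 3-node directed cycle:

```python
import numpy as np

# Push-sum average consensus on a fixed directed graph: a common building
# block for optimization over digraphs, where the mixing matrix is only
# column-stochastic. out_neighbors[i] lists nodes that i sends to (self included).
out_neighbors = {0: [0, 1], 1: [1, 2], 2: [2, 0]}  # directed 3-cycle with self-loops
values = np.array([1.0, 5.0, 9.0])                 # initial local values, average 5.0

x = values.copy()          # running numerators
w = np.ones(3)             # running weights
for _ in range(100):
    x_new, w_new = np.zeros(3), np.zeros(3)
    for i, nbrs in out_neighbors.items():
        for j in nbrs:     # split mass equally among out-neighbors
            x_new[j] += x[i] / len(nbrs)
            w_new[j] += w[i] / len(nbrs)
    x, w = x_new, w_new

ratios = x / w             # each node's ratio converges to the average
```

The ratio correction is what compensates for the lack of doubly stochastic weights on a digraph; the sums of `x` and `w` are preserved at every step, so the limiting ratio is the true average.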

Convex optimization algorithms in medical image reconstruction—in the age of AI

J Xu, F Noo - Physics in Medicine & Biology, 2022 - iopscience.iop.org
The past decade has seen the rapid growth of model-based image reconstruction (MBIR)
algorithms, which are often applications or adaptations of convex optimization algorithms …

[BOOK][B] Modern nonconvex nondifferentiable optimization

Y Cui, JS Pang - 2021 - SIAM
Mathematical optimization has always been at the heart of engineering, statistics, and
economics. In these applied domains, optimization concepts and methods have often been …

Fast L1–L2 minimization via a proximal operator

Y Lou, M Yan - Journal of Scientific Computing, 2018 - Springer
This paper aims to develop new and fast algorithms for recovering a sparse vector from a
small number of measurements, which is a fundamental problem in the field of compressive …
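This paper's contribution is a closed-form proximal operator for the ℓ1−ℓ2 penalty itself; as a simpler, related building block, here is a sketch of classical soft-thresholding, the proximal operator of λ‖·‖1 that such sparse-recovery solvers are built on:

```python
import numpy as np

def soft_threshold(y, lam):
    """Proximal operator of lam * ||x||_1: shrinks each entry toward zero by lam."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([3.0, -0.5, 1.2])
out = soft_threshold(y, 1.0)   # entries with |y_i| <= lam are set to zero
```

Entries smaller than the threshold are zeroed and the rest are shrunk, which is exactly what makes proximal methods produce sparse iterates.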

A proximal difference-of-convex algorithm with extrapolation

B Wen, X Chen, TK Pong - Computational optimization and applications, 2018 - Springer
We consider a class of difference-of-convex (DC) optimization problems whose objective is
level-bounded and is the sum of a smooth convex function with Lipschitz gradient, a proper …
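The structure described above can be sketched concretely. The following is a minimal illustration (not the paper's exact scheme) of a proximal DC iteration with extrapolation for least squares plus λ(‖x‖1 − ‖x‖2), assuming the natural splitting — smooth part ½‖Ax−b‖², convex nonsmooth part λ‖x‖1, concave part −λ‖x‖2 — and a fixed extrapolation weight for simplicity; the problem data are synthetic:

```python
import numpy as np

def soft(y, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def pdca_e(A, b, lam, iters=500, beta=0.5):
    """Sketch of a proximal DC iteration with extrapolation for
    min 0.5*||Ax - b||^2 + lam*||x||_1 - lam*||x||_2."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(A.shape[1])
    for _ in range(iters):
        y = x + beta * (x - x_prev)            # extrapolation step
        # subgradient of the concave part h(x) = lam*||x||_2 at the current iterate
        nx = np.linalg.norm(x)
        xi = lam * x / nx if nx > 0 else np.zeros_like(x)
        grad = A.T @ (A @ y - b) - xi          # gradient of smooth part minus subgradient of h
        x_prev, x = x, soft(y - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
x_true = np.zeros(40)
x_true[[3, 17]] = [1.0, -2.0]
b = A @ x_true
x_hat = pdca_e(A, b, lam=0.1)
```

Each iteration linearizes only the concave term at the current point and takes a proximal gradient step from the extrapolated point, so the per-iteration cost matches FISTA on the convex part.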

Transformed ℓ1 regularization for learning sparse deep neural networks

R Ma, J Miao, L Niu, P Zhang - Neural Networks, 2019 - Elsevier
Deep Neural Networks (DNNs) have achieved extraordinary success in numerous
areas. However, DNNs often carry a large number of weight parameters, leading to the …
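The transformed ℓ1 penalty named in the title has a simple closed form, ρ_a(x) = (a+1)|x|/(a+|x|), which interpolates between an ℓ0-like count (small a) and the ℓ1 norm (large a); a minimal sketch of the penalty and its two limits:

```python
import numpy as np

def transformed_l1(x, a):
    """Transformed l1 penalty rho_a(x) = (a+1)|x| / (a + |x|), applied entrywise."""
    ax = np.abs(x)
    return (a + 1.0) * ax / (a + ax)

x = np.array([0.0, 0.5, 2.0])
l0_like = transformed_l1(x, 1e-6).sum()   # small a: approaches the count of nonzeros
l1_like = transformed_l1(x, 1e6)          # large a: approaches |x| entrywise
```

The single parameter `a` thus trades off how aggressively small weights are driven to zero, which is what makes the penalty attractive for pruning network weights.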

A scale-invariant approach for sparse signal recovery

Y Rahimi, C Wang, H Dong, Y Lou - SIAM Journal on Scientific Computing, 2019 - SIAM
In this paper, we study the ratio of the L_1 and L_2 norms, denoted as L_1/L_2, to promote
sparsity. Due to the nonconvexity and nonlinearity, there has been little attention to this scale …
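Scale invariance here means the ratio ‖x‖1/‖x‖2 is unchanged when x is rescaled, unlike the ℓ1 norm alone, so it measures the sparsity pattern rather than the magnitude; a quick numerical check:

```python
import numpy as np

def l1_over_l2(x):
    """Scale-invariant sparsity measure ||x||_1 / ||x||_2 (in [1, sqrt(n)])."""
    return np.linalg.norm(x, 1) / np.linalg.norm(x, 2)

x = np.array([1.0, -2.0, 0.0, 4.0])
r1 = l1_over_l2(x)
r2 = l1_over_l2(100.0 * x)   # rescaling leaves the ratio unchanged
```

The ratio equals 1 for a 1-sparse vector and grows toward √n as the support fills in, which is why minimizing it promotes sparsity.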

Difference-of-convex learning: directional stationarity, optimality, and sparsity

M Ahn, JS Pang, J Xin - SIAM Journal on Optimization, 2017 - SIAM
This paper studies a fundamental bicriteria optimization problem for variable selection in
statistical learning; the two criteria are a loss/residual function and a model control (also …

Parallel and distributed successive convex approximation methods for big-data optimization

A Nedić, JS Pang, G Scutari, Y Sun… - Multi-Agent Optimization …, 2018 - Springer
Recent years have witnessed a surge of interest in parallel and distributed optimization
methods for large-scale systems. In particular, nonconvex large-scale optimization problems …