An overview of low-rank matrix recovery from incomplete observations

MA Davenport, J Romberg - IEEE Journal of Selected Topics in …, 2016 - ieeexplore.ieee.org
Low-rank matrices play a fundamental role in modeling and computational methods for
signal processing and machine learning. In many applications where low-rank matrices …
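For orientation, the recovery problem this survey treats is usually posed as affine rank minimization and relaxed to a nuclear-norm program; the notation below (linear measurement map \mathcal{A}, nuclear norm \|\cdot\|_*) is the standard one rather than a quotation from the paper.

```latex
% Recover a low-rank X_0 from incomplete linear observations y = \mathcal{A}(X_0).
% The intractable rank objective is relaxed to the nuclear norm (sum of singular values).
\min_{X}\; \operatorname{rank}(X) \quad \text{s.t.}\quad \mathcal{A}(X) = y
\qquad \longrightarrow \qquad
\min_{X}\; \|X\|_{*} \quad \text{s.t.}\quad \mathcal{A}(X) = y .
```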

Complete dictionary recovery over the sphere I: Overview and the geometric picture

J Sun, Q Qu, J Wright - IEEE Transactions on Information …, 2016 - ieeexplore.ieee.org
We consider the problem of recovering a complete (i.e., square and invertible) matrix A_0 from Y ∈ R^{n×p} with Y = A_0 X_0, provided X_0 is sufficiently sparse. This recovery problem is …
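A minimal sketch of the sphere-constrained formulation behind this line of work: with Y = A_0 X_0 and X_0 sparse, one row of X_0 (equivalently one column of A_0) can be sought by minimizing a smooth sparsity surrogate of q^T Y over the unit sphere. The surrogate, step size, and problem sizes below are illustrative assumptions, not the paper's settings.

```python
# Sphere-constrained dictionary recovery sketch: Riemannian gradient descent on
# a smooth l1 surrogate of q^T Y over the unit sphere.
import numpy as np

rng = np.random.default_rng(0)
n, p, theta = 20, 5000, 0.1                        # dimensions and sparsity level (assumed)
A0 = np.linalg.qr(rng.standard_normal((n, n)))[0]  # a complete (orthogonal) dictionary
X0 = rng.standard_normal((n, p)) * (rng.random((n, p)) < theta)  # Bernoulli-Gaussian sparse coefficients
Y = A0 @ X0

mu, step = 0.1, 0.05
q = rng.standard_normal(n)
q /= np.linalg.norm(q)
for _ in range(500):
    z = q @ Y                                      # correlations with all samples
    grad = Y @ np.tanh(z / mu) / p                 # gradient of (1/p) * sum(mu * log cosh(z / mu))
    grad -= (grad @ q) * q                         # project onto the tangent space of the sphere
    q -= step * grad
    q /= np.linalg.norm(q)                         # retract back to the sphere

# q should align with one column of A0 (up to sign); q^T Y then recovers a row of X0.
print(np.max(np.abs(A0.T @ q)))                    # close to 1 if a dictionary column was found
```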

Randomized numerical linear algebra: Foundations and algorithms

PG Martinsson, JA Tropp - Acta Numerica, 2020 - cambridge.org
This survey describes probabilistic algorithms for linear algebraic computations, such as
factorizing matrices and solving linear systems. It focuses on techniques that have a proven …
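One workhorse from this literature, sketched under common default choices (Gaussian test matrix, small oversampling, a couple of power iterations): the randomized range finder followed by an SVD of the projected matrix, often called the randomized SVD.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, n_power_iters=2, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))  # Gaussian test matrix
    Y = A @ Omega                                        # sample the range of A
    # Power iterations sharpen slowly decaying spectra (production codes
    # re-orthonormalize between passes for numerical stability).
    for _ in range(n_power_iters):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the approximate range
    B = Q.T @ A                          # small (rank + oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]

# Quick check on a synthetic low-rank matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))
U, s, Vt = randomized_svd(A, rank=20)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))   # should be tiny
```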

Direction-of-arrival estimation for coprime array via virtual array interpolation

C Zhou, Y Gu, X Fan, Z Shi, G Mao… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
Coprime arrays can achieve an increased number of degrees of freedom by deriving the
equivalent signals of a virtual array. However, most existing methods fail to utilize all …
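The degrees-of-freedom claim can be illustrated without the paper's interpolation machinery: a pair of coprime subarrays produces a difference coarray (set of virtual lags) much larger than the number of physical sensors. The coprime pair (M, N) and the standard two-subarray layout below are assumed for illustration.

```python
import numpy as np

M, N = 3, 5                                   # coprime integers (assumed example)
subarray1 = np.arange(N) * M                  # N sensors spaced M units apart
subarray2 = np.arange(2 * M) * N              # 2M sensors spaced N units apart
physical = np.unique(np.concatenate([subarray1, subarray2]))

# Difference coarray: every pairwise difference acts as a virtual sensor position.
lags = np.unique(physical[:, None] - physical[None, :])
print(len(physical), "physical sensors ->", len(lags), "virtual lags")
```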

Scalable methods for 8-bit training of neural networks

R Banner, I Hubara, E Hoffer… - Advances in neural …, 2018 - proceedings.neurips.cc
Quantized Neural Networks (QNNs) are often used to improve network efficiency during the inference phase, i.e., after the network has been trained. Extensive research in the …
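For context, the basic primitive behind 8-bit schemes is uniform quantization of tensors to int8. The per-tensor symmetric scaling below is a simple assumed variant, not the paper's exact training recipe.

```python
import numpy as np

def quantize_int8(x):
    scale = np.max(np.abs(x)) / 127.0 + 1e-12      # map the dynamic range onto [-127, 127]
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).standard_normal((256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"mean absolute quantization error: {err:.5f}")
```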

Atomo: Communication-efficient learning via atomic sparsification

H Wang, S Sievert, S Liu, Z Charles… - Advances in neural …, 2018 - proceedings.neurips.cc
Distributed model training suffers from communication overheads due to frequent gradient
updates transmitted between compute nodes. To mitigate these overheads, several studies …
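A minimal sketch of unbiased atomic sparsification in the spirit of this abstract, with the SVD as the atomic decomposition: each rank-1 atom is kept with some probability and rescaled so the compressed gradient remains unbiased. The probability rule here is a simple assumption rather than the paper's optimized allocation.

```python
import numpy as np

def sparsify_svd(grad, budget, rng):
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    probs = np.minimum(1.0, budget * s / s.sum())       # spend the budget on large atoms first
    keep = rng.random(len(s)) < probs
    # Rescaling kept atoms by 1/p_i makes the estimator unbiased: E[output] = grad.
    scaled = np.where(keep, s / np.maximum(probs, 1e-12), 0.0)
    return (U * scaled) @ Vt, int(keep.sum())

rng = np.random.default_rng(0)
G = rng.standard_normal((128, 64))
G_hat, n_atoms = sparsify_svd(G, budget=8, rng=rng)
print(n_atoms, "atoms kept; relative error:",
      np.linalg.norm(G - G_hat) / np.linalg.norm(G))
```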

Learning with differentiable perturbed optimizers

Q Berthet, M Blondel, O Teboul… - Advances in neural …, 2020 - proceedings.neurips.cc
Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete …
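The perturbation idea can be sketched for the simplest discrete decision, an argmax: averaging argmax solutions of randomly perturbed scores yields a smooth output, F_eps(theta) = E[argmax_y <y, theta + eps Z>]. Only this forward smoothing is shown; the noise distribution, epsilon, and sample count are illustrative choices.

```python
import numpy as np

def perturbed_one_hot_argmax(theta, eps=0.5, n_samples=1000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_samples, theta.size))
    idx = np.argmax(theta + eps * z, axis=1)     # argmax of each perturbed score vector
    one_hot = np.eye(theta.size)[idx]
    return one_hot.mean(axis=0)                  # Monte Carlo estimate of the smoothed argmax

theta = np.array([1.0, 1.2, 0.3])
print(perturbed_one_hot_argmax(theta))           # soft weights, largest on index 1
```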

An introduction to matrix concentration inequalities

JA Tropp - Foundations and Trends® in Machine Learning, 2015 - nowpublishers.com
Random matrices now play a role in many areas of theoretical, applied, and computational
mathematics. Therefore, it is desirable to have tools for studying random matrices that are …
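One representative tail bound of the type developed in this monograph is the matrix Bernstein inequality, reproduced here in its standard form for orientation:

```latex
% Matrix Bernstein: X_1, ..., X_k independent, zero-mean, d x d Hermitian,
% with \|X_j\| \le L almost surely and variance proxy \sigma^2 = \|\sum_j \mathbb{E} X_j^2\|.
\Pr\Big\{ \lambda_{\max}\Big(\sum_{j} X_j\Big) \ge t \Big\}
\;\le\; d \cdot \exp\!\left( \frac{-t^2/2}{\sigma^2 + L t / 3} \right),
\qquad t \ge 0 .
```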

Image reconstruction: From sparsity to data-adaptive methods and machine learning

S Ravishankar, JC Ye, JA Fessler - Proceedings of the IEEE, 2019 - ieeexplore.ieee.org
The field of medical image reconstruction has seen roughly four types of methods. The first
type tended to be analytical methods, such as filtered backprojection (FBP) for X-ray …

[Book] An invitation to compressive sensing

S Foucart, H Rauhut - 2013 - Springer
This first chapter formulates the objectives of compressive sensing. It introduces the standard compressive sensing problem studied throughout the book and reveals its ubiquity in many …
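The standard problem alluded to here is recovering a sparse x from underdetermined measurements y = Ax. Below is a minimal sketch of one classical recovery routine, iterative soft-thresholding for the LASSO; the step size, regularization weight, and synthetic instance are ad hoc choices.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iters=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L, with L the gradient's Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        g = A.T @ (A @ x - y)                         # gradient of the smooth data-fit term
        x = x - step * g
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft-threshold (prox of lam*||.||_1)
    return x

# Synthetic instance: m = 80 random measurements of a 10-sparse vector in dimension n = 256.
rng = np.random.default_rng(0)
m, n, k = 80, 256, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```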