Deep learning based object detection for resource-constrained devices: Systematic review, future trends and challenges ahead

V Kamath, A Renuka - Neurocomputing, 2023 - Elsevier
Deep learning models are being widely employed for object detection due to their high
performance. However, the majority of applications that require object detection are …

Transformers learn shortcuts to automata

B Liu, JT Ash, S Goel, A Krishnamurthy… - arXiv preprint arXiv …, 2022 - arxiv.org
Algorithmic reasoning requires capabilities which are most naturally understood through
recurrent models of computation, like the Turing machine. However, Transformer models …

Provable guarantees for neural networks via gradient feature learning

Z Shi, J Wei, Y Liang - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Neural networks have achieved remarkable empirical performance, while the current
theoretical analysis is not adequate for understanding their success, e.g., the Neural Tangent …

Provable guarantees for nonlinear feature learning in three-layer neural networks

E Nichani, A Damian, JD Lee - Advances in Neural …, 2024 - proceedings.neurips.cc
One of the central questions in the theory of deep learning is to understand how neural
networks learn hierarchical features. The ability of deep networks to extract salient features …

On the approximation power of two-layer networks of random ReLUs

D Hsu, CH Sanford, R Servedio… - … on Learning Theory, 2021 - proceedings.mlr.press
This paper considers the following question: how well can depth-two ReLU networks with
randomly initialized bottom-level weights represent smooth functions? We give near …
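
As a hedged illustration of the setting named in this title (random bottom-level weights, trained top layer), the sketch below fits only the linear output weights of a depth-two ReLU network over frozen random features, via ridge-regularized least squares on a smooth target. The width, target function, and regularization strength are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, n = 2, 200, 500           # input dim, hidden width, sample count (assumed)

# Random, frozen bottom-level weights and biases
W = rng.normal(size=(m, d))
b = rng.normal(size=m)

def features(X):
    # ReLU feature map: phi(x) = max(0, Wx + b)
    return np.maximum(0.0, X @ W.T + b)

# A smooth target, e.g. f(x) = sin(x_0) * cos(x_1), sampled on [-1, 1]^2
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(X[:, 0]) * np.cos(X[:, 1])

# Fit ONLY the top-layer weights (closed-form ridge regression)
Phi = features(X)
a = np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(m), Phi.T @ y)

mse = np.mean((Phi @ a - y) ** 2)   # training error of the random-features fit
```

With a couple hundred random ReLU features, the training error on such a smooth target is typically tiny, which is the qualitative phenomenon the paper analyzes quantitatively.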

Mean-field inference methods for neural networks

M Gabrié - Journal of Physics A: Mathematical and Theoretical, 2020 - iopscience.iop.org
Machine learning algorithms relying on deep neural networks recently allowed a
great leap forward in artificial intelligence. Despite the popularity of their applications, the …

Optimization-based separations for neural networks

I Safran, J Lee - Conference on Learning Theory, 2022 - proceedings.mlr.press
Depth separation results propose a possible theoretical explanation for the benefits of deep
neural networks over shallower architectures, establishing that the former possess superior …

Merits and Demerits of Machine Learning of Ferroelectric, Flexoelectric, and Electrolytic Properties of Ceramic Materials

K Yasui - Materials, 2024 - mdpi.com
In the present review, the merits and demerits of machine learning (ML) in materials science
are discussed, compared with first principles calculations (PDE (partial differential …

Width is less important than depth in ReLU neural networks

G Vardi, G Yehudai, O Shamir - Conference on learning …, 2022 - proceedings.mlr.press
We solve an open question from Lu et al. (2017) by showing that any target network with
inputs in $\mathbb{R}^d$ can be approximated by a width-$O(d)$ network (independent …

On the optimal memorization power of ReLU neural networks

G Vardi, G Yehudai, O Shamir - arXiv preprint arXiv:2110.03187, 2021 - arxiv.org
We study the memorization power of feedforward ReLU neural networks. We show that such
networks can memorize any $N$ points that satisfy a mild separability assumption using …
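
As a hedged illustration of what "memorizing $N$ points" means, the sketch below uses the classical one-hidden-layer construction for 1-D inputs: a ReLU network with a unit per breakpoint realizes the piecewise-linear interpolant through $N$ distinct points, hitting every label exactly. This is a textbook construction, not the paper's parameter-efficient one (which handles high-dimensional, separated points with far fewer units).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10
x = np.sort(rng.uniform(-3, 3, N))       # distinct 1-D inputs (mild separation)
y = rng.normal(size=N)                   # arbitrary real labels

# Slopes of the piecewise-linear interpolant between consecutive points;
# slope 0 to the left of x[0].
s = np.diff(y) / np.diff(x)
a = np.diff(np.concatenate(([0.0], s)))  # output weight of each hidden ReLU unit

def net(t):
    # One-hidden-layer ReLU network: y[0] + sum_i a_i * relu(t - x_i),
    # with one hidden unit per interior breakpoint x_0, ..., x_{N-2}.
    return y[0] + np.maximum(0.0, t - x[:-1]) @ a

preds = np.array([net(t) for t in x])    # exact fit at all N points
```

Each ReLU unit changes the slope at its breakpoint by `a[i]`, so the telescoping sum reproduces every label exactly; the paper's contribution is achieving this with near-optimally few parameters in general dimension.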