Adversarial robustness of neural networks from the perspective of Lipschitz calculus: A survey

MM Zühlke, D Kudenko - ACM Computing Surveys, 2024 - dl.acm.org
We survey the adversarial robustness of neural networks from the perspective of Lipschitz
calculus in a unifying fashion by expressing models, attacks and safety guarantees, that is, a …
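
The Lipschitz viewpoint of this survey rests on a simple inequality: an upper bound on a network's Lipschitz constant turns a logit margin into a certified perturbation radius. Below is a minimal numpy sketch of that argument, assuming the common margin / (sqrt(2) * L) convention for l2 multi-class certificates; the function names and toy shapes are illustrative, not taken from the paper.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    # Product of per-layer spectral norms upper-bounds the Lipschitz constant
    # of a composition of affine layers with 1-Lipschitz activations (e.g. ReLU).
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

def certified_l2_radius(logits, lipschitz_const):
    # Margin-based certificate: if the top-1 margin is m and the network is
    # L-Lipschitz in l2, no perturbation with ||delta||_2 < m / (sqrt(2) * L)
    # can change the predicted class.
    top, runner_up = np.sort(logits)[::-1][:2]
    return (top - runner_up) / (np.sqrt(2.0) * lipschitz_const)

# Toy usage with two random affine layers and a 3-class logit vector.
rng = np.random.default_rng(0)
Ws = [rng.normal(size=(16, 8)), rng.normal(size=(3, 16))]
print(certified_l2_radius(np.array([2.1, 0.4, -1.0]), lipschitz_upper_bound(Ws)))
```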

Regularization of polynomial networks for image recognition

GG Chrysos, B Wang, J Deng… - Proceedings of the …, 2023 - openaccess.thecvf.com
Deep Neural Networks (DNNs) have obtained impressive performance across
tasks; however, they still remain black boxes, e.g., hard to analyze theoretically. At the …

Extrapolation and spectral bias of neural nets with Hadamard product: a polynomial net study

Y Wu, Z Zhu, F Liu, G Chrysos… - Advances in neural …, 2022 - proceedings.neurips.cc
The neural tangent kernel (NTK) is a powerful tool for analyzing the training dynamics of neural
networks and their generalization bounds. The study of the NTK has been devoted to typical …
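
To make "neural nets with Hadamard product" concrete: polynomial networks of this kind build high-degree interactions of the input through elementwise products of linear projections. The numpy sketch below shows one common CCP-style recursion as an illustration; the exact parametrization analyzed in the paper may differ, and all names and shapes here are assumptions.

```python
import numpy as np

def ccp_polynomial_net(z, U_list, C, beta):
    # Degree-N polynomial of the input z, with N = len(U_list).
    # Recursion: x_1 = U_1 z;  x_n = (U_n z) * x_{n-1} + x_{n-1}  (elementwise *),
    # so x_N contains monomials of z up to degree N; the output is affine in x_N.
    x = U_list[0] @ z
    for U in U_list[1:]:
        x = (U @ z) * x + x
    return C @ x + beta

# Toy usage: input dim 4, hidden dim 8, degree 3, scalar output.
rng = np.random.default_rng(1)
z = rng.normal(size=4)
U_list = [rng.normal(size=(8, 4)) for _ in range(3)]
C, beta = rng.normal(size=(1, 8)), rng.normal(size=1)
print(ccp_polynomial_net(z, U_list, C, beta))
```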

Pay attention to your loss: understanding misconceptions about Lipschitz neural networks

L Béthune, T Boissin, M Serrurier… - Advances in …, 2022 - proceedings.neurips.cc
Lipschitz-constrained networks have gathered considerable attention in the deep learning
community, with usages ranging from Wasserstein distance estimation to the training of …
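
As background on what "Lipschitz-constrained" means in practice, the sketch below shows the most common way the constraint is enforced on a dense layer: rescaling the weight by an estimate of its spectral norm. This is a generic illustration of a 1-Lipschitz layer, not the specific construction or training scheme studied in the paper.

```python
import numpy as np

def spectral_norm(W, n_iters=50):
    # Power iteration to estimate the largest singular value of W.
    v = np.random.default_rng(0).normal(size=W.shape[1])
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ (W @ v))

def constrained_dense(W, b, x):
    # Rescale the weight so its spectral norm is at most 1, making the affine
    # map 1-Lipschitz in l2 (the bias does not affect the Lipschitz constant).
    return (W / max(spectral_norm(W), 1.0)) @ x + b

# Toy usage.
rng = np.random.default_rng(1)
W, b, x = rng.normal(size=(5, 3)), rng.normal(size=5), rng.normal(size=3)
print(constrained_dense(W, b, x))
```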

Sound and complete verification of polynomial networks

E Abad Rocamora, MF Sahin, F Liu… - Advances in …, 2022 - proceedings.neurips.cc
Polynomial Networks (PNs) have recently demonstrated promising performance on face and
image recognition. However, the robustness of PNs is unclear, and thus obtaining …

Random Polynomial Neural Networks: Analysis and Design

W Huang, Y Xiao, SK Oh, W Pedrycz… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In this article, we propose the concept of random polynomial neural networks (RPNNs),
realized using the architecture of polynomial neural networks (PNNs) with random …

Tensor methods in deep learning

Y Panagakis, J Kossaifi, GG Chrysos, J Oldfield… - Signal Processing and …, 2024 - Elsevier
Tensors are multidimensional arrays that can naturally represent data and mappings of
multiple dimensions, playing a central role in modern deep learning. Indeed, the basic …
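
Two operations recur throughout tensor methods for deep learning: unfolding a tensor along a mode and contracting it with a matrix along that mode. A minimal numpy sketch follows; the function names are illustrative and not the chapter's notation.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_n_product(T, M, mode):
    # Multiply tensor T by matrix M along axis `mode` (contract that axis).
    out = np.tensordot(M, T, axes=(1, mode))  # contracted axis appears first
    return np.moveaxis(out, 0, mode)          # move it back into place

# Toy usage: a 3x4x5 tensor contracted with a 2x4 matrix along mode 1.
T = np.arange(60).reshape(3, 4, 5).astype(float)
M = np.ones((2, 4))
print(unfold(T, 1).shape)             # (4, 15)
print(mode_n_product(T, M, 1).shape)  # (3, 2, 5)
```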

Architecture Design: From Neural Networks to Foundation Models

G Chrysos - 2024 IEEE 11th International Conference on Data …, 2024 - ieeexplore.ieee.org
Historically, we are taught to use task-dependent architecture design and objectives to
tackle data science tasks. Counterintuitively, this dogma has been proven (partly) wrong by …

On the study of sample complexity for polynomial neural networks

C Pan, C Zhang - arXiv preprint arXiv:2207.08896, 2022 - arxiv.org
As a general type of machine learning approach, artificial neural networks have established
state-of-the-art benchmarks in many pattern recognition and data analysis tasks. Among various …

Signature Estimation and Signal Recovery Using Median of Means

S Chrétien, R Vaucher - … Conference on Geometric Science of Information, 2023 - Springer
The theory of Signatures is a fast-growing field that has demonstrated wide applicability
to a large range of domains, from finance to health monitoring. Computing signatures often …
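
The estimator named in the title is simple to state: partition the sample into disjoint blocks, average within each block, and return the median of the block means, which is robust to heavy tails and a small fraction of outliers. A minimal sketch follows; the block count and the randomized block assignment are illustrative choices, not the paper's specific procedure.

```python
import numpy as np

def median_of_means(x, n_blocks):
    # Split the sample into n_blocks near-equal blocks, average each block,
    # and return the median of the block means.
    x = np.asarray(x, dtype=float)
    perm = np.random.default_rng(0).permutation(len(x))
    blocks = np.array_split(x[perm], n_blocks)
    return float(np.median([b.mean() for b in blocks]))

# Toy usage: a contaminated sample where the plain mean is easily pulled away.
rng = np.random.default_rng(42)
sample = np.concatenate([rng.normal(0.0, 1.0, 990), rng.normal(50.0, 1.0, 10)])
print(sample.mean(), median_of_means(sample, n_blocks=20))
```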