Spectral representations for convolutional neural networks

O Rippel, J Snoek, RP Adams - Advances in neural …, 2015 - proceedings.neurips.cc
The discrete Fourier transform provides a significant speedup in the computation of convolutions
in deep learning. In this work, we demonstrate that, beyond its advantages for efficient …
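The speedup this snippet refers to comes from the convolution theorem: a circular convolution, O(n²) when computed directly, becomes a pointwise product of DFTs at O(n log n). A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def circular_conv_direct(x, k):
    # Direct O(n^2) circular convolution: y[i] = sum_j x[(i-j) mod n] * k[j].
    n = len(x)
    return np.array([sum(x[(i - j) % n] * k[j] for j in range(n))
                     for i in range(n)])

def circular_conv_fft(x, k):
    # Convolution theorem: circular convolution is a pointwise product
    # of DFTs, computed in O(n log n) via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

x = np.random.rand(8)
k = np.random.rand(8)
assert np.allclose(circular_conv_direct(x, k), circular_conv_fft(x, k))
```

For the linear (non-circular) convolutions used in CNN layers, both inputs are zero-padded to the full output length before applying the same identity.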

Spectral-based convolutional neural network without multiple spatial-frequency domain switchings

SO Ayat, M Khalil-Hani, AAH Ab Rahman, H Abdellatef - Neurocomputing, 2019 - Elsevier
Recent research has shown that spectral representation provides a significant speed-up
in the massive computation workload of convolution operations in the inference (feed …

DCFNet: Deep neural network with decomposed convolutional filters

Q Qiu, X Cheng, G Sapiro - International Conference on …, 2018 - proceedings.mlr.press
Filters in a Convolutional Neural Network (CNN) contain model parameters learned
from enormous amounts of data. In this paper, we propose to decompose convolutional …

Very efficient training of convolutional neural networks using fast Fourier transform and overlap-and-add

T Highlander, A Rodriguez - arXiv preprint arXiv:1601.06815, 2016 - arxiv.org
Convolutional neural networks (CNNs) are currently state-of-the-art for various classification
tasks, but are computationally expensive. Propagating through the convolutional layers is …
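The overlap-and-add method in this title splits a long input into blocks, convolves each block with the filter via a short FFT, and sums the overlapping tails, so the FFT size tracks the filter length rather than the input length. A hedged sketch under that standard formulation (parameter names are assumptions, not from the paper):

```python
import numpy as np

def overlap_add_conv(x, h, block=64):
    # Linear convolution of a long signal x with a short filter h,
    # computed block-by-block via FFT (overlap-and-add).
    n = block + len(h) - 1              # linear-convolution length per block
    nfft = 1 << (n - 1).bit_length()    # next power of two for the FFT
    H = np.fft.rfft(h, nfft)            # filter spectrum, computed once
    y = np.zeros(len(x) + len(h) - 1)
    for start in range(0, len(x), block):
        seg = x[start:start + block]
        # Zero-padded FFT of the block makes the circular product linear.
        yb = np.fft.irfft(np.fft.rfft(seg, nfft) * H, nfft)
        yb = yb[:len(seg) + len(h) - 1]
        y[start:start + len(yb)] += yb  # add the overlapping tail
    return y

x = np.random.rand(1000)
h = np.random.rand(17)
assert np.allclose(overlap_add_conv(x, h), np.convolve(x, h))
```

Because each block's FFT is sized to `block + len(h) - 1` instead of the whole input, the cost per output sample stays low even for very long inputs.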

Understanding convolutional neural networks

J Koushik - arXiv preprint arXiv:1605.09081, 2016 - arxiv.org
Convolutional Neural Networks (CNNs) exhibit extraordinary performance on a variety of
machine learning tasks. However, their mathematical properties and behavior are quite …

Band-limited training and inference for convolutional neural networks

A Dziedzic, J Paparrizos, S Krishnan… - International …, 2019 - proceedings.mlr.press
The convolutional layers are core building blocks of neural network architectures. In general,
a convolutional filter applies to the entire frequency spectrum of the input data. We explore …
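Band-limiting, as described in this snippet, constrains computation to a subset of the frequency spectrum, discarding high-frequency coefficients to trade accuracy for compute and memory. A minimal illustrative sketch (the function and its `frac` parameter are hypothetical, not the paper's API):

```python
import numpy as np

def band_limit(x, frac=0.5):
    # Keep only the lowest `frac` of the real-FFT coefficients of x,
    # zeroing the high-frequency band, then transform back.
    X = np.fft.rfft(x)
    keep = int(np.ceil(len(X) * frac))
    X[keep:] = 0.0                      # discard the high-frequency band
    return np.fft.irfft(X, len(x))

x = np.random.rand(128)
y = band_limit(x, frac=0.25)
assert y.shape == x.shape               # same length, smoother content
```

In a band-limited convolution, the pointwise spectral product would be restricted to the retained coefficients, so both the multiply count and the stored filter spectrum shrink by the same fraction.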

Fast Fourier convolution

L Chi, B Jiang, Y Mu - Advances in Neural Information …, 2020 - proceedings.neurips.cc
Vanilla convolutions in modern deep networks are known to operate locally and at a fixed
scale (e.g., the widely adopted 3×3 kernels in image-oriented tasks). This causes low efficacy …

Understanding training and generalization in deep learning by Fourier analysis

ZJ Xu - arXiv preprint arXiv:1808.04295, 2018 - arxiv.org
Background: It is still an open research area to theoretically understand why Deep Neural
Networks (DNNs)---equipped with many more parameters than training data and trained by …

A fine-grained spectral perspective on neural networks

G Yang, H Salman - arXiv preprint arXiv:1907.10599, 2019 - arxiv.org
Are neural networks biased toward simple functions? Does depth always help learn more
complex features? Is training the last layer of a network as good as training all layers? How …

Convergent learning: Do different neural networks learn the same representations?

Y Li, J Yosinski, J Clune, H Lipson… - arXiv preprint arXiv …, 2015 - arxiv.org
Recent success in training deep neural networks has prompted active investigation into the
features learned on their intermediate layers. Such research is difficult because it requires …