Spatial functa: Scaling functa to ImageNet classification and generation

M Bauer, E Dupont, A Brock, D Rosenbaum… - arXiv preprint arXiv …, 2023 - arxiv.org
Neural fields, also known as implicit neural representations, have emerged as a powerful
means to represent complex signals of various modalities. Based on this, Dupont et al. (2022) …

How to Train Neural Field Representations: A Comprehensive Study and Benchmark

S Papa, R Valperga, D Knigge… - Proceedings of the …, 2024 - openaccess.thecvf.com
Neural fields (NeFs) have recently emerged as a versatile method for modeling signals of
various modalities, including images, shapes, and scenes. Subsequently, a number of works …

Wavelet Convolutions for Large Receptive Fields

SE Finder, R Amoyal, E Treister, O Freifeld - arXiv preprint arXiv …, 2024 - arxiv.org
In recent years, there have been attempts to increase the kernel size of Convolutional
Neural Nets (CNNs) to mimic the global receptive field of Vision Transformers' (ViTs) self …

Streamable neural fields

J Cho, S Nam, D Rho, JH Ko, E Park - European Conference on Computer …, 2022 - Springer
Neural fields have emerged as a new data representation paradigm and have shown
remarkable success in various signal representations. Since they preserve signals in their …

HyperDiffusion: Generating implicit neural fields with weight-space diffusion

Z Erkoç, F Ma, Q Shan, M Nießner… - Proceedings of the …, 2023 - openaccess.thecvf.com
Implicit neural fields, typically encoded by a multilayer perceptron (MLP) that maps from
coordinates (e.g., xyz) to signals (e.g., signed distances), have shown remarkable promise as …

Neural Fields as Distributions: Signal Processing Beyond Euclidean Space

D Rebain, S Yazdani, KM Yi… - Proceedings of the …, 2024 - openaccess.thecvf.com
Neural fields have emerged as a powerful and broadly applicable method for representing
signals. However, in contrast to classical discrete digital signal processing, the portfolio of …

A downsampled variant of ImageNet as an alternative to the CIFAR datasets

P Chrabaszcz, I Loshchilov, F Hutter - arXiv preprint arXiv:1707.08819, 2017 - arxiv.org
The original ImageNet dataset is a popular large-scale benchmark for training Deep Neural
Networks. Since the cost of performing experiments (e.g., algorithm design, architecture …

KDEformer: Accelerating transformers via kernel density estimation

A Zandieh, I Han, M Daliri… - … Conference on Machine …, 2023 - proceedings.mlr.press
The dot-product attention mechanism plays a crucial role in modern deep architectures (e.g.,
Transformer) for sequence modeling; however, naïve exact computation of this model incurs …

NeRN: Learning Neural Representations for Neural Networks

M Ashkenazi, Z Rimon, R Vainshtein, S Levi… - arXiv preprint arXiv …, 2022 - arxiv.org
Neural Representations have recently been shown to effectively reconstruct a wide range of
signals, from 3D meshes and shapes to images and videos. We show that, when adapted …

Visual atoms: Pre-training vision transformers with sinusoidal waves

S Takashima, R Hayamizu, N Inoue… - Proceedings of the …, 2023 - openaccess.thecvf.com
Formula-driven supervised learning (FDSL) has been shown to be an effective method for
pre-training vision transformers, where ExFractalDB-21k was shown to exceed the pre …