TinyML meets IoT: A comprehensive survey

L Dutta, S Bharali - Internet of Things, 2021 - Elsevier
The rapid growth in miniaturization of low-power embedded devices and advancement in
the optimization of machine learning (ML) algorithms have opened up a new prospect of the …

Machine learning for microcontroller-class hardware: A review

SS Saha, SS Sandha, M Srivastava - IEEE Sensors Journal, 2022 - ieeexplore.ieee.org
The advancements in machine learning (ML) have opened a new opportunity to bring intelligence
to low-end Internet-of-Things (IoT) nodes, such as microcontrollers. Conventional ML …

Resurrecting recurrent neural networks for long sequences

A Orvieto, SL Smith, A Gu, A Fernando… - International …, 2023 - proceedings.mlr.press
Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are
hard to optimize and slow to train. Deep state-space models (SSMs) have recently been …

On the parameterization and initialization of diagonal state space models

A Gu, K Goel, A Gupta, C Ré - Advances in Neural …, 2022 - proceedings.neurips.cc
State space models (SSMs) have recently been shown to be highly effective as a deep learning
layer, offering a promising alternative to sequence models such as RNNs, CNNs, or Transformers …

Efficiently modeling long sequences with structured state spaces

A Gu, K Goel, C Ré - arXiv preprint arXiv:2111.00396, 2021 - arxiv.org
A central goal of sequence modeling is designing a single principled model that can
address sequence data across a range of modalities and tasks, particularly on long-range …

Simplified state space layers for sequence modeling

JTH Smith, A Warrington, SW Linderman - arXiv preprint arXiv:2208.04933, 2022 - arxiv.org
Models using structured state space sequence (S4) layers have achieved state-of-the-art
performance on long-range sequence modeling tasks. An S4 layer combines linear state …
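
For orientation, the linear state space recurrence that these S4-style layers build on can be sketched in a few lines. This is a minimal illustration of the standard discretized recurrence x_k = A x_{k-1} + B u_k, y_k = C x_k, not code from any of the papers listed here; the diagonal transition, sizes, and random parameters are illustrative assumptions.

    import numpy as np

    # Minimal sketch of a discretized linear state space recurrence with a
    # diagonal transition matrix, the building block that S4-style layers
    # parameterize and initialize carefully. All names and values here are
    # illustrative assumptions, not taken from the papers above.
    rng = np.random.default_rng(0)
    N, L = 16, 100                         # state size, sequence length
    A = np.exp(-rng.uniform(0.1, 1.0, N))  # stable diagonal transition, 0 < A < 1
    B = rng.normal(size=N)                 # input projection
    C = rng.normal(size=N)                 # output projection
    u = rng.normal(size=L)                 # scalar input sequence

    x = np.zeros(N)
    y = np.empty(L)
    for k in range(L):
        x = A * x + B * u[k]               # x_k = A x_{k-1} + B u_k (elementwise, since A is diagonal)
        y[k] = C @ x                       # y_k = C x_k

Because A is diagonal, each state coordinate evolves independently, which is the simplification the diagonal variants cited here exploit.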

Training spiking neural networks using lessons from deep learning

JK Eshraghian, M Ward, EO Neftci… - Proceedings of the …, 2023 - ieeexplore.ieee.org
The brain is the perfect place to look for inspiration to develop more efficient neural
networks. The inner workings of our synapses and neurons provide a glimpse at what the …

Combining recurrent, convolutional, and continuous-time models with linear state space layers

A Gu, I Johnson, K Goel, K Saab… - Advances in neural …, 2021 - proceedings.neurips.cc
Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations
(NDEs) are popular families of deep learning models for time-series data, each with unique …

Diagonal state spaces are as effective as structured state spaces

A Gupta, A Gu, J Berant - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Modeling long-range dependencies in sequential data is a fundamental step towards
attaining human-level performance in many modalities such as text, vision, audio and video …

FiLM: Frequency improved Legendre memory model for long-term time series forecasting

T Zhou, Z Ma, Q Wen, L Sun, T Yao… - Advances in neural …, 2022 - proceedings.neurips.cc
Recent studies have shown that deep learning models such as RNNs and Transformers
have brought significant performance gains for long-term forecasting of time series because …