Stochastic rounding (SR) randomly maps a real number x to one of the two nearest values in a finite precision number system. The probability of choosing either of these two numbers is …
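The rule described in the snippet — round to one of the two nearest representable values, with probability proportional to proximity — can be sketched as a minimal Python function; rounding to multiples of a `step` stands in for a real finite-precision format, and the function name is illustrative, not from any of the cited papers.

```python
import math
import random

def stochastic_round(x: float, step: float = 1.0) -> float:
    """Stochastically round x to one of the two nearest multiples of step.

    The probability of choosing the upper value equals the relative
    distance of x from the lower value, so the result is unbiased:
    E[stochastic_round(x)] == x.
    """
    lo = math.floor(x / step) * step   # nearest representable value below x
    hi = lo + step                     # nearest representable value above x
    if x == lo:                        # x is exactly representable
        return lo
    p_up = (x - lo) / step             # probability of rounding up
    return hi if random.random() < p_up else lo
```

Unlike round-to-nearest, repeated stochastic rounding preserves the mean, which is why it is attractive for accumulating small gradient updates in low precision.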
The Posit™ Number System was introduced in 2017 as a replacement for floating-point numbers. Since then, the community has explored its application in several areas, such as …
Training Deep Neural Networks (DNNs) efficiently is difficult to achieve with native floating-point representations and commercially available hardware. Specialized …
The attention mechanisms of transformers effectively extract pertinent information from the input sequence. However, the quadratic complexity of self-attention with respect to the sequence length …
While low-precision optimization has been widely used to accelerate deep learning, low-precision sampling remains largely unexplored. As a consequence, sampling is simply …
Graph Neural Networks (GNNs) are emerging ML models to analyze graph-structured data. GNN execution involves both compute-intensive and …
A Mahmoud, T Tambe, T Aloui… - 2022 52nd Annual …, 2022 - ieeexplore.ieee.org
This paper presents GoldenEye, a functional simulator with fault injection capabilities for common and emerging numerical formats, implemented for the PyTorch deep learning …
In recent times, a plethora of hardware accelerators have been put forth for graph learning applications such as vertex classification and graph classification. However, previous works …
J Zhao, P Zeng, G Shen, Q Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
The attention mechanisms of transformers effectively extract pertinent information from the input sequence. However, the quadratic complexity of self-attention incurs heavy …