Deep learning enabled state of charge, state of health and remaining useful life estimation for smart battery management system: Methods, implementations, issues …

MSH Lipu, S Ansari, MS Miah, ST Meraj, K Hasan… - Journal of Energy …, 2022 - Elsevier
State of charge (SOC), state of health (SOH), and remaining useful life (RUL) are the crucial
indexes used in the assessment of electric vehicle (EV) battery management systems (BMS) …

Intelligent SOX estimation for automotive battery management systems: state-of-the-art deep learning approaches, open issues, and future research opportunities

MS Hossain Lipu, TF Karim, S Ansari, MS Miah… - Energies, 2022 - mdpi.com
Real-time battery SOX estimation, including the state of charge (SOC), state of energy (SOE),
and state of health (SOH), is the crucial evaluation indicator to assess the performance of …

A survey on the optimization of neural network accelerators for micro-AI on-device inference

AN Mazumder, J Meng, HA Rashid… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
Deep neural networks (DNNs) are being prototyped for a variety of artificial intelligence (AI)
tasks including computer vision, data analytics, robotics, etc. The efficacy of DNNs coincides …

A 28nm 27.5 TOPS/W approximate-computing-based transformer processor with asymptotic sparsity speculating and out-of-order computing

Y Wang, Y Qin, D Deng, J Wei, Y Zhou… - … solid-state circuits …, 2022 - ieeexplore.ieee.org
Recently, Transformer-based models have achieved tremendous success in many AI fields,
from NLP to CV, using the attention mechanism [1]–[3]. This mechanism captures the global …

A surrogate gradient spiking baseline for speech command recognition

A Bittar, PN Garner - Frontiers in Neuroscience, 2022 - frontiersin.org
Artificial neural networks (ANNs) are the basis of recent advances in artificial intelligence
(AI); they typically use real-valued neuron responses. By contrast, biological neurons are …

Spartus: A 9.4 TOp/s FPGA-based LSTM accelerator exploiting spatio-temporal sparsity

C Gao, T Delbruck, SC Liu - IEEE Transactions on Neural …, 2022 - ieeexplore.ieee.org
Long short-term memory (LSTM) recurrent networks are frequently used for tasks involving
time-sequential data, such as speech recognition. Unlike previous LSTM accelerators that …

EdgeDRNN: Recurrent neural network accelerator for edge inference

C Gao, A Rios-Navarro, X Chen, SC Liu… - IEEE Journal on …, 2020 - ieeexplore.ieee.org
Low-latency, low-power portable recurrent neural network (RNN) accelerators offer powerful
inference capabilities for real-time applications such as IoT, robotics, and human-machine …

Tinyvers: A tiny versatile system-on-chip with state-retentive eMRAM for ML inference at the extreme edge

V Jain, S Giraldo, J De Roose, L Mei… - IEEE Journal of Solid …, 2023 - ieeexplore.ieee.org
Extreme edge devices or Internet-of-Things (IoT) nodes require both ultra-low power (ULP)
always-on (AON) processing as well as the ability to do on-demand sampling and …

Digital versus analog artificial intelligence accelerators: Advances, trends, and emerging designs

J Seo, J Saikia, J Meng, W He, H Suh… - IEEE Solid-State …, 2022 - ieeexplore.ieee.org
For state-of-the-art artificial intelligence (AI) accelerators, there have been large advances in
both all-digital and analog/mixed-signal circuit-based designs. This article presents a …

An energy-efficient transformer processor exploiting dynamic weak relevances in global attention

Y Wang, Y Qin, D Deng, J Wei, Y Zhou… - IEEE Journal of Solid …, 2022 - ieeexplore.ieee.org
Transformer-based models achieve tremendous success in many artificial intelligence (AI)
tasks, outperforming conventional convolutional neural networks (CNNs) from natural …