Exploiting non-idealities of resistive switching memories for efficient machine learning

V Yon, A Amirsoleimani, F Alibart, RG Melko… - Frontiers in …, 2022 - frontiersin.org
Novel computing architectures based on resistive switching memories (also known as
memristors or RRAMs) have been shown to be promising approaches for tackling the …

Energy efficient learning with low resolution stochastic domain wall synapse for deep neural networks

W Al Misba, M Lozano, D Querlioz, J Atulasimha - IEEE Access, 2022 - ieeexplore.ieee.org
We demonstrate that extremely low-resolution quantized (nominally 5-state) synapses with large
stochastic variations in synaptic weights can be energy efficient and achieve reasonably …
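
(A minimal, generic sketch of the idea in this entry, not the paper's actual device model: snap weights to five evenly spaced levels and add Gaussian variation to mimic stochastic synaptic states. The level spacing, noise magnitude, and function name are illustrative assumptions.)

```python
import numpy as np

def quantize_5_state(weights, w_max=1.0, noise_std=0.1, rng=None):
    """Snap weights to 5 evenly spaced levels in [-w_max, w_max], then add
    Gaussian variation to mimic stochastic synaptic states (illustrative only)."""
    rng = np.random.default_rng() if rng is None else rng
    levels = np.linspace(-w_max, w_max, 5)                  # 5 nominal states
    nearest = levels[np.abs(weights[..., None] - levels).argmin(axis=-1)]
    return nearest + rng.normal(0.0, noise_std * w_max, size=nearest.shape)

w = np.random.uniform(-1, 1, size=(4, 4))                   # toy weight matrix
print(quantize_5_state(w))
```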

Memristor based Spiking Neural Networks: Cooperative Development of Neural Network Architecture/Algorithms and Memristors

H Peng, L Gan, X Guo - Chip, 2024 - Elsevier
Inspired by the structure and principles of the human brain, spiking neural networks (SNNs)
have emerged as the latest generation of artificial neural networks, attracting significant and …

Computational failure analysis of in-memory RRAM architecture for pattern classification CNN circuits

NL Prabhu, N Raghavan - IEEE Access, 2021 - ieeexplore.ieee.org
Power-efficient data processing subsystems performing millions of complex concurrent
arithmetic operations per second are part of today's essential solution required to meet the …

Neuromorphic In-Memory RRAM NAND/NOR Circuit Performance Analysis in a CNN Training Framework on the Edge for Low Power IoT

NL Prabhu, N Raghavan - IEEE Access, 2022 - ieeexplore.ieee.org
Training a CNN involves computationally intensive optimization algorithms to fit the network
to a training dataset, updating the network weights for inferencing and then pattern …

A Dynamic Weight Quantization-Based Fault-tolerant Training Method for Ternary Memristive Neural Networks

Z Zhong, Z You, P Liu - … Test Conference in Asia (ITC-Asia), 2024 - ieeexplore.ieee.org
Memristors have the merits of small area, low power consumption, and non-volatility, making
them eminently suitable for storing the weights of neural networks. However, stuck-at faults …

Robust Ex-situ Training of Memristor Crossbar-based Neural Network with Limited Precision Weights

R Hasan - Proceedings of the 18th ACM International Symposium …, 2023 - dl.acm.org
Memristor crossbar-based neural networks perform parallel operations in the analog domain.
The ex-situ training approach needs to program the predetermined resistance values in the …
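
(A rough illustration of the ex-situ flow mentioned in this entry, not the paper's method: trained weights can be mapped linearly onto a limited set of programmable conductance targets. The conductance window, bit width, and linear mapping below are assumptions.)

```python
import numpy as np

def weights_to_conductances(weights, g_min=1e-6, g_max=1e-4, bits=4):
    """Linearly map trained weights onto a crossbar conductance window and
    round to a limited number of programmable levels (illustrative only)."""
    w_max = np.max(np.abs(weights))
    g = g_min + (weights + w_max) / (2 * w_max) * (g_max - g_min)   # shift + scale
    step = (g_max - g_min) / (2 ** bits - 1)
    return g_min + np.round((g - g_min) / step) * step              # limited precision

w = np.random.randn(3, 3)                                           # toy trained weights
print(weights_to_conductances(w))
```

(In practice, signed weights are often realized with differential conductance pairs rather than a single shifted map; the single-device map here is only for brevity.)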

Memristor Crossbar Scaling Limits and the Implementation of Large Neural Networks

R Hasan - 2023 - researchsquare.com
Memristor crossbar-based neural networks perform parallel operations in the analog domain.
The ex-situ training approach needs to program the predetermined resistance values to the …

Ternary Neural Networks Based on On/Off Memristors: Set-Up and Training

A Morell, ED Machado, E Miranda, G Boquet… - Electronics, 2022 - mdpi.com
Neuromorphic systems based on hardware neural networks (HNNs) are expected to be an
energy- and time-efficient computing architecture for solving complex tasks. In this paper, we …

Influence of Weight Transfer Error on Vector-Matrix Multiplication Using AND Array Architectures

J Yu, D Ryu, T Jang, WY Choi - 2023 Silicon Nanoelectronics …, 2023 - ieeexplore.ieee.org
The influence of weight transfer error on vector-matrix multiplication (VMM) in the AND array
architecture is investigated. In an AND array, a constant voltage is applied to the drain, causing …
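
(For context only, and not the paper's AND-array model: a generic sketch of how weight transfer error perturbs an analog vector-matrix multiplication, using a multiplicative Gaussian error model chosen purely for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=8)              # input vector (e.g. applied voltages)
W = rng.uniform(0.0, 1.0, size=(8, 4))         # target weights (normalized conductances)

# Each programmed weight deviates from its target by a multiplicative error.
W_programmed = W * rng.normal(1.0, 0.05, size=W.shape)

ideal = x @ W                                  # error-free VMM
actual = x @ W_programmed                      # VMM with weight transfer error
print("relative output error:", np.abs(actual - ideal) / np.abs(ideal))
```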