Leveraging ReRAM crossbar-based In-Memory Computing (IMC) to accelerate single-task DNN inference has been widely studied. However, using the ReRAM crossbar for …
Continual Learning (CL) allows applications such as user personalization and household robots to learn on the fly and adapt to context. This is an important feature when context …
Y Luo, S Yu - IEEE Transactions on Computers, 2021 - ieeexplore.ieee.org
As AI applications become pervasive on edge devices, incrementally learning new tasks is demanded for deep neural network (DNN) models. In this article, we propose AILC, a …
Y Li, W Zhang, X Xu, Y He, D Dong… - Advanced Intelligent …, 2022 - Wiley Online Library
Artificial neural networks have achieved remarkable results in the field of artificial intelligence. However, they suffer from catastrophic forgetting when dealing with continual …
L Vorabbi, D Maltoni, G Borghi, S Santi - arXiv preprint arXiv:2401.09916, 2024 - arxiv.org
On-device learning remains a formidable challenge, especially when dealing with resource-constrained devices that have limited computational capabilities. This challenge is primarily …
With many devices deployed at the extreme edge in dynamic environments, the ability to learn continually on the device is a fast-emerging trend for ultra-low-power microcontrollers …
In the last few years, research and development on Deep Learning models and techniques for ultra-low-power devices (in a word, TinyML) has mainly focused on a train-then-deploy …
S Jeon, X Ma, KI Kim, M Jeon - arXiv preprint arXiv:2406.04772, 2024 - arxiv.org
On-device continual learning (CL) requires the co-optimization of model accuracy and resource efficiency to be practical. This is extremely challenging because it must preserve …
Continually learning new classes from few training examples, without forgetting previous old classes, demands a flexible architecture with an inevitably growing portion of storage, in …