Authors
N Mert Vural, Fatih Ilhan, Selim F Yilmaz, Salih Ergüt, Suleyman Serdar Kozat
Publication date
2021/6/17
Journal
IEEE Transactions on Neural Networks and Learning Systems
Volume
33
Issue
12
Pages
7632-7643
Publisher
IEEE
Description
Recurrent neural networks (RNNs) are widely used for online regression due to their ability to generalize nonlinear temporal dependencies. As an RNN model, long short-term memory networks (LSTMs) are commonly preferred in practice, as these networks are capable of learning long-term dependencies while avoiding the vanishing gradient problem. However, due to their large number of parameters, training LSTMs requires considerably longer training time compared to simple RNNs (SRNNs). In this article, we achieve the online regression performance of LSTMs with SRNNs efficiently. To this end, we introduce a first-order training algorithm with a linear time complexity in the number of parameters. We show that when SRNNs are trained with our algorithm, they provide very similar regression performance with the LSTMs in two to three times shorter training time. We provide strong theoretical analysis to …
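The abstract describes training a simple RNN (SRNN) for online regression with a first-order update whose per-step cost is linear in the number of parameters. As an illustration only, the sketch below trains a toy SRNN online with a one-step-truncated gradient update; the architecture, hyperparameters, and update rule are placeholders chosen for clarity, not the algorithm introduced in the article.

```python
import numpy as np

class SimpleRNN:
    """Toy SRNN for online regression with a first-order (SGD-style) update."""

    def __init__(self, n_in, n_hidden, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.w_out = rng.normal(scale=0.1, size=n_hidden)
        self.h = np.zeros(n_hidden)
        self.lr = lr

    def step(self, x, y):
        """One online step: predict y from x, then update the weights.

        The gradient is truncated to the current step (no backpropagation
        through time), so each update costs O(#parameters) -- the linear-time
        flavor mentioned in the abstract."""
        h_prev = self.h
        self.h = np.tanh(self.Wx @ x + self.Wh @ h_prev)
        y_hat = self.w_out @ self.h
        err = y_hat - y
        # First-order updates from the squared-error loss 0.5 * err^2.
        g_a = err * self.w_out * (1.0 - self.h ** 2)
        self.w_out -= self.lr * err * self.h
        self.Wx -= self.lr * np.outer(g_a, x)
        self.Wh -= self.lr * np.outer(g_a, h_prev)
        return y_hat, 0.5 * err ** 2

# Online regression on a toy task: predict the next sample of a sine wave.
rnn = SimpleRNN(n_in=1, n_hidden=8)
series = np.sin(0.1 * np.arange(300))
losses = [rnn.step(np.array([series[i]]), series[i + 1])[1]
          for i in range(len(series) - 1)]
print(f"mean loss, first 50 steps: {np.mean(losses[:50]):.4f}, "
      f"last 50 steps: {np.mean(losses[-50:]):.4f}")
```

Because the model and data are processed one sample at a time, this mirrors the online setting of the paper, though the actual algorithm and its theoretical guarantees are given in the article itself.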
Total citations
2021: 1 · 2022: 2 · 2023: 5 · 2024: 3
Scholar articles
NM Vural, F Ilhan, SF Yilmaz, S Ergüt, SS Kozat - IEEE Transactions on Neural Networks and Learning …, 2021
N Mert Vural, F Ilhan, SF Yilmaz, S Ergüt, SS Kozat - arXiv e-prints, 2020