Deep learning in neural networks: An overview

J Schmidhuber - Neural networks, 2015 - Elsevier
In recent years, deep artificial neural networks (including recurrent ones) have won
numerous contests in pattern recognition and machine learning. This historical survey …

An alternative view of the mental lexicon

JL Elman - Trends in cognitive sciences, 2004 - cell.com
An essential aspect of knowing language is knowing the words of that language. This
knowledge is usually thought to reside in the mental lexicon, a kind of dictionary that …

Inferring algorithmic patterns with stack-augmented recurrent nets

A Joulin, T Mikolov - Advances in neural information …, 2015 - proceedings.neurips.cc
Despite the recent achievements in machine learning, we are still very far from achieving
real artificial intelligence. In this paper, we discuss the limitations of standard deep learning …

[PDF][PDF] A guide to recurrent neural networks and backpropagation

M Boden - the Dallas project, 2002 - wiki.eecs.yorku.ca
This paper provides guidance to some of the concepts surrounding recurrent neural
networks. Contrary to feedforward networks, recurrent networks can be sensitive, and be …

[BOOK][B] A field guide to dynamical recurrent networks

JF Kolen, SC Kremer - 2001 - books.google.com
Acquire the tools for understanding new architectures and algorithms of dynamical recurrent
networks (DRNs) from this valuable field guide, which documents recent forays into artificial …

Kalman filters improve LSTM network performance in problems unsolvable by traditional recurrent nets

JA Pérez-Ortiz, FA Gers, D Eck, J Schmidhuber - Neural Networks, 2003 - Elsevier
The long short-term memory (LSTM) network trained by gradient descent solves difficult
problems which traditional recurrent neural networks in general cannot. We have recently …

Lexical knowledge without a lexicon?

JL Elman - The mental lexicon, 2011 - jbe-platform.com
Although for many years a sharp distinction has been made in language research between
rules and words—with primary interest on rules—this distinction is now blurred in many …

Learning nonregular languages: A comparison of simple recurrent networks and LSTM

J Schmidhuber, F Gers, D Eck - Neural computation, 2002 - ieeexplore.ieee.org
NOTE Communicated by Yoshua Bengio …

Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning

A Alamia, V Gauducheau, D Paisios, R VanRullen - Scientific reports, 2020 - nature.com
In recent years artificial neural networks achieved performance close to or better than
humans in several domains: tasks that were previously human prerogatives, such as …

Memory-augmented recurrent neural networks can learn generalized dyck languages

M Suzgun, S Gehrmann, Y Belinkov… - arXiv preprint arXiv …, 2019 - arxiv.org
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore
their capabilities on a series of simple language modeling tasks whose solutions require …