JL Elman - Trends in cognitive sciences, 2004 - cell.com
An essential aspect of knowing language is knowing the words of that language. This knowledge is usually thought to reside in the mental lexicon, a kind of dictionary that …
A Joulin, T Mikolov - Advances in neural information …, 2015 - proceedings.neurips.cc
Despite the recent achievements in machine learning, we are still very far from achieving real artificial intelligence. In this paper, we discuss the limitations of standard deep learning …
M Boden - the Dallas project, 2002 - wiki.eecs.yorku.ca
This paper provides guidance on some of the concepts surrounding recurrent neural networks. Unlike feedforward networks, recurrent networks can be sensitive, and be …
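The key idea behind the simple (Elman-style) recurrent networks discussed in these entries is that the hidden layer receives a copy of its own previous activation, making the network sensitive to sequence history rather than just the current input. A minimal sketch in numpy (illustrative only; the weight shapes and names are assumptions, not any paper's implementation):

```python
import numpy as np

def srn_step(x, context, Wx, Wc, Wy):
    """One step of a simple (Elman-style) recurrent network.

    The hidden layer sees the current input x plus the previous hidden
    activation ('context' units); the returned hidden vector becomes
    the next step's context.
    """
    hidden = np.tanh(Wx @ x + Wc @ context)
    output = np.tanh(Wy @ hidden)
    return output, hidden

# Illustrative usage with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2
Wx = rng.normal(scale=0.5, size=(n_hid, n_in))
Wc = rng.normal(scale=0.5, size=(n_hid, n_hid))
Wy = rng.normal(scale=0.5, size=(n_out, n_hid))

context = np.zeros(n_hid)
for t in range(5):  # feed a short random input sequence
    y, context = srn_step(rng.normal(size=n_in), context, Wx, Wc, Wy)
```

Because the context units are carried forward, the same input can produce different outputs at different points in a sequence, which is the property feedforward networks lack.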
Acquire the tools for understanding new architectures and algorithms of dynamical recurrent networks (DRNs) from this valuable field guide, which documents recent forays into artificial …
The long short-term memory (LSTM) network trained by gradient descent solves difficult problems that traditional recurrent neural networks in general cannot. We have recently …
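The gating mechanism that lets an LSTM retain information over long spans can be sketched as a single cell step in numpy (a minimal illustration with assumed weight layout, not the authors' implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a single LSTM cell.

    W has shape (4*hidden, input+hidden): the four gate pre-activations
    are stacked in one matrix. b has shape (4*hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0*hidden:1*hidden])   # input gate
    f = sigmoid(z[1*hidden:2*hidden])   # forget gate
    o = sigmoid(z[2*hidden:3*hidden])   # output gate
    g = np.tanh(z[3*hidden:4*hidden])   # candidate cell update
    c = f * c_prev + i * g              # cell state: additive long-term memory
    h = o * np.tanh(c)                  # hidden state (cell output)
    return h, c

# Illustrative usage with random weights (hypothetical sizes)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
```

The additive update of the cell state `c` is what lets gradients flow over many time steps, which is why LSTMs handle long-range dependencies that simple recurrent networks typically cannot.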
JL Elman - The mental lexicon, 2011 - jbe-platform.com
Although for many years a sharp distinction has been made in language research between rules and words—with primary interest on rules—this distinction is now blurred in many …
Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM. Note communicated by Yoshua Bengio. …
In recent years, artificial neural networks have achieved performance close to or better than that of humans in several domains: tasks that were previously human prerogatives, such as …
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore their capabilities on a series of simple language modeling tasks whose solutions require …
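A common way to augment a recurrent network with memory, as in the stack-augmented models these entries describe, is a continuous stack whose push/pop actions are blended by learned probabilities. The sketch below is a hypothetical, simplified parameterization in that spirit, not the exact equations of any of the cited papers:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def stack_rnn_step(x, h_prev, stack, params):
    """One step of a simplistic stack-augmented RNN.

    The controller reads the continuous top-of-stack value, then emits
    probabilities over three stack actions (push, pop, no-op) and a
    scalar value to push; the new stack is a probability-weighted blend
    of the three actions. All weight names/shapes here are assumptions.
    """
    Wx, Wh, Ws, Wa, Wd = params
    top = stack[0]                            # continuous top-of-stack value
    h = np.tanh(Wx @ x + Wh @ h_prev + Ws * top)
    a = softmax(Wa @ h)                       # [push, pop, no-op] probabilities
    d = np.tanh(Wd @ h).item()                # value to push
    pushed = np.concatenate([[d], stack[:-1]])
    popped = np.concatenate([stack[1:], [0.0]])
    new_stack = a[0] * pushed + a[1] * popped + a[2] * stack
    return h, new_stack

# Illustrative usage with random weights (hypothetical sizes)
rng = np.random.default_rng(1)
n_in, n_hid, depth = 2, 5, 8
params = (rng.normal(scale=0.1, size=(n_hid, n_in)),
          rng.normal(scale=0.1, size=(n_hid, n_hid)),
          rng.normal(scale=0.1, size=n_hid),
          rng.normal(scale=0.1, size=(3, n_hid)),
          rng.normal(scale=0.1, size=(1, n_hid)))
h, stack = np.zeros(n_hid), np.zeros(depth)
h, stack = stack_rnn_step(rng.normal(size=n_in), h, stack, params)
```

Because every operation is differentiable, the push/pop behavior can be learned by gradient descent, which is what lets such models count and match nested symbols in tasks like a^n b^n that plain RNNs struggle with.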