A structured organization of information is typically required by symbolic processing. On the other hand, most connectionist models assume that data are organized according to …
AB Tickle, R Andrews, M Golea… - IEEE Transactions on …, 1998 - ieeexplore.ieee.org
To date, the preponderance of techniques for eliciting the knowledge embedded in trained artificial neural networks (ANNs) has focused primarily on extracting rule-based …
DA Medler - Neural computing surveys, 1998 - Citeseer
Connectionist research is firmly established within the scientific community, especially within the multi-disciplinary field of cognitive science. This diversity, however, has created an …
CW Omlin, KK Thornber… - IEEE Transactions on …, 1998 - ieeexplore.ieee.org
There has been an increased interest in combining fuzzy systems with neural networks because fuzzy neural systems merge the advantages of both paradigms. On the one hand …
We introduce a model for analog computation with discrete time in the presence of analog noise that is flexible enough to cover the most important concrete cases, such as noisy …
P Arun - Neural Computing Surveys, 1998 - Citeseer
In August 1998 Dave Touretzky asked on the connectionists e-mailing list, "Is connectionist symbol processing dead?" This query led to an interesting discussion and exchange of …
Y Kalinke, H Lehmann - Australian Joint Conference on Artificial …, 1998 - Springer
In the paper we address the problem of computation in recurrent neural networks (RNN). In the first part we provide a formal analysis of the dynamical behavior of an RNN with a single …
W Maass - Network: Computation in Neural Systems, 1998 - iopscience.iop.org
A simple extension of standard neural network models is introduced which provides a model for neural computations that involve both firing rates and firing correlations. Such an …
L Firoiu, T Oates, PR Cohen - … 4th International Colloquium, ICGI-98 Ames …, 1998 - Springer
We consider the problem of learning a finite automaton with recurrent neural networks from positive evidence. We train an Elman recurrent neural network with a set of sentences in a …