Authors
C Lee Giles, Christian W Omlin
Publication date
1993/1/1
Journal
Connection Science
Volume
5
Issue
3-4
Pages
307-337
Publisher
Taylor & Francis Group
Description
Recurrent neural networks readily process, learn and generate temporal sequences. In addition, they have been shown to have impressive computational power. Recurrent neural networks can be trained with symbolic string examples encoded as temporal sequences to behave like sequential finite state recognizers. We discuss methods for extracting, inserting and refining symbolic grammatical rules for recurrent networks. This paper discusses various issues: how rules are inserted into recurrent networks, how they affect training and generalization, and how those rules can be checked and corrected. The capability of exchanging information between a symbolic representation (grammatical rules) and a connectionist representation (trained weights) has interesting implications. After partially known rules are inserted, recurrent networks can be trained to preserve inserted rules that were correct and to correct …
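The rule-insertion idea in the abstract can be sketched concretely. Below is a minimal illustration in Python, assuming a second-order recurrent network of the kind used in this line of work, where a known DFA transition δ(q_j, a_k) = q_i is programmed into the weight tensor by making W[i, j, k] strongly positive and the competing entries strongly negative. The function names, the hyperparameter H, and the toy DFA are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

# Sketch of rule insertion into a second-order recurrent network.
# H and the weight-programming scheme are assumed for illustration.
H = 4.0  # large programming weight

def insert_rules(n_states, n_symbols, transitions, rng):
    """Program known DFA transitions delta(q_j, a_k) = q_i into the
    second-order weight tensor W[i, j, k]; unknown entries stay small random."""
    W = rng.uniform(-0.1, 0.1, size=(n_states, n_states, n_symbols))
    for (j, k), i in transitions.items():
        W[:, j, k] = -H   # push all state units toward 0 ...
        W[i, j, k] = +H   # ... except the known successor state
    return W

def step(W, state, symbol_onehot):
    """One second-order update: S_i(t+1) = sigmoid(sum_jk W_ijk S_j(t) I_k(t))."""
    pre = np.einsum('ijk,j,k->i', W, state, symbol_onehot)
    return 1.0 / (1.0 + np.exp(-pre))

rng = np.random.default_rng(0)
# Toy 2-state, 2-symbol DFA fragment: from state 0, '0' stays, '1' moves to 1.
transitions = {(0, 0): 0, (0, 1): 1}
W = insert_rules(n_states=2, n_symbols=2, transitions=transitions, rng=rng)

state = np.array([1.0, 0.0])       # one-hot start state q0
for sym in [0, 1]:                 # feed string "01"
    state = step(W, state, np.eye(2)[sym])
print(np.round(state, 2))          # near-one-hot encoding of the successor state
```

Training would then start from these programmed weights rather than from scratch, so, as the abstract notes, correct inserted rules tend to be preserved while incorrect ones can be revised by further training.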
Total citations
[Citations-per-year chart, 1994–2024; per-year counts not recoverable]