Authors
C Lee Giles, Christian W Omlin
Publication date
1994/9
Journal
IEEE transactions on neural networks
Volume
5
Issue
5
Pages
848-851
Publisher
IEEE
Abstract
Determining the architecture of a neural network is an important issue for any learning task. For recurrent neural networks no general methods exist that permit the estimation of the number of layers of hidden neurons, the size of layers, or the number of weights. We present a simple pruning heuristic that significantly improves the generalization performance of trained recurrent networks. We illustrate this heuristic by training a fully recurrent neural network on positive and negative strings of a regular grammar. We also show that rules extracted from networks trained with this pruning heuristic are more consistent with the rules to be learned. This performance improvement is obtained by pruning and retraining the networks. Simulations are shown for training and pruning a recurrent neural net on strings generated by two regular grammars, a randomly generated 10-state grammar and an 8-state, triple-parity grammar …
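The abstract describes a prune-and-retrain procedure but does not specify the pruning criterion in the text shown here. As a hedged sketch only, the following illustrates one common variant of the idea: magnitude-based pruning of a recurrent weight matrix, where the smallest-magnitude weights are zeroed and a binary mask keeps them at zero during retraining. The function name `magnitude_prune`, the 25% pruning fraction, and the masked-gradient update are illustrative assumptions, not the authors' exact heuristic.

```python
import numpy as np

def magnitude_prune(W, fraction):
    """Zero out the smallest-magnitude `fraction` of weights in W.

    Returns the pruned matrix and a binary mask; applying the mask
    to gradient updates keeps pruned weights at zero while the
    remaining weights are retrained.
    (Illustrative criterion only -- the paper's own heuristic is
    not given in this abstract.)
    """
    flat = np.abs(W).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return W.copy(), np.ones_like(W)
    # k-th smallest absolute weight serves as the cutoff
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = (np.abs(W) > threshold).astype(W.dtype)
    return W * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))              # recurrent weight matrix
W_pruned, mask = magnitude_prune(W, 0.25)

# During retraining, updates are masked so pruned weights stay zero:
grad = rng.normal(size=W.shape)          # stand-in for a real gradient
W_pruned -= 0.01 * grad * mask
```

In a full prune-and-retrain loop one would alternate this step with further training epochs on the grammar strings, stopping when generalization on held-out strings no longer improves.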
Total citations