Fast neural networks without multipliers

M. Marchesi, G. Orlandi, F. Piazza, et al. - IEEE Transactions on Neural Networks, 1993 - ieeexplore.ieee.org
Multilayer perceptrons (MLPs) with weight values restricted to powers of two or sums of powers of two are introduced. In a digital implementation, these neural networks need no multipliers, only shift registers, when computing in forward mode, saving both chip area and computation time. A learning procedure based on backpropagation is presented for such networks. This learning procedure requires full real arithmetic and therefore must be performed offline. Test cases on pattern recognition problems are presented for MLPs with hidden layers of different sizes. These tests demonstrate the validity and generalization capability of the method and give some insight into the behavior of the learning algorithm.
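
The forward-mode claim is easy to make concrete: if a weight is ±2^k, then in fixed-point arithmetic the product x·w collapses to an arithmetic shift of x by k bits. The Python sketch below illustrates this; `quantize_pow2`, `shift_dot`, and the Q-format parameters are illustrative names and choices of this sketch, not identifiers from the paper, and it uses single powers of two where the paper also allows sums of powers of two.

```python
import numpy as np

def quantize_pow2(w, k_min=-7, k_max=0):
    """Snap each weight to a signed power of two by rounding its
    base-2 exponent. Simplification: the paper also allows sums of
    powers of two for finer weight resolution."""
    sign = np.sign(w)
    k = np.clip(np.round(np.log2(np.abs(w) + 1e-12)), k_min, k_max).astype(int)
    return sign * 2.0 ** k, sign.astype(int), k

def shift_dot(x_fixed, signs, exps):
    """Multiplier-free dot product on fixed-point (integer) inputs:
    each product x*w becomes an arithmetic shift of x by the weight's
    exponent, with right shifts for negative exponents."""
    acc = 0
    for x, s, k in zip(x_fixed, signs, exps):
        s, k = int(s), int(k)
        acc += s * ((x << k) if k >= 0 else (x >> -k))
    return acc

# Usage: compare the shift-based result with an ordinary dot product.
rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, 4)
x = rng.uniform(-1.0, 1.0, 4)
frac_bits = 8                                    # Q-format fractional bits
x_fixed = [int(round(v * (1 << frac_bits))) for v in x]
w_q, signs, exps = quantize_pow2(w)
print(shift_dot(x_fixed, signs, exps) / (1 << frac_bits))  # approx. x @ w_q
print(float(x @ w_q))
```

Restricting the exponents to a small range (here k in [-7, 0]) mirrors the limited shift distances a hardware shift register would support.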
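The abstract says only that the learning procedure is based on backpropagation, needs full real arithmetic, and runs offline. One common way to realize such a scheme, assumed here rather than taken from the paper, is to keep a full-precision master copy of the weights, quantize it to powers of two for the forward pass, and apply the gradient update to the master copy; only the quantized weights are deployed. A minimal single-layer sketch, reusing `quantize_pow2` from above:

```python
import numpy as np

def train_pow2_neuron(X, y, lr=0.1, epochs=200, k_min=-7, k_max=0):
    """Hypothetical offline training loop: the full-precision 'master'
    weights receive the backpropagation updates, while each forward
    pass uses their power-of-two quantization."""
    rng = np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, X.shape[1])          # full real arithmetic
    for _ in range(epochs):
        w_q, _, _ = quantize_pow2(w, k_min, k_max)  # weights seen at inference
        out = np.tanh(X @ w_q)                      # forward pass, quantized
        err = out - y
        grad = X.T @ (err * (1.0 - out**2)) / len(y)
        w -= lr * grad                              # update the real-valued copy
    return quantize_pow2(w, k_min, k_max)[0]        # deploy quantized weights
```

Once training is finished, inference needs only the signs and exponents of the returned weights, so the deployed network runs entirely on shifts and additions.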