Hybrid evolution of convolutional networks

B Cheung, C Sable - 2011 10th International Conference on Machine Learning and Applications …, 2011 - ieeexplore.ieee.org
As neural network models trend toward larger structures with more layers, we expect a corresponding exponential increase in the number of possible architectures. In this paper, we apply a hybrid evolutionary search procedure to define the initialization and architectural parameters of convolutional networks, one of the first successful deep network models. We make use of stochastic diagonal Levenberg-Marquardt to accelerate the convergence of training, lowering the time cost of fitness evaluation. Using parameters found by the evolutionary search, together with absolute-value and local contrast normalization preprocessing between layers, we achieve the best known performance on several of the MNIST Variations, rectangles-image, and convex image datasets.
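The abstract's core idea, evolving the initialization and architectural parameters of a convolutional network, can be sketched as a simple genetic algorithm over a discrete parameter space. Everything below is illustrative rather than the paper's actual procedure: the search space, operators, and especially the `fitness` surrogate (which in the paper would be a short training run evaluated on validation data) are assumptions.

```python
import random

# Hypothetical search space: each genome encodes architectural and
# initialization parameters of a small convolutional network.
SPACE = {
    "maps1":      [4, 6, 8, 12, 16],      # feature maps, first conv layer
    "maps2":      [8, 16, 32, 50],        # feature maps, second conv layer
    "kernel":     [3, 5, 7],              # convolution kernel size
    "init_scale": [0.01, 0.05, 0.1, 0.5]  # weight-initialization std dev
}

def random_genome(rng):
    return {k: rng.choice(v) for k, v in SPACE.items()}

def fitness(genome):
    # Stand-in for "train briefly (e.g. with stochastic diagonal
    # Levenberg-Marquardt) and return validation accuracy".
    # Here: a toy surrogate that prefers mid-sized settings.
    return (-(genome["maps1"] - 8) ** 2
            - (genome["maps2"] - 16) ** 2
            - (genome["kernel"] - 5) ** 2
            - (genome["init_scale"] - 0.05) ** 2)

def crossover(a, b, rng):
    # Uniform crossover: each gene comes from either parent.
    return {k: rng.choice([a[k], b[k]]) for k in SPACE}

def mutate(g, rng, rate=0.2):
    # Resample each gene from its allowed values with probability `rate`.
    return {k: (rng.choice(SPACE[k]) if rng.random() < rate else v)
            for k, v in g.items()}

def evolve(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_genome(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(rng.choice(parents),
                                     rng.choice(parents), rng), rng)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

The "hybrid" aspect of the paper's method is that fitness evaluation is itself a gradient-based training run; the genetic operators only search the space that gradient descent cannot, i.e. discrete architecture and initialization choices.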
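The speed-up the abstract attributes to stochastic diagonal Levenberg-Marquardt comes from per-parameter step sizes of the form eta / (mu + h_kk), where h_kk estimates the diagonal of the Hessian. A minimal sketch on a linear least-squares toy, assuming a running-average diagonal Gauss-Newton estimate (the original method computes second derivatives with an extra backward pass; the problem and all constants here are illustrative):

```python
import numpy as np

def sdlm_step(w, X, y, h, eta=0.1, mu=0.01, decay=0.95):
    # Toy problem: loss = 0.5 * ||X @ w - y||^2 / n.
    n = len(y)
    grad = X.T @ (X @ w - y) / n
    # For squared loss, the diagonal Gauss-Newton curvature per weight
    # is the mean squared input; keep a decayed running average.
    h = decay * h + (1 - decay) * np.mean(X ** 2, axis=0)
    # Per-parameter step eta / (mu + h_kk): high-curvature directions
    # get small steps, flat directions get large ones.
    w = w - (eta / (mu + h)) * grad
    return w, h

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) * np.array([1.0, 5.0, 0.2])  # badly scaled features
w_true = np.array([2.0, -1.0, 3.0])
y = X @ w_true
w, h = np.zeros(3), np.zeros(3)
for _ in range(500):
    w, h = sdlm_step(w, X, y, h)
print(w)  # close to w_true despite the mixed feature scales
```

Plain gradient descent with a single global step size would need a rate small enough for the steepest direction, crawling along the flattest one; the diagonal scaling equalizes progress across parameters, which is what makes each fitness evaluation in the evolutionary search cheaper.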