Efficient global algorithm for supervised training of neural networks
Abstract
Training an artificial neural network requires an efficient learning algorithm that sets the weights and thresholds so as to implement the desired input-output mapping. In this paper we present a new optimization strategy, based on the method of covering a feasible set, that efficiently locates the global solution on the multimodal performance surface of a neural network. Performance bounds for the new algorithm are presented. Simulation results on Rumelhart's logic-function test suite demonstrate that our algorithm outperforms the backpropagation method and its faster modifications, such as the Quickprop algorithm.
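The abstract does not spell out the covering method itself, so the following is only a minimal illustrative sketch of one classical covering strategy: a Lipschitz-bounded branch-and-prune search over a box of weights, applied here to the XOR task from Rumelhart's logic-function suite. The function names (sse, covering_minimize), the 2-2-1 network size, the weight bounds, and the Lipschitz constant are all assumptions made for illustration; they are not the authors' algorithm, and exhaustive covering of this kind scales poorly with the number of weights.

```python
import heapq
import itertools
import numpy as np

# XOR, one of Rumelhart's logic-function benchmarks.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

def sse(w):
    """Sum-squared error of a 2-2-1 sigmoid network; w packs 9 parameters."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.sum((y - T) ** 2))

def covering_minimize(f, lo, hi, lipschitz, max_boxes=20000):
    """Branch-and-prune covering of the box [lo, hi].

    Each subbox is scored by the bound f(center) - lipschitz * radius;
    a box whose bound exceeds the incumbent cannot contain a better
    minimum and is pruned, the rest are bisected along their longest edge.
    """
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    counter = itertools.count()              # tie-breaker for the heap
    center = (lo + hi) / 2.0
    best_val, best_w = f(center), center.copy()
    radius = np.linalg.norm(hi - lo) / 2.0   # half-diagonal of the box
    heap = [(best_val - lipschitz * radius, next(counter), lo, hi)]
    boxes = 1
    while heap and boxes < max_boxes:
        bound, _, blo, bhi = heapq.heappop(heap)
        if bound >= best_val:                # prune: bound beats nothing
            continue
        k = int(np.argmax(bhi - blo))        # bisect the longest edge
        mid = (blo[k] + bhi[k]) / 2.0
        for child in range(2):
            clo, chi = blo.copy(), bhi.copy()
            if child == 0:
                chi[k] = mid
            else:
                clo[k] = mid
            c = (clo + chi) / 2.0
            v = f(c)
            boxes += 1
            if v < best_val:                 # new incumbent
                best_val, best_w = v, c
            r = np.linalg.norm(chi - clo) / 2.0
            heapq.heappush(heap, (v - lipschitz * r, next(counter), clo, chi))
    return best_val, best_w

if __name__ == "__main__":
    d = 9  # 2*2 + 2 hidden weights/biases, 2 + 1 output weights/bias
    # Bounds and Lipschitz constant are illustrative guesses, not from the paper.
    val, w = covering_minimize(sse, -5 * np.ones(d), 5 * np.ones(d), lipschitz=4.0)
    print(f"best SSE found: {val:.4f}")
```

The key design point of any covering method is the pruning bound: if f is Lipschitz with constant L, then f can drop by at most L * r below its center value anywhere in a ball of radius r, so subboxes whose bound exceeds the best value found so far can be discarded without losing the global minimum.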