
Hybrid Algorithm for Neural Network Training

K. B. Manwade

Abstract


In supervised learning, the BP (Back-Propagation) algorithm has proven to be the best algorithm for neural network training. However, with BP, learning often takes a long time to converge, and training may fall into local minima. The local-minima problem can be mitigated by varying the learning rate. In this paper, we describe a hybrid learning approach to optimize the neural network training process. In this approach, a global minimum is found first, and local minima are then found with reference to it: the global minimum minimizes the overall error of the network, while the local minima govern generalization across all patterns. Simulation results show that the proposed hybrid algorithm outperforms the traditional BP algorithm.
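Two of the ideas named in the abstract, plain back-propagation and a learning rate that is varied during training, are easy to see in code. The Python sketch below is an illustrative assumption rather than the authors' hybrid algorithm: it trains a small 2-2-1 network on XOR and adjusts the learning rate whenever the error rises, which is one simple way to "vary the learning rate" as the abstract suggests. All hyper-parameters are arbitrary choices for the sketch.

import numpy as np

# Minimal sketch (assumed, not the paper's method): back-propagation on XOR
# with a learning rate that backs off when error rises and creeps up otherwise.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, prev_err = 0.5, np.inf
for epoch in range(10000):
    # Forward pass through the hidden and output layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = float(np.mean((out - y) ** 2))

    # Vary the learning rate: halve it on an error increase, otherwise grow it slightly.
    lr = lr * 0.5 if err > prev_err else min(lr * 1.05, 1.0)
    prev_err = err

    # Backward pass: gradients of the mean-squared error through the sigmoids.
    d_out = (out - y) * out * (1 - out) * (2.0 / len(X))
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print("final MSE:", err)

Schemes like this address convergence speed; escaping local minima, as the paper's hybrid approach aims to do, generally requires an additional global search stage on top of such local updates.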


Keywords


Back-Propagation, Local Minima, Global Minima, Generalization



