A backpropagation algorithm with adaptive learning rate and momentum coefficient

Chien Cheng Yu, Bin-Da Liu

Research output: Contribution to conferencePaper

83 Citations (Scopus)

Abstract

Slow convergence and long training times are disadvantages often cited when the conventional back-propagation (BP) algorithm is compared with other competing techniques. In addition, in the conventional BP algorithm the learning rate is fixed and uniform for all weights in a layer. In this paper we propose an efficient acceleration technique, BPALM (Back-Propagation with Adaptive Learning rate and Momentum term), which extends the conventional BP algorithm with an adaptive learning rate and momentum factor: both rates are adjusted at each iteration to reduce the training time. Simulation results indicate a superior convergence speed compared with other competing methods.
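The abstract does not give BPALM's exact update rules, so the following is only a minimal sketch of the general idea it describes: a momentum-based weight update in which both the learning rate and the momentum coefficient are adapted at every iteration. The adaptation heuristic used here (grow both rates while the error keeps falling, shrink them when it rises) is a common convention and an assumption, not the paper's method; `train`, `loss_fn`, and `grad_fn` are hypothetical names.

```python
# Hypothetical sketch of gradient descent with an adaptive learning rate
# and momentum coefficient, in the spirit of the abstract. The adaptation
# rule (multiplicative grow/shrink driven by the error trend) is assumed,
# not taken from the paper.

def train(grad_fn, loss_fn, w, lr=0.01, mu=0.5, epochs=100,
          up=1.05, down=0.7, lr_max=0.1, mu_max=0.9):
    velocity = [0.0] * len(w)
    prev_loss = loss_fn(w)
    for _ in range(epochs):
        g = grad_fn(w)
        # Momentum update: v <- mu*v - lr*g ; w <- w + v
        velocity = [mu * v - lr * gi for v, gi in zip(velocity, g)]
        w = [wi + vi for wi, vi in zip(w, velocity)]
        loss = loss_fn(w)
        if loss < prev_loss:
            # Error fell: accelerate (capped to keep the update stable).
            lr = min(lr * up, lr_max)
            mu = min(mu * up, mu_max)
        else:
            # Error rose: damp both rates.
            lr *= down
            mu *= down
        prev_loss = loss
    return w, prev_loss

# Toy stand-in for a network's error surface: E(w) = sum(w_i^2).
loss = lambda w: sum(wi * wi for wi in w)
grad = lambda w: [2.0 * wi for wi in w]
w_final, e_final = train(grad, loss, [3.0, -2.0])
```

Because the rates grow only while the error keeps decreasing and are cut back as soon as it increases, the sketch captures the trade-off the paper targets: faster convergence than a fixed-rate BP update without manual per-problem tuning.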

Original language: English
Pages: 1218-1223
Number of pages: 6
Publication status: Published - 2002 Jan 1
Event: 2002 International Joint Conference on Neural Networks (IJCNN '02) - Honolulu, HI, United States
Duration: 2002 May 12 - 2002 May 17

Other

Other: 2002 International Joint Conference on Neural Networks (IJCNN '02)
Country: United States
City: Honolulu, HI
Period: 02-05-12 - 02-05-17


All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence

Cite this

Yu, C. C., & Liu, B-D. (2002). A backpropagation algorithm with adaptive learning rate and momentum coefficient. 1218-1223. Paper presented at 2002 International Joint Conference on Neural Networks (IJCNN '02), Honolulu, HI, United States.