Adaptive learning algorithms to incorporate additional functional constraints into neural networks

Cited 23 times in Web of Science; cited 0 times in Scopus
Abstract
In this paper, adaptive learning algorithms for better generalization performance are proposed. We design cost terms for this additional functionality based on the first- and second-order derivatives of the neural activations at the hidden layers. During training, these additional cost terms penalize the sensitivity of the input-to-output mapping and the high-frequency components in the training data. Applying a gradient-descent method yields hybrid learning rules that combine error back-propagation, Hebbian rules, and simple weight decay; the computational overhead relative to the standard error back-propagation algorithm is almost negligible. Theoretical justifications and simulation results are given to verify the effectiveness of the proposed learning algorithms. (C) 2000 Elsevier Science B.V. All rights reserved.
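As an illustrative sketch only, not the authors' exact algorithm: the PyTorch snippet below trains a one-hidden-layer tanh network with a cost that adds squared-magnitude penalties on the first and second derivatives of the hidden activations, mirroring the sensitivity and high-frequency cost terms the abstract describes. The architecture, the penalty forms, and the coefficients lam1 and lam2 are assumptions made for illustration.

    import torch
    import torch.nn as nn

    class MLP(nn.Module):
        # One-hidden-layer tanh network; forward returns both the output
        # and the hidden activation so the extra cost terms can use it.
        def __init__(self, d_in, d_hidden, d_out):
            super().__init__()
            self.hidden = nn.Linear(d_in, d_hidden)
            self.out = nn.Linear(d_hidden, d_out)

        def forward(self, x):
            a = torch.tanh(self.hidden(x))
            return self.out(a), a

    def regularized_loss(model, x, y, lam1=1e-3, lam2=1e-3):
        # Squared-error data term plus penalties on f'(h) and f''(h) of
        # the hidden units. For tanh: f'(h) = 1 - a^2 and
        # f''(h) = -2 a (1 - a^2), where a = tanh(h).
        y_hat, a = model(x)
        data = nn.functional.mse_loss(y_hat, y)
        f1 = 1.0 - a ** 2
        f2 = -2.0 * a * f1
        return data + lam1 * (f1 ** 2).mean() + lam2 * (f2 ** 2).mean()

    # One gradient-descent step on the combined cost; differentiating the
    # penalty terms is what would give rise to the hybrid (Hebbian and
    # weight-decay flavored) update rules the abstract refers to.
    model = MLP(8, 16, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x, y = torch.randn(32, 8), torch.randn(32, 1)
    loss = regularized_loss(model, x, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

Because the penalties depend only on the hidden activations, evaluating them adds almost nothing to the cost of a standard back-propagation pass, consistent with the abstract's claim of negligible overhead.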
Publisher
ELSEVIER SCIENCE BV
Issue Date
2000-11
Language
English
Article Type
Article; Proceedings Paper
Keywords

REGULARIZATION

Citation

NEUROCOMPUTING, v.35, pp. 73-90

ISSN
0925-2312
URI
http://hdl.handle.net/10203/10966
Appears in Collection
EE-Journal Papers (Journal Papers)