A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

In this research, we propose a novel algorithm for training recurrent neural networks, called fractional back-propagation through time (FBPTT). Exploiting the potential of fractional calculus, we use a fractional-calculus-based gradient descent method to derive the FBPTT algorithm. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems: nonlinear system identification, pattern classification, and Mackey-Glass chaotic time-series prediction.
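The abstract does not state the paper's exact update rule, but the general idea of fractional gradient descent can be sketched. A common simplification in the fractional-gradient literature replaces the ordinary gradient with a first-order truncation of the Caputo derivative of order alpha, which reduces to standard gradient descent as alpha approaches 1. The sketch below is a minimal illustration under that assumption; the names fractional_gradient and bptt_gradients, the reference point c, and alpha = 0.9 are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.special import gamma

def fractional_gradient(grad, w, alpha=0.9, c=0.0, eps=1e-8):
    """Approximate Caputo fractional gradient of order alpha in (0, 1].

    Uses the common first-order truncation
        D^alpha L(w) ~ L'(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha),
    which recovers the ordinary gradient L'(w) as alpha -> 1.
    eps keeps the scaling factor nonzero when w == c.
    """
    scale = (np.abs(w - c) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return grad * scale

if __name__ == "__main__":
    # Toy example: scale ordinary gradients into fractional ones.
    w = np.array([0.5, -1.2])
    g = np.array([0.1, -0.3])
    print(fractional_gradient(g, w, alpha=0.9))

# Hypothetical use inside a BPTT training loop (bptt_gradients is assumed):
#   grads = bptt_gradients(model, batch)   # ordinary BPTT gradients
#   for w, g in zip(model.weights, grads):
#       w -= lr * fractional_gradient(g, w, alpha=0.9)
```

In this sketch the fractional order acts as a per-weight rescaling of the ordinary BPTT gradient; the paper's derivation may differ in how the fractional derivative is applied to the loss.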
Publisher
Springer Birkhäuser
Issue Date
2018-02
Language
English
Article Type
Article
Keywords

SYSTEMS; BACKPROPAGATION; CALCULUS; TIME; IDENTIFICATION; PREDICTION; MODELS; SERIES

Citation

CIRCUITS SYSTEMS AND SIGNAL PROCESSING, v.37, no.2, pp.593 - 612

ISSN
0278-081X
DOI
10.1007/s00034-017-0572-z
URI
http://hdl.handle.net/10203/240137
Appears in Collection
RIMS Journal Papers
Files in This Item
There are no files associated with this item.
This item is cited by 28 other documents in Web of Science.
