Study on the new learning algorithm and stability of feedforward and diagonal recurrent neural networks

Neural networks are parallel computational models composed of densely interconnected adaptive processing units. These networks are parallel implementations of nonlinear static or dynamic systems. A very important feature of these networks is their adaptive nature, where learning by example replaces programming in solving problems. This feature makes such computational models very appealing in application domains where one has little or incomplete understanding of the problem to be solved but where training data are readily available. Another feature is the intrinsic parallel architecture, which allows for fast computation of solutions when these networks are implemented on parallel digital computers or, ultimately, in customized hardware.

Neural networks are viable computational models for a wide variety of problems, including pattern classification, speech synthesis and recognition, adaptive interfaces between humans and complex physical systems, function approximation, image compression, associative memory, clustering, forecasting and prediction, combinatorial optimization, nonlinear system modeling, and control. These networks are "neural" in the sense that they may have been inspired by neuroscience, not necessarily because they are faithful models of biological neural or cognitive phenomena. There are a number of kinds of neural networks, such as the Hopfield network, the multilayer perceptron, the radial basis function network, and so on.

In this thesis, we present learning algorithms for feedforward neural networks and generalized diagonal recurrent neural networks, and analyze the convergence and stability properties of these algorithms. In recent years, many researchers have studied feedforward multilayer neural networks quite extensively, and various fruitful results have been obtained. In particular, the feedforward neural network (FNN) with the backpropagation (BP) method proposed by Rumelhart and McClelland is one of the most ...
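The abstract's notion of a feedforward network trained by backpropagation can be illustrated with a minimal sketch. This is not the thesis's algorithm: the network size (one hidden layer of 4 sigmoid units), the learning rate, and the toy XOR task are all illustrative assumptions.

```python
import numpy as np

# Minimal feedforward network (one hidden layer) trained with plain
# backpropagation / gradient descent on XOR. All hyperparameters here
# are illustrative assumptions, not taken from the thesis.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through both sigmoid layers (the BP update)
    d_out = (out - y) * out * (1 - out)     # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)      # error propagated to hidden layer

    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"initial loss {losses[0]:.3f} -> final loss {losses[-1]:.3f}")
```

The thesis's contribution concerns the convergence and stability of such update rules; this sketch only shows the basic gradient-descent loop whose behavior those analyses address.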
Advisors
Park, Dong-Jo (박동조)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
1999
Identifier
150988/325007 / 000945088
Language
eng
Description

Doctoral thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Department of Electrical and Electronic Engineering, 1999.2, [ v, 106 p. ]

Keywords

Convergence; Recurrent neural networks; Constrained optimization; Neural networks; Stability

URI
http://hdl.handle.net/10203/36487
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=150988&flag=dissertation
Appears in Collection
EE-Theses_Ph.D. (Doctoral Theses)
Files in This Item
There are no files associated with this item.
