Large-scale optimization: a subgradient approach (A study on solution methods for large-scale optimization models: a subgradient approach)

Abstract
Most large-scale problems have special structure. A linear programming problem may have a block structure with a relatively small number of interactions between the subunits, and some "hard" combinatorial problems can be viewed as "easy" problems complicated by a relatively small set of side constraints. Efforts to exploit such structures often lead to the formulation of nondifferentiable optimization (NDO) problems. Evaluating the objective function of these induced NDO problems is time-consuming even when the number of variables is small. To date, the subgradient method has been the most widely used approach for solving the induced NDO problems. Its convergence, however, slows down drastically in regions where the gradient of the objective function varies rapidly or is discontinuous, and this difficulty becomes especially serious when solving an induced problem. To resolve it, Poljak and Camerini et al. proposed, in their improved subgradient method and modified gradient method respectively, taking a suitable combination of the subgradients identified during the process and using it as the search direction. Their suggestion is in line with the current trend in NDO, and it seemed plausible that their algorithms would perform better, but they did not provide concrete theoretical arguments that the algorithms are superior to the subgradient method. Moreover, the algorithms require the optimal value of the problem to be known in advance, which makes them inapplicable to the induced NDO problems. For these reasons the algorithms have received little attention.

In this thesis we show theoretically that the improved subgradient method and the modified gradient method are superior to the subgradient method. That is, we show that the iterate produced by these methods is closer to the optimal solution than that produced by the ordinary subgradient method. We also show that the direction of the methods forms a smaller angle with the d...
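To make the idea concrete, below is a small illustrative Python sketch (not from the thesis). A hypothetical piecewise-linear convex test function stands in for an induced NDO objective, `plain_subgradient` is the ordinary method with diminishing steps, and `deflected_subgradient` combines the current subgradient with the previous search direction in a Camerini-Fratta-Maffioli style deflection, using a Polyak-type step length that presumes the optimal value (here merely estimated) is available, as the abstract notes. The function names, test data, and parameter values are illustrative assumptions.

```python
import numpy as np

# Hypothetical piecewise-linear convex objective f(x) = max_i (a_i . x + b_i),
# the typical shape of objectives induced by Lagrangian relaxation or
# decomposition of structured large-scale problems.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)

def f_and_subgrad(x):
    """Return f(x) and one subgradient (the gradient of an active affine piece)."""
    vals = A @ x + b
    i = int(np.argmax(vals))
    return vals[i], A[i]

def plain_subgradient(x0, iters=5000):
    """Ordinary subgradient method with normalized diminishing steps 1/k.
    Used here only to produce a rough estimate of the optimal value."""
    x = np.asarray(x0, dtype=float)
    best = np.inf
    for k in range(1, iters + 1):
        fx, g = f_and_subgrad(x)
        best = min(best, fx)
        x = x - (1.0 / k) * g / (np.linalg.norm(g) + 1e-12)
    return best

def deflected_subgradient(x0, f_star, gamma=1.5, lam=1.0, iters=500):
    """Subgradient iteration whose search direction combines the current
    subgradient with the previous direction (Camerini-Fratta-Maffioli-style
    deflection); gamma = 0 recovers the ordinary subgradient method.
    The step length lam * (f(x) - f_star) / ||d||^2 assumes, as the abstract
    points out, that the optimal value f_star is known (or estimated)."""
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)
    best = np.inf
    for _ in range(iters):
        fx, g = f_and_subgrad(x)
        best = min(best, fx)
        # Deflect only when the new subgradient forms an obtuse angle with the
        # previous direction, i.e. when a pure subgradient step would zigzag.
        dot = float(g @ d_prev)
        beta = max(0.0, -gamma * dot / float(d_prev @ d_prev)) if d_prev.any() else 0.0
        d = g + beta * d_prev
        if not d.any():
            break
        step = lam * max(fx - f_star, 0.0) / float(d @ d)
        x = x - step * d
        d_prev = d
    return best

if __name__ == "__main__":
    f_star_est = plain_subgradient(np.zeros(5))
    print("plain subgradient estimate of f*:", f_star_est)
    print("deflected method best value     :", deflected_subgradient(np.zeros(5), f_star_est))
```

Whether the deflected variant helps on a given instance depends on the problem; the thesis's contribution is the theoretical comparison of the two directions, not a toy experiment like this one.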
Advisors
Kim, Se-Hun (김세헌)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
1988
Identifier
61225/325007 / 000805008
Language
eng
Description

Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology: Department of Management Science, 1988.2, [ v, 129 p. ]

URI
http://hdl.handle.net/10203/43663
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=61225&flag=dissertation
Appears in Collection
MG-Theses_Ph.D. (Doctoral theses)
Files in This Item
There are no files associated with this item.
