A hybrid optimization method of evolutionary and gradient search

Cited 16 times in Web of Science · Cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Tahk, Min-Jea (ko)
dc.contributor.author: Woo, Hyun-Wook (ko)
dc.contributor.author: Park, Moon-Su (ko)
dc.date.accessioned: 2013-03-08T01:09:28Z
dc.date.available: 2013-03-08T01:09:28Z
dc.date.created: 2012-02-06
dc.date.issued: 2007-01
dc.identifier.citation: ENGINEERING OPTIMIZATION, v.39, no.1, pp.87-104
dc.identifier.issn: 0305-215X
dc.identifier.uri: http://hdl.handle.net/10203/91676
dc.description.abstract: This article proposes a hybrid optimization algorithm, which combines evolutionary algorithms (EA) and the gradient search technique, for optimization with continuous parameters. Inheriting the advantages of the two approaches, the new method is fast and capable of global search. The main structure of the new method is similar to that of EA except that a special individual called the gradient individual is introduced and EA individuals are located symmetrically. The gradient individual is propagated through generations by means of the quasi-Newton method. Gradient information required for the quasi-Newton method is calculated from the costs of EA individuals produced by the evolution strategies (ES). The symmetric placement of the individuals with respect to the best individual is for calculating the gradient vector by the central difference method. For the estimation of the inverse Hessian matrix, symmetric Rank-1 update shows better performance than BFGS and DFP. Numerical tests on various benchmark problems and a practical control design example demonstrate that the new hybrid algorithm gives a faster convergence rate than EA, without sacrificing the capability of global search.
dc.language: English
dc.publisher: TAYLOR & FRANCIS LTD
dc.subject: RANK-ONE UPDATE
dc.subject: LOCAL SEARCH
dc.subject: MEMETIC ALGORITHMS
dc.subject: GENETIC ALGORITHM
dc.subject: DESIGN
dc.subject: MODEL
dc.subject: COST
dc.title: A hybrid optimization method of evolutionary and gradient search
dc.type: Article
dc.identifier.wosid: 000242937600006
dc.identifier.scopusid: 2-s2.0-33845677749
dc.type.rims: ART
dc.citation.volume: 39
dc.citation.issue: 1
dc.citation.beginningpage: 87
dc.citation.endingpage: 104
dc.citation.publicationname: ENGINEERING OPTIMIZATION
dc.identifier.doi: 10.1080/03052150600957314
dc.contributor.localauthor: Tahk, Min-Jea
dc.contributor.nonIdAuthor: Woo, Hyun-Wook
dc.contributor.nonIdAuthor: Park, Moon-Su
dc.type.journalArticle: Article
dc.subject.keywordAuthor: hybrid algorithms
dc.subject.keywordAuthor: evolutionary algorithms
dc.subject.keywordAuthor: quasi-Newton method
dc.subject.keywordAuthor: evolution strategies
dc.subject.keywordPlus: RANK-ONE UPDATE
dc.subject.keywordPlus: LOCAL SEARCH
dc.subject.keywordPlus: MEMETIC ALGORITHMS
dc.subject.keywordPlus: GENETIC ALGORITHM
dc.subject.keywordPlus: DESIGN
dc.subject.keywordPlus: MODEL
dc.subject.keywordPlus: COST
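The abstract describes a gradient individual that is advanced by a quasi-Newton step, using a central-difference gradient computed from individuals placed symmetrically around the best point and a symmetric rank-one (SR1) update of the inverse Hessian. The following is a minimal sketch of that local-search component only; the EA/ES global-search machinery is omitted, and all parameter names, defaults, and the backtracking safeguard are assumptions, not the authors' published settings.

```python
import numpy as np

def gradient_individual_step_loop(f, x0, h=1e-4, iters=50):
    """Sketch of the gradient-individual update described in the abstract
    (assumed details): symmetric points x +/- h*e_i give a central-difference
    gradient, an SR1 update maintains an inverse-Hessian estimate H, and the
    individual takes a quasi-Newton step -H @ g with simple backtracking."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)   # the "gradient individual"
    H = np.eye(n)                     # inverse-Hessian estimate
    x_prev, g_prev = None, None
    for _ in range(iters):
        # Central-difference gradient from symmetrically placed samples.
        g = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        if g_prev is not None:
            # Symmetric rank-one (SR1) update of the inverse Hessian:
            # H += (s - H y)(s - H y)^T / ((s - H y)^T y), skipped when
            # the denominator is too small (standard SR1 safeguard).
            s = x - x_prev
            y = g - g_prev
            v = s - H @ y
            denom = v @ y
            if abs(denom) > 1e-12:
                H += np.outer(v, v) / denom
        # Quasi-Newton step with a crude backtracking line search
        # (an assumption; the paper's step-control details are not
        # given in the abstract).
        d = -H @ g
        step = 1.0
        while f(x + step * d) > f(x) and step > 1e-8:
            step *= 0.5
        x_prev, g_prev = x.copy(), g.copy()
        x = x + step * d
    return x, f(x)
```

On a convex quadratic, e.g. `f(z) = sum((z - 1)**2)` started from `[3.0, -2.0]`, this sketch converges to the minimizer at `[1, 1]`; in the hybrid method proper, this individual would evolve alongside the EA population, which supplies the global-search capability.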
Appears in Collection: AE-Journal Papers (Journal Papers)
