An Improved Hybrid Genetic Algorithm with a New Local Search Procedure
One important challenge for a hybrid genetic algorithm (HGA), also called a memetic algorithm, is the tradeoff between global search and local search (LS), since the cost of an LS can be rather high. This paper proposes a novel, simplified, and efficient HGA with a new individual learning procedure that performs an LS only when the best offspring (solution) in the offspring population is also the best in the current parent population. Additionally, a new LS method is developed based on a three-directional search (TD), which is derivative-free and self-adaptive. The new HGA with two different LS methods (the TD and the Nelder-Mead simplex) is compared with a traditional HGA. Four benchmark functions are employed to illustrate the improvement of the proposed method with the new learning procedure. The results show that the new HGA greatly reduces the number of function evaluations and converges much faster to the global optimum than a traditional HGA. The TD local search method is a good choice for helping to locate a global "mountain" (or "valley") but may not outperform the Nelder-Mead method in the final fine-tuning toward the optimal solution.
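The abstract's key mechanism — running the costly local search only when the best offspring also leads the current parent population — can be sketched as a minimal, stdlib-only reconstruction. This is not the authors' code: the sphere benchmark, tournament selection, Gaussian mutation, and the plain coordinate hill-climb (standing in for the paper's TD search, whose details the abstract does not give) are all illustrative assumptions.

```python
import random

def sphere(x):
    """Sphere benchmark: global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def local_search(f, x, step=0.1, iters=50):
    """Derivative-free coordinate hill-climb; a stand-in for the paper's TD search.
    Halves the step when a full sweep yields no improvement (crudely self-adaptive)."""
    x = list(x)
    fx = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
    return x, fx

def hga(f, dim=2, pop_size=20, gens=60, seed=0):
    """Toy HGA illustrating the conditional-LS learning procedure."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(ind) for ind in pop]
    for _ in range(gens):
        best_parent = min(fit)
        offspring = []
        for _ in range(pop_size):
            a, b = rng.sample(range(pop_size), 2)      # binary tournament
            parent = pop[a] if fit[a] < fit[b] else pop[b]
            offspring.append([xi + rng.gauss(0, 0.3) for xi in parent])
        off_fit = [f(c) for c in offspring]
        best_idx = min(range(pop_size), key=off_fit.__getitem__)
        # Key idea from the abstract: invoke the expensive LS only when the best
        # offspring is also the best relative to the current parent population.
        if off_fit[best_idx] < best_parent:
            offspring[best_idx], off_fit[best_idx] = local_search(f, offspring[best_idx])
        # Elitism: keep the best parent if every offspring is worse.
        worst = max(range(pop_size), key=off_fit.__getitem__)
        bi = min(range(pop_size), key=fit.__getitem__)
        if fit[bi] < off_fit[worst]:
            offspring[worst], off_fit[worst] = list(pop[bi]), fit[bi]
        pop, fit = offspring, off_fit
    i = min(range(pop_size), key=fit.__getitem__)
    return pop[i], fit[i]
```

Because the LS fires only on generations that actually improve the incumbent best, most generations cost just one evaluation per offspring, which is the source of the reduced function-evaluation count the abstract reports.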
Main Authors: | Wen Wan, Jeffrey B. Birch
---|---
Format: | Article
Language: | English
Published: | Hindawi Limited, 2013-01-01
Series: | Journal of Applied Mathematics
Online Access: | http://dx.doi.org/10.1155/2013/103591
Affiliations: | Wen Wan, Department of Biostatistics, Virginia Commonwealth University, Richmond, VA 23298-0032, USA; Jeffrey B. Birch, Department of Statistics, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061-0439, USA
Collection: | DOAJ
ISSN: | 1110-757X, 1687-0042