artificial neural networks
particle swarm optimization
parallel island algorithm
L. G. Komartsova, D. S. Kadnikov
Designing an artificial neural network (ANN) for recognition and classification tasks is a complex problem, chiefly because a separate ANN must be constructed for each individual problem. This is why the study of new algorithms for optimizing neural network topology and other parameters is important. At present, genetic algorithms (GA) are widely applied to many such problems that are hard to solve efficiently by other methods. Hybrid and parallel GAs make it possible to increase solution quality and/or decrease the time needed to find it. Studying the efficiency of such algorithms, in order to select the specific parameter values that yield the best solutions, is therefore of great value.
The article studies a hybrid optimization algorithm based on a genetic algorithm and the particle swarm optimization (PSO) method, designed to select the parameters of a multilayer perceptron and yielding a smaller recognition error on test problems than other known algorithms. The problems of ANN synthesis are described, and criteria for evaluating the synthesized network are given. The main ideas of the PSO method and the hybrid algorithm are outlined. A distinctive feature of the hybrid algorithm is that it models the phenomenon of individual maturing observed in nature, i.e. the acquisition and accumulation of experience through interaction with the environment and with other individuals in the population. In the hybrid algorithm discussed, before the usual genetic operators are applied and a new population is formed, a new operator, Enhance, which models this phenomenon, is applied to the chromosomes.
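The structure of one generation of such a hybrid scheme can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the helper names (enhance, crossover, mutate) and the elitist selection scheme are assumptions introduced for illustration.

```python
import random

def hybrid_generation(population, fitness, enhance, crossover, mutate, elite=2):
    """One generation of a hybrid GA-PSO scheme (illustrative sketch):
    the Enhance operator is applied to chromosomes first, modeling the
    "maturing" of individuals, and only then do the usual genetic
    operators form the next population."""
    # 1. Enhance: refine each chromosome using accumulated experience.
    enhanced = [enhance(c) for c in population]
    # 2. Selection: rank by fitness and carry over the best (elitism).
    ranked = sorted(enhanced, key=fitness, reverse=True)
    next_pop = ranked[:elite]
    # 3. Crossover and mutation of fitter parents fill the remainder.
    while len(next_pop) < len(population):
        p1, p2 = random.sample(ranked[: len(ranked) // 2], 2)
        next_pop.append(mutate(crossover(p1, p2)))
    return next_pop
```

The essential point is the ordering: Enhance precedes selection, crossover, and mutation, so every chromosome "matures" before competing for a place in the new population.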
Thus, at each iteration the parameters that affect the direction of the search for the optimal solution (current motion, the particle's own memory, and the influence of the swarm) are taken into account. Genetic algorithms possess inherent internal parallelism. One parallelization method is the simultaneous evolution of several populations, which interact through a migration mechanism. The parallel algorithm, its features, and its parameters are described.
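The three influences on the search direction correspond to the three terms of the standard PSO velocity update. The sketch below uses common (assumed) coefficient values w, c1, c2; the paper's specific settings are not given here.

```python
import random

def pso_update(position, velocity, personal_best, global_best,
               w=0.7, c1=1.5, c2=1.5):
    """Standard PSO step: the new velocity combines current motion
    (inertia term w*v), the particle's own memory (pull toward its
    personal best), and the swarm's influence (pull toward the global
    best). Coefficient values here are illustrative."""
    new_velocity, new_position = [], []
    for x, v, pb, gb in zip(position, velocity, personal_best, global_best):
        nv = (w * v
              + c1 * random.random() * (pb - x)   # own memory
              + c2 * random.random() * (gb - x))  # swarm influence
        new_velocity.append(nv)
        new_position.append(x + nv)
    return new_position, new_velocity
```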
The efficiency of the studied algorithms was tested on two well-known benchmark problems: Iris classification (3 classes, 3 parameters) and Wine classification (3 classes, 13 parameters). The recognition error of the ANN must be minimal. The topology of the ANN used for these recognition tasks is presented. The fitness function is the percentage of test-set instances correctly recognized by the ANN.
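A fitness function of this form, i.e. the percentage of correctly recognized test instances, can be written as a short sketch (the ann_predict callable is an assumed stand-in for the trained network):

```python
def fitness(ann_predict, test_set):
    """Fitness = percentage of test instances the ANN classifies
    correctly; the recognition error is then 100 - fitness.
    test_set is a sequence of (features, label) pairs."""
    correct = sum(1 for features, label in test_set
                  if ann_predict(features) == label)
    return 100.0 * correct / len(test_set)
```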
Comparative test results for the studied algorithms are presented. To obtain reliable results and reduce the influence of probabilistic parameters, each algorithm was run 100 times and the resulting values were averaged. The change in the average fitness value is shown.
Genetic algorithms showed their efficiency in solving the ANN learning problem. Compared with the classic back-propagation (BP) algorithm, GAs reduce the time needed to obtain a solution, especially for tasks with a large number of parameters, and help avoid local extrema.
The parallel GA shows good dynamics in finding suboptimal solutions and has better running times than HGAPSO on the considered tasks. The simple GA is fast and finds fairly good solutions in a relatively short time, but its results vary greatly from run to run and its average results are not very good. The parallel and hybrid GAs are able to find the best-quality solutions, but they demand more computational resources.