Journal: Highly Available Systems. № 1, 2023
Article in issue:
Investigation of algorithms for cyclic search for additional solutions when optimizing the order of hyperparameters by the ant colony method
Type of article: scientific article
DOI: https://doi.org/10.18127/j20729472-202301-05
UDC: 519.6
Authors:

I.N. Sinitsyn1, Yu.P. Titov2 

1,2 Federal Research Center "Computer Science and Control" of RAS (Moscow, Russia)
1,2 Moscow Aviation Institute (Moscow, Russia)
 

Abstract:

The ant colony method is usually applied to find a rational solution to which the agents converge. However, the method can also be applied to the problem of directed enumeration, since its probabilistic path selection identifies a set of good solutions well. The paper considers modifications of the ant colony method that allow an agent to find a new solution when, at a given iteration, it arrives at a solution that has already been considered on the computing cluster. Two algorithms are examined: a repeated cyclic search for a new, not yet considered, solution, and a search for such a solution using a tree-traversal procedure. The operation of the ant colony method was limited to a fixed number of iterations. If the established limit does not allow all solutions (all combinations of parameters) to be considered, tree traversal finds new solutions quickly; if the limit does allow all solutions to be considered, the repeated cyclic search for a new solution is more efficient. The proposed algorithms guarantee that each agent finds a new solution while enumerating hyperparameters, and make it possible to define an exact stopping criterion for the algorithm based on a fixed number of iterations. At the same time, in the early stages of operation the proposed modification does not require large time expenditures to search for additional solutions. Recommendations are given for choosing the modifications of the ant colony algorithm that provide the best speed and accuracy.
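The repeated cyclic search described above can be illustrated with a minimal Python sketch. This is not the authors' implementation: the function names, parameters, and the pheromone-update rule are illustrative assumptions. Each agent samples one value per hyperparameter in proportion to pheromone levels; if the sampled combination was already evaluated, the agent cyclically re-samples until it finds a new combination or exhausts a retry budget.

```python
import random

def sample_combination(pheromone, rng):
    """Pick one value index per hyperparameter, proportional to pheromone."""
    combo = []
    for trail in pheromone:
        total = sum(trail)
        r = rng.random() * total
        acc = 0.0
        for idx, tau in enumerate(trail):
            acc += tau
            if r <= acc:
                combo.append(idx)
                break
        else:
            combo.append(len(trail) - 1)
    return tuple(combo)

def ant_colony_unique(num_values, num_agents, num_iters, evaluate,
                      evaporation=0.1, deposit=1.0, retries=50, seed=0):
    """Ant colony enumeration that insists on not-yet-considered combinations."""
    rng = random.Random(seed)
    pheromone = [[1.0] * n for n in num_values]  # one trail per hyperparameter
    seen = set()                                  # combinations already evaluated
    best, best_score = None, float("inf")
    for _ in range(num_iters):
        for _ in range(num_agents):
            combo = sample_combination(pheromone, rng)
            # repeated cyclic search: re-sample while the combination is known
            tries = 0
            while combo in seen and tries < retries:
                combo = sample_combination(pheromone, rng)
                tries += 1
            if combo in seen:
                continue  # retry budget exhausted for this agent
            seen.add(combo)
            score = evaluate(combo)
            if score < best_score:
                best, best_score = combo, score
            # evaporate each trail, then deposit on the chosen edges
            for h, idx in enumerate(combo):
                pheromone[h] = [t * (1 - evaporation) for t in pheromone[h]]
                pheromone[h][idx] += deposit / (1 + score)
    return best, best_score, seen
```

Because every evaluated combination is recorded in `seen`, each successful agent contributes a genuinely new solution, which is what makes an iteration-count stopping criterion exact; the tree-traversal variant from the paper would replace the re-sampling loop with a deterministic descent that skips exhausted subtrees.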

Pages: 59-73
For citation

Sinitsyn I.N., Titov Yu.P. Investigation of algorithms for cyclic search for additional solutions when optimizing the order of hyperparameters by the ant colony method. Highly Available Systems. 2023. V. 19. № 1. P. 59−73. DOI: https://doi.org/10.18127/j20729472-202301-05 (in Russian)

References
  1. Colorni A., Dorigo M., Maniezzo V. Distributed Optimization by Ant Colonies // Proc. First Eur. Conf. on Artificial Life, Paris, France, F. Varela and P. Bourgine (Eds.), Elsevier Publishing. 1992. P. 134–142.
  2. Dorigo M., Stützle T. Ant Colony Optimization // MIT Press. 2004. P. 321.
  3. Pasia J.M., Hartl R.F., Doerner K.F. Solving a Bi-objective Flowshop Scheduling Problem by Pareto-Ant Colony Optimization. M. Dorigo et al. (Eds.) // ANTS 2006, LNCS 4150. 2006. P. 294–305.
  4. Tufteland T., Ødesneltvedt G., Goodwin M. Optimizing PolyACO Training with GPU-Based Parallelization. M. Dorigo et al. (Eds.) // ANTS 2016, LNCS 9882. 2016. P. 233–240. DOI: 10.1007/978-3-319-44427-7_20
  5. Parpinelli R., Lopes H., Freitas A. Data mining with an ant colony optimization algorithm // IEEE Trans. Evol. Comput. 2002. 6(4). P. 321–332.
  6. Bremer J., Lehnhoff S. Constrained Scheduling of Step-Controlled Buffering Energy Resources with Ant Colony Optimization // ANTS Conference. 2020.
  7. Coates A., Ng A.Y. The importance of encoding versus training with sparse coding and vector quantization // ICML'11: Proceedings of the 28th International Conference on International Conference on Machine Learning. June 2011. P. 921–928.
  8. Feurer M., Hutter F. Hyperparameter Optimization. In: Hutter F., Kotthoff L., Vanschoren J. (eds) Automated Machine Learning // The Springer Series on Challenges in Machine Learning. Springer, Cham. 2019. https://doi.org/10.1007/978-3-030-05318-5_1
  9. Koehrsen W. A conceptual explanation of Bayesian hyperparameter optimization for machine learning. 2018 (accessed 23.12.2022: https://towardsdatascience.com/a-conceptual-explanation-of-bayesian-model-based-hyperparameter-optimization-for-machine-learning-b8172278050f)
  10. Bergstra J.S., Bardenet R., Bengio Y., Kégl B. Algorithms for hyper-parameter optimization // In Advances in Neural Information Processing Systems. 2011. P. 2546–2554.
  11. Akiba T., Sano S., Yanase T., Ohta T., Koyama M. Optuna: A next-generation hyperparameter optimization framework // In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2019. P. 2623–2631. https://doi.org/10.48550/arXiv.1907.10902
  12. https://krasserm.github.io/2018/03/21/bayesian-optimization (accessed 23.12.2022)
  13. https://krasserm.github.io/2018/03/19/gaussian-processes (accessed 23.12.2022)
  14. https://towardsdatascience.com/a-conceptual-explanation-of-bayesian-model-based-hyperparameter-optimization-for-machine-learning-b8172278050f (accessed 23.12.2022)
  15. Dewancker I., McCourt M., Clark S. Bayesian Optimization Primer (accessed 23.12.2022: https://static.sigopt.com/b/20a144d208ef255d3b981ce419667ec25d8412e2/static/pdf/SigOpt_Bayesian_Optimization_Primer.pdf)
  16. IBM Bayesian Optimization Accelerator 1.1 helps identify optimal product designs faster with breakthrough performance for scientific discovery and high-performance computing simulation (accessed 23.12.2022: https://www.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/6/877/ENUSZP20-0186/index.html&request_locale=en)
  17. Martens D., De Backer M., Haesen R., Vanthienen J., Snoeck M., Baesens B. Classification with ant colony optimization // IEEE Trans. Evol. Comput. 2007. 11(5). P. 651–665.
  18. Sinitsyn I.N., Titov Yu.P. Optimization of the order of hyperparameters of a computing cluster by the ant colony method // Highly Available Systems. 2022. V. 18. № 3. P. 23–37. DOI: 10.18127/j20729472-202203-02 (in Russian)
  19. Karpenko A.P., Sinitsyn I.N. Bionics and highly available systems // Highly Available Systems. 2022. V. 18. № 2. P. 25−41. DOI: https://doi.org/10.18127/j20729472-202202-02 (in Russian)
  20. Sinitsyn I.N., Titov Yu.P. Development of stochastic algorithms of ant organization // Bionics – 60 years. Results and prospects: Proc. of the First Int. scientific-practical conf., December 17–19, 2021, Moscow / Ed. A.P. Karpenko. M.: Association of Technical Universities. 2022. P. 210–220. DOI: 10.53677/9785919160496_210_220 (in Russian)
  21. Hahulin G.F., Titov Yu.P. Decision support system for supplies of spare parts for military aircraft // Izvestiya of the Samara Scientific Center of the Russian Academy of Sciences. 2014. V. 16. № 1-5. P. 1619–1623 (in Russian)
  22. Titov Yu.P. Modifications of the ant colony method for developing software for solving multi-criteria supply management problems // Modern Information Technologies and IT Education. 2017. V. 13. № 2. P. 64–74. DOI: 10.25559/SITITO.2017.2.222 (in Russian)
  23. Sudakov V.A., Batkovskiy A.M., Titov Yu.P. Algorithms for accelerating the modification of the ant colony method for finding a rational assignment of employees to tasks with fuzzy execution time // Modern Information Technologies and IT Education. 2020. V. 16. № 2. P. 338–350. DOI: 10.25559/SITITO.16.202002.338-350 (in Russian)
  24. Sinitsyn I.N., Titov Yu.P. Instrumental software for analysis and synthesis of stochastic high-availability systems (XV) // Highly Available Systems. 2021. V. 17. № 4. P. 24–33. DOI: 10.18127/j20729472-202104-02. EDN YEGVMR (in Russian)
Date of receipt: 13.02.2023
Approved after review: 27.02.2023
Accepted for publication: 01.03.2023