Journal: Highly Available Systems, № 2, 2023
Article in issue:
Investigation of the possibility of obtaining all solutions by the method of ant colonies for the problem of optimizing the order of hyperparameters
Type of article: scientific article
DOI: https://doi.org/10.18127/j20729472-202302-05
UDC: 519.6
Authors:

I.N. Sinitsyn1, Yu.P. Titov2

1,2 FRC “Informatics and Computer Science” of the RAS (Moscow, Russia)
 

Abstract:

When investigating high availability systems, the task arises of determining an optimal or rational set of system parameters (hyperparameters). For such problems, all combinations of hyperparameter values may have to be enumerated so that the optimal one is not missed. Reordering the parameter values makes it possible to obtain rational solutions at early stages, after which the search for new solutions can be stopped. A modification of the ant colony method is proposed for reordering the sets of parameter values.
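A minimal sketch of the basic idea, in Python, assuming one layer of vertices per hyperparameter and a pheromone-proportional choice of a value in each layer; the parameter names, value lists, and update rule are illustrative assumptions, not the authors' exact formulation:

```python
import random

# Illustrative hyperparameter value lists (assumed, not from the article).
HYPERPARAMS = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size":    [16, 32, 64, 128],
    "num_layers":    [1, 2, 3],
}

# One pheromone value per (parameter, value) edge; all start equal.
pheromone = {p: [1.0] * len(vals) for p, vals in HYPERPARAMS.items()}

def build_solution(alpha=1.0):
    """An agent walks layer by layer (one layer per hyperparameter)
    and picks a value with probability proportional to pheromone**alpha."""
    choice = {}
    for param, values in HYPERPARAMS.items():
        weights = [tau ** alpha for tau in pheromone[param]]
        idx = random.choices(range(len(values)), weights=weights)[0]
        choice[param] = values[idx]
    return choice

def deposit(choice, quality, rho=0.1):
    """Evaporate pheromone on all edges, then reinforce the edges
    used by a solution in proportion to its quality."""
    for param, values in HYPERPARAMS.items():
        for i in range(len(values)):
            pheromone[param][i] *= (1.0 - rho)
        pheromone[param][values.index(choice[param])] += quality
```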

Purpose: to investigate the situation in which most sets of parameter values have already been considered on the computing cluster, so that the ant colony method has to find the remaining solutions, which have a low probability of being chosen, and to work out recommendations on the rules of behavior for an agent that has found an already considered set of parameter values.

The "Carrom table function" bank with 48400 variants of parameter value sets is considered. For this bank, the best results were shown by the modification of the method of ant colonies with a repeated cyclic search for new solutions by the agent who found the already considered solution at the iteration. This algorithm, when changing additional information on the number of visits to the vertex in the probability formula, allows you not to limit the search for additional solutions and when considering 98% of solutions, requires less than 10 additional iterations. Modification using tree traversal, due to the lack of additional information, requires more before-full iterations and more time to search for a set of parameter values that has not yet been considered.

The developed and implemented modification of the ant colony method makes it possible to perform a directed enumeration of all sets of parameter values. The optimal set of parameters is found after considering 1% of all solutions. To improve the algorithm further, various kinds of additional information can be added to the formula for the probability of an agent choosing a vertex.
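A hedged sketch of the two ingredients compared in the abstract, continuing the code above: the selection weight can be divided by one plus the visit count so that rarely visited vertices become more likely, and an agent that produces an already considered set retries within the same iteration (repeated cyclic search). The weighting form and the retry limit are assumptions for illustration, not the article's exact probability formula:

```python
visits = {p: [0] * len(vals) for p, vals in HYPERPARAMS.items()}
seen = set()  # sets of parameter values already considered

def build_solution_with_visits(alpha=1.0):
    """Selection weight pheromone**alpha / (1 + visit count): an assumed
    way to bias agents toward values that have been chosen less often."""
    choice = {}
    for param, values in HYPERPARAMS.items():
        weights = [pheromone[param][i] ** alpha / (1 + visits[param][i])
                   for i in range(len(values))]
        idx = random.choices(range(len(values)), weights=weights)[0]
        visits[param][idx] += 1
        choice[param] = values[idx]
    return choice

def find_new_solution(max_retries=50):
    """Repeated cyclic search: if the agent produces an already considered
    set, it builds another one within the same iteration."""
    for _ in range(max_retries):
        choice = build_solution_with_visits()
        key = tuple(sorted(choice.items()))
        if key not in seen:
            seen.add(key)
            return choice
    return None  # give up after max_retries attempts
```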

Pages: 55-69
For citation

Sinitsyn I.N., Titov Yu.P. Investigation of the possibility of obtaining all solutions by the method of ant colonies for the problem of optimizing the order of hyperparameters. Highly Available Systems. 2023. V. 19. № 2. P. 55−69. DOI: https://doi.org/10.18127/j20729472-202302-05 (in Russian)

References
  1. Colorni A., Dorigo M., Maniezzo V. Distributed Optimization by Ant Colonies. Proc. First Eur. Conf. on Artificial Life, Paris, France. Varela F., Bourgine P. (Eds.). Elsevier Publishing. 1992. P. 134–142.
  2. Dorigo M., Stützle T. Ant Colony Optimization. MIT Press. 2004. 321 p.
  3. Pasia J.M., Hartl R.F., Doerner K.F. Solving a Bi-objective Flowshop Scheduling Problem by Pareto-Ant Colony Optimization. Dorigo M. et al. (Eds.). ANTS 2006. LNCS 4150. 2006. P. 294–305.
  4. Tufteland T., Odesneltvedt G., Goodwin M. Optimizing PolyACO Training with GPU-Based Parallelization. Dorigo M. et al. (Eds.). ANTS 2016. LNCS 9882. 2016. P. 233–240. DOI: 10.1007/978-3-319-44427-7_20
  5. Parpinelli R., Lopes H., Freitas A. Data mining with an ant colony optimization algorithm. IEEE Trans. Evol. Comput. 6(4). 2002. P. 321–332.
  6. Jörg B., Lehnhoff S. Constrained Scheduling of Step-Controlled Buffering Energy Resources with Ant Colony Optimization. ANTS Conference. 2020.
  7. Martens D., De Backer M., Haesen R., Vanthienen J., Snoeck M., Baesens B. Classification with ant colony optimization. IEEE Trans. Evol. Comput. 2007. V. 11(5). P. 651–665.
  8. Bergstra J.S., Bardenet R., Bengio Y., Kégl B. Algorithms for hyper-parameter optimization. Advances in Neural Information Processing Systems. 2011. P. 2546–2554.
  9. Akiba T., Sano S., Yanase T., Ohta T., Koyama M. Optuna: A next-generation hyperparameter optimization framework. Proc. 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2019. P. 2623–2631. DOI: https://doi.org/10.48550/arXiv.1907.10902
  10. Koehrsen W. A conceptual explanation of Bayesian hyperparameter optimization for machine learning. 2018. (Accessed 23.12.2022: https://towardsdatascience.com/a-conceptual-explanation-of-bayesian-model-based-hyperparameter-optimization-for-machine-learning-b8172278050f)
  11. Dewancker I., McCourt M., Clark S. Bayesian Optimization Primer. (Accessed 23.12.2022: https://static.sigopt.com/b/20a144d208ef255d3b981ce419667ec25d8412e2/static/pdf/SigOpt_Bayesian_Optimization_Primer.pdf)
  12. IBM Bayesian Optimization Accelerator 1.1 helps identify optimal product designs faster with breakthrough performance for scientific discovery and high-performance computing simulation. (Accessed 23.12.2022: https://www.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/6/877/ENUSZP20-0186/index.html&request_locale=en)
  13. Mishra S.K. Some New Test Functions for Global Optimization and Performance of Repulsive Particle Swarm Method. University Library of Munich, Germany, MPRA Paper. 2006. DOI: 10.2139/ssrn.926132
  14. Layeb A. New hard benchmark functions for global optimization. 2022.
  15. Jamil M., Yang X.-S. Benchmark functions. 2013.
  16. Sinicyn I.N., Titov Yu.P. Optimizaciya poryadka sledovaniya giperparametrov vychislitel'nogo klastera metodom murav'inyh kolonij. Sistemy vysokoj dostupnosti. 2022. T. 18. № 3. S. 23–37. DOI 10.18127/j20729472-202203-02 (in Russian).
  17. Karpenko A.P., Sinicyn I.N. Bionika i sistemy vysokoj dostupnosti. Sistemy vysokoj dostupnosti. 2022. T. 18. № 2. S. 25–41. DOI: https://doi.org/10.18127/j20729472-202202-02 (in Russian).
  18. Sinicyn I.N., Titov Yu.P. Razvitie stohasticheskih algoritmov murav'inoj organizacii. Bionika – 60 let. Itogi i perspektivy: Sbornik statej Pervoj Mezhdunarodnoj nauchno-prakticheskoj konferencii, 17–19 dekabrya 2021 goda, g. Moskva. Pod red. A.P. Karpenko. M.: Associaciya tekhnicheskih universitetov. 2022. S. 210–220. DOI: 10.53677/9785919160496_210_220 (in Russian).
  19. Hahulin G.F., Titov Yu.P. Sistema podderzhki reshenij postavok zapasnyh chastej letatel'nyh apparatov voennogo naznacheniya. Izv. Samarskogo nauchnogo centra Rossijskoj akademii nauk. 2014. T. 16. № 1–5. S. 1619–1623 (in Russian).
  20. Titov Yu.P. Modifikacii metoda murav'inyh kolonij dlya razrabotki programmnogo obespecheniya resheniya zadach mnogokriterial'nogo upravleniya postavkami. Sovremennye informacionnye tekhnologii i IT-obrazovanie. 2017. T. 13. № 2. S. 64–74. DOI 10.25559/SITITO.2017.2.222  (in Russian).
  21. Sudakov V.A., Bat'kovskij A.M., Titov Yu.P. Algoritmy uskoreniya raboty modifikacii metoda murav'inyh kolonij dlya poiska racional'nogo naznacheniya sotrudnikov na zadachi s nechetkim vremenem vypolneniya. Sovremennye informacionnye tekhnologii i IT-obrazovanie. 2020. T. 16. № 2. S. 338–350. doi:10.25559/SITITO.16.202002.338-350 (in Russian)
  22. Sinicyn I.N., Titov Yu.P. Instrumental'noe programmnoe obespechenie analiza i sinteza stohasticheskih sistem vysokoj dostupnosti (XV). Sistemy vysokoj dostupnosti. 2021. T. 17. № 4. S. 24–33. DOI 10.18127/j20729472-202104-02 (in Russian).
Date of receipt: 06.04.2023
Approved after review: 19.04.2023
Accepted for publication: 27.04.2023