Journal Highly Available Systems № 3, 2022
Article in issue:
Optimization of the order of hyperparameters of a computational cluster by the ant colony method
Type of article: scientific article
DOI: https://doi.org/10.18127/j20729472-202203-02
UDC: 519.6
Authors:

I.N. Sinitsyn1, Yu.P. Titov2

1,2 Federal Research Center «Informatics and Management» RAS (Moscow, Russia),

1,2 Moscow Aviation Institute (Moscow, Russia)

Abstract:

Modern methods for studying high-availability systems rely on complex analytical and simulation models. The purpose of such models is to determine the values of the criteria under various system parameters in order to find rational or optimal parameter values. The optimization process is usually carried out by the user or developer of the system. Such tasks are often computed on clusters: the cluster is fed a collection of parameter sets, which it evaluates sequentially.
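
As a minimal illustration (ours, not taken from the article), the following Python sketch shows how such a task list is typically formed: every combination of hyperparameter values becomes one parameter set, and the resulting list is handed to the cluster, which evaluates the sets one after another. The parameter names and values here are assumptions chosen only for illustration.

from itertools import product

# Hypothetical hyperparameters of an analytical or simulation model.
param_grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
    "num_layers": [2, 3, 4],
}

# All parameter sets; the order of this list is exactly what the ant colony
# modification discussed below is intended to optimize.
parameter_sets = [dict(zip(param_grid, values))
                  for values in product(*param_grid.values())]

for task in parameter_sets:
    # The cluster would evaluate each parameter set sequentially in this order.
    print(task)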

The article considers the possibility of applying a modification of the ant colony method to optimize the order in which the parameter sets from this collection are evaluated, so that rational solutions are found on the cluster more quickly.

A modification of the ant colony method that allows a directed search of parameter values is proposed. For this modification, the structure and software implementation of the parametric graph and the use of hash tables in the algorithm have been developed. We show several ways to improve the quality of the ant colony method: pheromone is not deposited on the graph by ants whose path has already been considered; the parametric graph is returned to its initial state if none of the ants in an iteration has found a new solution; and an additional parameter, the number of ant visits to a graph vertex, is used in the probabilistic choice of the ant's path.
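
To make these modifications concrete, below is a minimal Python sketch (our illustration, not the authors' implementation). The parametric graph is represented as layers of candidate values, a hash table (seen) stores parameter sets that have already been evaluated, ants that repeat an evaluated path deposit no pheromone, the pheromone and visit counters are reset to the initial state when an iteration yields no new solutions, and the number of visits to a vertex enters the transition probability as a penalty term. The evaluation function, the parameter values and the exact probability formula are assumptions made for illustration.

import math
import random

# Hypothetical parametric graph: one layer per hyperparameter, one vertex per value.
LAYERS = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
    "num_layers": [2, 3, 4],
}
ALPHA = 1.0   # weight of pheromone in the transition probability
BETA = 0.5    # weight of the visit-count penalty (assumed form)
RHO = 0.1     # pheromone evaporation rate
TAU0 = 1.0    # initial pheromone level
N_ANTS, N_ITER = 10, 50

def evaluate(params):
    # Hypothetical black-box criterion standing in for a cluster run (to be minimized).
    return (math.log10(params["learning_rate"]) + 3) ** 2 + params["num_layers"] / params["batch_size"]

def initial_state():
    pheromone = {p: [TAU0] * len(v) for p, v in LAYERS.items()}
    visits = {p: [0] * len(v) for p, v in LAYERS.items()}
    return pheromone, visits

def pick(pheromone_row, visit_row):
    # The transition probability combines pheromone with a penalty on frequently
    # visited vertices; the abstract only says visits are used, the exact form is assumed.
    weights = [(t ** ALPHA) / ((1 + v) ** BETA) for t, v in zip(pheromone_row, visit_row)]
    return random.choices(range(len(weights)), weights=weights)[0]

def run():
    pheromone, visits = initial_state()
    seen = {}                         # hash table of already considered parameter sets
    best_path, best_score = None, float("inf")
    for _ in range(N_ITER):
        found_new = False
        iteration_paths = []
        for _ in range(N_ANTS):
            path = {}
            for p in LAYERS:          # the ant walks the layers of the parametric graph
                i = pick(pheromone[p], visits[p])
                visits[p][i] += 1
                path[p] = i
            key = tuple(path.values())
            if key in seen:
                score, is_new = seen[key], False
            else:
                score = evaluate({p: LAYERS[p][i] for p, i in path.items()})
                seen[key] = score
                is_new = found_new = True
            iteration_paths.append((path, score, is_new))
            if score < best_score:
                best_path, best_score = path, score
        for p in LAYERS:              # pheromone evaporation
            pheromone[p] = [(1 - RHO) * t for t in pheromone[p]]
        for path, score, is_new in iteration_paths:
            if is_new:                # ants that repeated a considered path deposit nothing
                for p, i in path.items():
                    pheromone[p][i] += 1.0 / (1.0 + score)
        if not found_new:             # no new solutions in the iteration: reset the graph
            pheromone, visits = initial_state()
    return {p: LAYERS[p][i] for p, i in best_path.items()}, best_score

if __name__ == "__main__":
    print(run())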

We demonstrate the possibility of using the ant colony method to optimize the order of hyperparameters of a computational cluster. The proposed modifications of the algorithm have proven effective. Recommendations are given on the methods to apply and on the values of the parameters of the ant colony algorithm, and approaches to the software implementation of the algorithm on a computer are proposed.

Pages: 23-37
For citation

Sinitsyn I.N., Titov Yu.P. Optimization of the order of hyperparameters of a computational cluster by the ant colony method. Highly Available Systems / Sistemy vysokoy dostupnosti. 2022. V. 18. № 3. P. 23−37. DOI: https://doi.org/10.18127/j20729472-202203-02 (in Russian)

References
  1. Colorni A., Dorigo M., Maniezzo V. Distributed Optimization by Ant Colonies. Proc. First Eur. Conf. on Artific. Life. Paris, France. F. Varela and P. Bourgine (Eds.). Elsevier Publishing. 1992. P. 134−142.
  2. Dorigo M., Stützle T. Ant Colony Optimization. MIT Press. 2004. 321 p.
  3. Pasia J.M., Hartl R.F., Doerner K.F. Solving a Bi-objective Flowshop Scheduling Problem by Pareto-Ant Colony Optimization. M. Dorigo et al. (Eds.). ANTS 2006. LNCS 4150. 2006. P. 294−305.
  4. Pinto N., Doukhan D., DiCarlo J.J., Cox D.D. A high-throughput screening approach to discovering good forms of biologically inspired visual representation. PLoS Comput Biol. 2009. 5(11): e1000579. https://doi.org/10.1371/journal.pcbi.1000579.
  5. Coates A., Ng A., Lee H. An Analysis of Single-Layer Networks in Unsupervised Feature Learning. Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research. 2011. (accessed 24.08.2022: https://proceedings.mlr.press/v15/coates11a.html).
  6. Coates A., Ng A.Y. The importance of encoding versus training with sparse coding and vector quantization. ICML'11: Proceedings of the 28th International Conference on Machine Learning. June 2011. P. 921−928.
  7. Bergstra J., Bardenet R., Bengio Y., Kégl B. Algorithms for Hyper-Parameter Optimization. (accessed 24.08.2022: https://proceedings.neurips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf).
  8. Feurer M., Hutter F. Hyperparameter Optimization. In: Hutter F., Kotthoff L., Vanschoren J. (eds) Automated Machine Learning. The Springer Series on Challenges in Machine Learning. Springer, Cham. 2019. https://doi.org/10.1007/978-3-030-05318-5_1.
  9. Koehrsen W. A conceptual explanation of Bayesian hyperparameter optimization for machine learning. 2018. (accessed 24.08.2022: https://towardsdatascience.com/a-conceptual-explanation-of-bayesian-model-based-hyperparameter-optimization-for-machine-learning-b8172278050f).
  10. Bergstra J.S., Bardenet R., Bengio Y., Kégl B. Algorithms for hyper-parameter optimization. Advances in Neural Information Processing Systems. 2011. P. 2546−2554.
  11. Akiba T., Sano S., Yanase T., Ohta T., Koyama M. Optuna: A next-generation hyperparameter optimization framework. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2019. P. 2623−2631. https://doi.org/10.48550/arXiv.1907.10902.
  12. https://krasserm.github.io/2018/03/21/bayesian-optimization (accessed 24.08.2022).
  13. https://krasserm.github.io/2018/03/19/gaussian-processes (accessed 24.08.2022).
  14. https://towardsdatascience.com/a-conceptual-explanation-of-bayesian-model-based-hyperparameter-optimization-for-machine-learning-b8172278050f (accessed 24.08.2022).
  15. Dewancker I., McCourt M., Clark S. Bayesian Optimization Primer. (accessed 24.08.2022: https://static.sigopt.com/b/20a144d208ef255d3b981ce419667ec25d8412e2/static/pdf/SigOpt_Bayesian_Optimization_Primer.pdf).
  16. IBM Bayesian Optimization Accelerator 1.1 helps identify optimal product designs faster with breakthrough performance for scientific discovery and high-performance computing simulation. (accessed 24.08.2022: https://www.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/6/877/ENUSZP20-0186/index.html&request_locale=en).
  17. Tufteland T., Ødesneltvedt G., Goodwin M. Optimizing PolyACO Training with GPU-Based Parallelization. M. Dorigo et al. (Eds.). ANTS 2016. LNCS 9882. 2016. P. 233−240. DOI: 10.1007/978-3-319-44427-7_20.
  18. Parpinelli R., Lopes H., Freitas A. Data mining with an ant colony optimization algorithm. IEEE Trans. Evol. Comput. 2002. 6(4). P. 321−332.
  19. Bremer J., Lehnhoff S. Constrained Scheduling of Step-Controlled Buffering Energy Resources with Ant Colony Optimization. ANTS Conference. 2020.
  20. Acevedo J., Maldonado S., Lafuente S., Gomez H., Gil P. Model Selection for Support Vector Machines Using Ant Colony Optimization in an Electronic Nose Application. Ant Colony Optimization and Swarm Intelligence. Dorigo M., Gambardella L.M., Birattari M., Martinoli A., Poli R., Stützle T. (eds). ANTS 2006. Lecture Notes in Computer Science. V. 4150. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11839088_47.
  21. Parpinelli R., Lopes H., Freitas A. Data mining with an ant colony optimization algorithm. IEEE Trans. Evol. Comput. 2002. 6(4). P. 321−332.
  22. Junior I.C. Data mining with ant colony algorithms. In: Huang D.-S., Jo K.-H., Zhou Y.-Q., Han K. (eds.) ICIC 2013. LNCS. V. 7996. Springer, Heidelberg. 2013. P. 30−38.
  23. Martens D., De Backer M., Haesen R., Vanthienen J., Snoeck M., Baesens B. Classification with ant colony optimization. IEEE Trans. Evol. Comput. 2007. 11(5). P. 651−665.
  24. Karpenko A.P., Sinitsyn I.N. Bionika i sistemy vysokoi dostupnosti. Sistemy vysokoi dostupnosti. 2022. V. 18. № 2. P. 25−41. DOI: https://doi.org/10.18127/j20729472-202202-02. (in Russian)
  25. Sinitsyn I.N., Titov Yu.P. Razvitie stokhasticheskikh algoritmov muravinoi organizatsii. Sb. statei Pervoi Mezhdunar. nauchno-prakticheskoi konf. «Bionika – 60 let. Itogi i perspektivy». Pod red. A.P. Karpenko. 17−19 dekabrya 2021. M.: Assotsiatsiya tekhnicheskikh universitetov. 2022. P. 210−220. DOI: 10.53677/9785919160496_210_220. (in Russian)
  26. Khakhulin G.F., Titov Yu.P. Sistema podderzhki reshenii postavok zapasnykh chastei letatelnykh apparatov voennogo naznacheniya. Izvestiya Samarskogo nauchnogo tsentra Rossiiskoi akademii nauk. 2014. V. 16. № 1−5. P. 1619−1623. (in Russian)
  27. Titov Yu.P. Modifikatsii metoda muravinykh kolonii dlya resheniya zadach razrabotki aviatsionnykh marshrutov. Avtomatika i telemekhanika. 2015. № 3. P. 108−124. (in Russian)
  28. Titov Yu.P. Modifikatsii metoda muravinykh kolonii dlya razrabotki programmnogo obespecheniya resheniya zadach mnogokriterialnogo upravleniya postavkami. Sovremennye informatsionnye tekhnologii i IT-obrazovanie. 2017. V. 13. № 2. P. 64−74. DOI: 10.25559/SITITO.2017.2.222. (in Russian)
  29. Sudakov V.A., Batkovskii A.M., Titov Yu.P. Algoritmy uskoreniya raboty modifikatsii metoda muravinykh kolonii dlya poiska ratsionalnogo naznacheniya sotrudnikov na zadachi s nechetkim vremenem vypolneniya. Sovremennye informatsionnye tekhnologii i IT-obrazovanie. 2020. V. 16. № 2. P. 338−350. DOI: 10.25559/SITITO.16.202002.338-350. (in Russian)
  30. Sinitsyn I.N., Titov Yu.P. Instrumentalnoe programmnoe obespechenie analiza i sinteza stokhasticheskikh sistem vysokoi dostupnosti (XV). Sistemy vysokoi dostupnosti. 2021. V. 17. № 4. P. 24−33. DOI: 10.18127/j20729472-202104-02. EDN: YEGVMR. (in Russian)
Date of receipt: 11.08.2022
Approved after review: 19.08.2022
Accepted for publication: 29.08.2022