Journal Neurocomputers, No. 4, 2025
Article in issue:
Implementation of the ability to recognize flying objects in a video stream using a neural network
Type of article: scientific article
DOI: https://doi.org/10.18127/j19998554-202504-01
UDC: 004.932.2
Authors:

A.O. Kasyanov1, L.A. Podkolzina2
1 Southern Federal University (Rostov-on-Don, Russia)
2 Don State Technical University (Rostov-on-Don, Russia)

1 kasao@mail.ru, 2 lubov.pod@yandex.ru

Abstract:

Multi-agent systems are used in many areas: industrial automation, emergency response, Earth observation, and others. For emergency services, timely detection of objects or situations at the earliest possible stage is a critical factor in minimizing damage and saving lives. However, covering large areas requires a multi-agent system containing groups of moving objects, which in turn requires managing such a group and coordinating its movement. Although algorithms for recognizing moving objects (MO) in real time exist, a software implementation of a real-time MO detection approach is still needed.

First, several MO recognition methods were analyzed with regard to their specifics. The convolutional YOLOv8 neural network architecture was selected for MO recognition. Three model variants (nano, small, and medium) were chosen for the work.

Next, the implementation of the functional software is described. A dataset of 27,000 images was assembled, and the neural network was additionally trained on it. The algorithm for processing the incoming video stream was chosen with the NVIDIA GeForce RTX 3080 GPU in mind, which ensured high frame-processing throughput. To provide real-time operation, the YOLOv8 small network, fine-tuned on the assembled dataset, was selected and used. Its inference time was 28 ms, precision 96.18%, recall 93.66%, and mAP@0.5 95.52%.
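The reported figures can be sanity-checked with a short script: at a typical 30 fps stream each frame arrives roughly every 33.3 ms, so a 28 ms inference fits the real-time budget, and precision/recall follow from true-positive, false-positive, and false-negative counts. The counts below are illustrative, not taken from the paper; they merely reproduce the reported percentages.

```python
# Illustrative check of the real-time budget and of the precision/recall
# definitions behind the reported metrics; TP/FP/FN counts are made up.

def frame_interval_ms(fps: float) -> float:
    """Time available per frame at a given stream rate."""
    return 1000.0 / fps

def precision(tp: int, fp: int) -> float:
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    return tp / (tp + fn)

INFERENCE_MS = 28.0                 # YOLOv8 small inference time from the paper
budget = frame_interval_ms(30.0)    # ~33.3 ms per frame at 30 fps
assert INFERENCE_MS < budget        # real-time processing is feasible

# Hypothetical counts on a validation split:
tp, fp, fn = 9618, 382, 651
print(f"precision = {precision(tp, fp):.2%}")
print(f"recall    = {recall(tp, fn):.2%}")
```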

The functional software developed in this work transmits data for correcting the direction of MO movement in real time and can be used to organize coordinated MO groups.
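A minimal sketch of the kind of correction data such software might transmit: the offset of a detected bounding box's center from the frame center, normalized to [-1, 1], which a downstream controller could use to adjust heading. The message format, field names, and frame size are assumptions for illustration; the paper does not specify them.

```python
import json

FRAME_W, FRAME_H = 1280, 720   # assumed frame size

def correction_message(box):
    """Normalized offset of a detection's center from the frame center.

    box is (x1, y1, x2, y2) in pixels; dx, dy are in [-1, 1], with
    (0, 0) meaning the detected MO sits at the frame center.
    """
    x1, y1, x2, y2 = box
    cx = (x1 + x2) / 2.0
    cy = (y1 + y2) / 2.0
    return json.dumps({
        "dx": 2.0 * cx / FRAME_W - 1.0,
        "dy": 2.0 * cy / FRAME_H - 1.0,
    })

msg = correction_message((600, 320, 680, 400))  # box centered at (640, 360)
print(msg)  # frame-centered detection -> dx = 0.0, dy = 0.0
```

In a real deployment this JSON string would be sent over whatever channel links the vision node to the group-coordination software.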

Pages: 5-16
For citation

Kasyanov A.O., Podkolzina L.A. Implementation of the ability to recognize flying objects in a video stream using a neural network. Neurocomputers. 2025. V. 27. № 4. P. 5–16. DOI: https://doi.org/10.18127/j19998554-202504-01 (in Russian)

References
  1. Budaev E.S., Mikhajlova S.S., Evdokimova I.S., Khalmakshinov E.A. Razrabotka nejrosetevoj modeli obnaruzheniya ob''ektov v videopotoke. Nejrokomp'yutery: razrabotka, primenenie. 2023. T. 25. № 4. S. 54–64. DOI: https://doi.org/10.18127/j19998554-202304-07. (in Russian)
  2. Blackman S.S. Multiple hypothesis tracking for multiple target tracking. IEEE Aerospace and Electronic Systems Magazine. 2004. V. 19. № 1. P. 5–18.
  3. Metody komp'yuternoj obrabotki izobrazhenij. Pod red. V.A. Sojfera. M.: Fizmatlit. 2001. (in Russian)
  4. Forsajt D., Pons Zh. Komp'yuternoe zrenie. Sovremennyj podkhod. M.: ID Vil'yams. 2004. (in Russian)
  5. Boguslavskij A.A., Sokolov S.M., Fyodorov N.G., Vinogradov P.V. Sistema tekhnicheskogo zreniya dlya informatsionnogo obespecheniya avtomaticheskoj posadki i dvizheniya po VPP letatel'nykh apparatov. Izvestiya YuFU. Tekhnicheskie nauki. 2015. № 1 (162). S. 96–109. (in Russian)
  6. Svidetel'stvo o registratsii programmy dlya EVM № 2016613671 RF. Sistema soprovozhdeniya tseli. A.P. Shvedov, M.I. Prejskurantova. Opubl. 01.04.2016. (in Russian)
  7. Tikhonov K.M., Tishkov V.V. Razrabotka modeli programmnogo korrektiruemogo soprovozhdeniya nazemnoj tseli s uchetom vozmozhnostej cheloveka-operatora. Vestnik Moskovskogo aviatsionnogo instituta. 2011. T. 18. № 6. S. 68–77. (in Russian)
  8. Voronin V.V., Sizyakin R.A., Zhdanova M. et al. Automated visual inspection of fabric image using deep learning approach for defect detection. Automated Visual Inspection and Machine Vision IV. 2021. V. 11787. P. 117870.
  9. Alfimtsev A.N., Lychkov I.I. Metod obnaruzheniya ob''ekta v videopotoke v real'nom vremeni. Vestnik Tambovskogo gosudarstvennogo tekhnicheskogo universiteta. 2011. T. 17. № 1. S. 44–55. (in Russian)
  10. Mirzoyan A.S., Malyshev O.V., Khmarov I.M., Kanivets V.Yu. Raspoznavanie letatel'nykh apparatov opticheskoj sistemoj v real'nom masshtabe vremeni. Vestnik Moskovskogo aviatsionnogo instituta. 2014. T. 21. № 5. S. 145–156. (in Russian)
  11. Blackman S.S. Multiple hypothesis tracking for multiple target tracking. IEEE Aerospace and Electronic Systems Magazine. 2004. V. 19. № 1. P. 5–18. DOI: 10.1109/MAES.2004.1263228.
  12. Alpatov B.A., Murav'ev V.S., Murav'ev S.I. Obrabotka i analiz izobrazhenij v sistemakh avtomaticheskogo obnaruzheniya i soprovozhdeniya vozdushnykh ob''ektov. Ryazan': RGRTU. 2012.
  13. Akinshin R.N., Khomyakov A.V., Polubekhin A.I., Rumyantsev V.L. Algoritm soprovozhdeniya manevriruyushchikh vozdushnykh tselej v mnogopozitsionnoj radiolokatsionnoj sisteme. Izvestiya Tul'skogo gosudarstvennogo universiteta. Tekhnicheskie nauki. 2021. № 2. S. 175–183. (in Russian)
  14. Makarenko S.I. Protivodejstvie bespilotnym letatel'nym apparatam: Monografiya. SPb.: Naukoemkie tekhnologii. 2020. (in Russian)
  15. Ultralytics – YOLOv8 [Electronic resource]. URL: https://docs.ultralytics.com/.
  16. Khan M.U., Dil M., Misbah M. et al. TransLearn-YOLOx: Improved-YOLO with transfer learning for fast and accurate multiclass UAV detection. 2023 International Conference on Communication, Computing and Digital Systems (C-CODE). Islamabad, Pakistan. 2023. P. 1–7. DOI: 10.1109/C-CODE58145.2023.10139896.
  17. Yilmaz B., Kutbay U. YOLOv8-based drone detection: Performance analysis and optimization. Computers. 2024. V. 13. № 9. P. 234.
  18. Naveen H., Menon P., Vinitha V. et al. A study on YOLOv5 for drone detection with Google colab training. 2023 2nd International Conference on Automation, Computing and Renewable Systems (ICACRS). Pudukkottai, India. 2023. P. 1576–1580. DOI: 10.1109/ICACRS58579.2023.10404797.
  19. Purbaditya B., Nowak P. Drone detection and tracking with YOLO and a rule-based method. 2025. DOI: 10.48550/arXiv.2502.05292.
  20. Roboflow [Electronic resource]. URL: https://roboflow.com/.
Date of receipt: 16.06.2025
Approved after review: 08.07.2025
Accepted for publication: 28.07.2025