Journal Achievements of Modern Radioelectronics, № 10, 2025
Article in issue:
Development of a complex for protecting critical infrastructure facilities from unmanned aerial vehicles
Type of article: scientific article
DOI: https://doi.org/10.18127/j20700784-202510-08
UDC: 629.7.051; 004.415; 004.89
Authors:

D.N. Shevelev1, A.V. Roslyakov2, N.I. Smelov3, N.A. Zadorina4, A.N. Lomanov5

1–5 P.A. Solovyov Rybinsk State Aviation Technical University (Rybinsk, Russia)

1 i@d-shevelev.ru, 2 aroslykovit@yandex.ru, 3 thesmelov1@mail.ru, 4 zadorina@rsatu.ru, 5 lepss@yandex.ru

Abstract:

The rapid proliferation of commercial and do-it-yourself unmanned aerial vehicles (UAVs) has significantly increased the range of potential threats to critical infrastructure facilities. While drones are widely employed for civil applications such as aerial photography, agriculture, logistics, and environmental monitoring, their availability and technical advancement also enable malicious use, including reconnaissance, smuggling, sabotage, and direct attacks. Traditional counter-UAV (C-UAS) systems currently available on the market are characterized by high cost, complexity of deployment, and limited efficiency against small, low-flying, or autonomous UAVs. These limitations underline the urgent need for affordable, modular, and autonomous solutions that can be integrated into existing security frameworks of strategic facilities.

This paper presents the design and evaluation of an integrated protection system, developed to detect, identify, and neutralize unauthorized UAVs in the vicinity of critical infrastructure. The architecture of the system consists of three main components: (1) a mobile radar station as the primary detection unit, (2) a ground control center with data processing and operator interface, and (3) an autonomous interceptor drone equipped with a vision-inertial navigation system and onboard artificial intelligence for object recognition and tracking.

The proposed approach emphasizes autonomy, cost-effectiveness, and adaptability. Unlike conventional C-UAS solutions, the interceptor drone operates independently of satellite navigation signals, relying on visual-inertial odometry supported by barometric and magnetic sensors. This ensures resilience in environments with electronic warfare interference or GPS denial. The navigation subsystem provides continuous trajectory estimation even under unfavorable visual conditions, thus maintaining stable tracking and positioning.
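The fusion principle described above — a fast but drifting inertial estimate corrected by slower, drift-free absolute measurements — can be illustrated with a minimal complementary filter. This is only a sketch of the general technique, not the authors' navigation subsystem: it fuses a barometric altitude stream with integrated vertical acceleration, and all names and parameters here are illustrative.

```python
def complementary_altitude(baro_samples, accel_samples, dt=0.01, alpha=0.98):
    """Fuse a barometer stream with vertical-acceleration samples.

    alpha close to 1 trusts the integrated inertial estimate on short
    timescales; the (1 - alpha) barometric term removes long-term drift.
    """
    alt, vel = baro_samples[0], 0.0        # initialize from the barometer
    track = []
    for baro, acc in zip(baro_samples, accel_samples):
        vel += acc * dt                    # integrate acceleration -> velocity
        pred = alt + vel * dt              # integrate velocity -> altitude
        alt = alpha * pred + (1 - alpha) * baro  # barometric correction
        track.append(alt)
    return track
```

The same structure applies to any pair of complementary sensors (e.g. magnetometer heading correcting gyro integration); a full visual-inertial pipeline replaces the scalar filter with a multi-state estimator, but the drift-correction idea is the same.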

Target detection and tracking are implemented through a hybrid computer vision framework. A deep learning-based YOLOv11 detector is used for real-time UAV identification, combined with a CSRT (Channel and Spatial Reliability Tracker) algorithm to ensure robust multi-frame tracking. A dedicated dataset of more than 25,000 annotated UAV images under diverse conditions was collected and used to train and optimize the detection model. For deployment on resource-constrained onboard hardware, the neural network was quantized and converted to INT8 format, providing a three-fold increase in inference speed with minimal accuracy loss. Experimental results demonstrated detection accuracy of 93% with a mean average precision (mAP50–95) of 0.73 and real-time processing at ~27 frames per second on the Orange Pi 5 Pro platform.
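The hybrid scheme above typically works by running the expensive neural detector only periodically and letting the lightweight tracker carry the bounding box between detections. The sketch below shows that cadence with stand-in callables; it is not the paper's implementation — `detect` would wrap the YOLOv11 model and `track` a CSRT update in the real system, and `redetect_every` is an assumed parameter.

```python
def hybrid_track(frames, detect, track, redetect_every=10):
    """Yield one bounding box (or None) per frame.

    detect(frame) -> box or None    slow but accurate (e.g. a YOLO model)
    track(frame, box) -> box or None  fast frame-to-frame update (e.g. CSRT)
    """
    box = None
    for i, frame in enumerate(frames):
        if box is None or i % redetect_every == 0:
            # periodic re-detection re-seeds the tracker and corrects drift
            box = detect(frame) or box
        else:
            # cheap tracking between detections; fall back if the track is lost
            box = track(frame, box) or detect(frame)
        yield box
```

With, say, detection at ~3 fps and tracking at camera rate, this interleaving is what lets the combined pipeline sustain real-time throughput on embedded hardware.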

Prototype testing showed that the complex can successfully detect, classify, and intercept UAVs at ranges up to 400 m. The modular structure allows independent use of subsystems, such as the navigation unit or the recognition algorithms, in civil applications including environmental monitoring, industrial inspection, and public safety. However, certain limitations remain: the operational range is relatively short, weather conditions significantly affect vision-based detection, and performance against fast-maneuvering UAVs requires further enhancement.

The research concludes that the proposed complex provides a promising foundation for developing next-generation C-UAS solutions, combining affordability, autonomy, and modularity. Future work will focus on extending operational range, improving robustness under adverse weather, and implementing swarm-based coordination among multiple interceptors to counter UAV swarms. The proposed system demonstrates not only the feasibility of low-cost autonomous counter-drone technology but also its potential contribution to enhancing the resilience of critical infrastructure against rapidly evolving aerial threats.

Pages: 60–69
For citation

Shevelev D.N., Roslyakov A.V., Smelov N.I., Zadorina N.A., Lomanov A.N. Development of a complex for protecting critical infrastructure facilities from unmanned aerial vehicles. Achievements of modern radioelectronics. 2025. V. 79. № 10. P. 60–69. DOI: https://doi.org/10.18127/j20700784-202510-08 [in Russian]

References
  1. Belousov S.A., Zakharov P.V. Sistemy radiotekhnicheskogo obnaruzheniya malogabaritnykh bespilotnykh letatel'nykh apparatov. Elektrosvyaz'. 2021. № 5. S. 18–25. [in Russian]
  2. Kurganov A.A., Laptev M.N. Obzor sovremennykh metodov obnaruzheniya i podavleniya BPLA. Vestnik Voennogo universiteta. 2020. № 4. S. 44–52. [in Russian]
  3. Wischnewski R., Borisov N., Rassõlkin A. Drone detection, classification and tracking using neural networks and multiple sensors. arXiv preprint arXiv:2206.04307. 2022. URL: https://arxiv.org/abs/2206.04307
  4. Lebedev P.V. Analiz uyazvimostey ob"ektov kriticheskoy infrastruktury k atakam s primeneniem BPLA. Informatsionnaya bezopasnost'. 2022. № 1. S. 34–41. [in Russian]
  5. Dedrone. Smart Airspace Security: Product Overview. URL: https://www.dedrone.com
  6. DroneShield. Counter-UAS Solutions: Technical Specifications. URL: https://www.droneshield.com
  7. RADA Electronic Industries. Multi-Mission Tactical Radar Systems. URL: https://www.rada.com
  8. Belyaev A.A., Mishchenko D.S. Sovremennye metody zashchity ot malykh bespilotnykh letatel'nykh apparatov: Obzor. Vestnik VKA im. Zhukovskogo. 2022. №4 (118). S. 45–56. [in Russian]
  9. Kuznetsov I.Yu. Radioelektronnaya bor'ba s BPLA: printsipy, tekhnologii, effektivnost'. Radioelektronika i svyaz'. 2023. T. 14. № 2. S. 92–99. [in Russian]
  10. Vasil'ev S.N. Avtomatizirovannye sistemy upravleniya i iskusstvennyy intellekt. M.: Mashinostroenie. 2021. [in Russian]
  11. Scaramuzza D., Fraundorfer F. Visual Odometry [Tutorial]. IEEE Robotics & Automation Magazine. 2011. V. 18. № 4. P. 80–92. URL: https://doi.org/10.1109/MRA.2011.943233
  12. Geneva P., Huang G. OpenVINS: A Research Platform for Visual-Inertial Estimation. URL: https://arxiv.org/abs/2003.06281
  13. Mourikis A.I., Roumeliotis S.I. A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. Proceedings of the 2007 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2007. P. 3565–3572. URL: https://doi.org/10.1109/ICRA.2007.4209647
  14. Gavrilov A.S., Shaburov A.A. Metody integratsii INS i opticheskikh sistem v zadachakh navigatsii BPLA. Nauchnyy zhurnal «Izvestiya YuFU. Tekhnicheskie nauki». 2021. № 3. S. 67–75. [in Russian]
  15. Treking ob"ektov v videopotoke na osnove svertochnykh neyronnykh setey i fraktal'nogo analiza. URL: https://repo.ssau.ru/bitstream/Informacionnye-tehnologii-i-nanotehnologii/Treking-obektov-v-videopotoke-na-osnove-svertochnyh-neironnyh-setei-i-fraktalnogo-analiza-69606/1/paper_376.pdf [in Russian]
  16. Neyrosetevaya sistema otslezhivaniya i raspoznavaniya ob"ektov v videopotoke. URL: https://s.top-technologies.ru/pdf/2018/12-1/37270.pdf [in Russian]
  17. Zolotukhin Yu.N. i dr. Otslezhivanie ob"ekta v videopotoke s pomoshch'yu svertochnoy neyronnoy seti. Avtometriya. 2020. T. 56. № 6. S. 100–106. URL: https://www.iae.nsk.su/images/stories/5_Autometria/5_Archives/2020/6/11_Zolotukhin.pdf [in Russian]
Date of receipt: 09.09.2025
Approved after review: 25.09.2025
Accepted for publication: 30.09.2025