Russian Technological Journal

Efficiency of YOLO neural network models applied for object recognition in radar images

https://doi.org/10.32362/2500-316X-2025-13-4-25-36

EDN: WVWVCJ

Abstract

Objectives. The paper addresses the problem of applying neural networks to detect and recognize objects in radar images under conditions of limited computational resources. The aim was to investigate the speed and recognition quality of YOLO neural network models in object detection and classification tasks on radar images, in order to evaluate the feasibility of their practical implementation on a microcomputer with a neural processor.

Methods. Machine learning methods for object detection and classification were used to detect and classify objects in radar images.
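Although the paper does not reproduce its experimental pipeline here, the following minimal sketch shows how such an experiment could be set up with the Ultralytics Python package, which provides the YOLOv5, YOLOv8, and YOLO11 model families compared in the study. The dataset config radar_sar.yaml, the image file, and all hyperparameters are hypothetical placeholders, not the authors' actual setup.

# Hypothetical sketch: training and running a YOLO detector on radar images
# with the Ultralytics package. All file names and settings are placeholders.
from ultralytics import YOLO

# Load a pretrained nano-sized model; other sizes and generations follow the
# same API (e.g., "yolov8s.pt", "yolo11s.pt").
model = YOLO("yolo11n.pt")

# Fine-tune on a YOLO-format radar-image dataset (hypothetical config file).
model.train(data="radar_sar.yaml", epochs=100, imgsz=640)

# Detect and classify objects in a single radar image.
for result in model.predict("sar_scene.png", conf=0.25):
    for box in result.boxes:
        print(result.names[int(box.cls)], float(box.conf))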

Results. The study compared the speed and recognition quality of the 5th, 8th, and 11th generation YOLO neural network models with varying numbers of trainable parameters (nano-, small-, medium-, large-, and extra-large-sized) to assess their potential use on a microcomputer with a neural processor. Comparing the models across the evaluation metrics, YOLOv11n (0.925), YOLOv5l (0.889), and YOLOv11s (0.883) showed the highest precision; YOLOv5n (0.932), YOLOv11n (0.928), and YOLOv11s (0.914) showed the highest recall; YOLOv11s (0.961), YOLOv5n (0.954), and YOLOv11n (0.953) showed the highest mAP50; and YOLOv5n (0.756), YOLOv11s (0.74), and YOLOv5l (0.727) showed the highest mAP50-95.
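Precision, recall, mAP50, and mAP50-95 are the standard detection metrics produced by a validation run. As a hedged illustration only (not the authors' evaluation code), they can be read out with the Ultralytics package roughly as follows; the weight filenames and the dataset config are placeholders.

# Hedged sketch: reading the four reported metrics from an Ultralytics
# validation run. Weight filenames and the dataset config are placeholders.
from ultralytics import YOLO

for weights in ("yolov5n.pt", "yolo11n.pt", "yolo11s.pt"):
    box = YOLO(weights).val(data="radar_sar.yaml").box
    print(weights,
          f"precision={box.mp:.3f}",   # mean precision over classes
          f"recall={box.mr:.3f}",      # mean recall over classes
          f"mAP50={box.map50:.3f}",    # mAP at IoU threshold 0.50
          f"mAP50-95={box.map:.3f}")   # mAP averaged over IoU 0.50-0.95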

Conclusions. The conducted research confirmed the feasibility of running YOLO neural network models on a microcomputer with a neural processor, provided that the computational resources of the microcomputer match the computational requirements of the neural networks. The ROC-RK3588S-PC microcomputer (Firefly Technology Co., China) provides up to 6 TOPS of performance, allowing the use of YOLOv5n (7.1 GFLOPs), YOLOv11n (6.3 GFLOPs), and YOLOv11s (21.3 GFLOPs) models.
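As a back-of-the-envelope check of this matching, one can divide the NPU throughput by each model's per-inference cost. The sketch below uses only the figures quoted above and deliberately ignores INT8-versus-FP32 precision differences, memory bandwidth, and real NPU utilization, so the resulting rates are optimistic upper bounds rather than measured performance.

# Rough throughput ceilings: NPU ops/s divided by each model's ops per frame.
# Ignores quantization, memory bandwidth, and real NPU utilization, so
# actual frame rates will be lower.
NPU_TOPS = 6.0  # ROC-RK3588S-PC neural processor: up to 6 TOPS

MODEL_GFLOPS = {"YOLOv5n": 7.1, "YOLOv11n": 6.3, "YOLOv11s": 21.3}

for name, gflops in MODEL_GFLOPS.items():
    fps_ceiling = NPU_TOPS * 1e12 / (gflops * 1e9)
    print(f"{name}: at most ~{fps_ceiling:.0f} inferences/s (theoretical)")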

About the Authors

Alena S. Krasnoperova
Tomsk State University of Control Systems and Radioelectronics
Russian Federation

Alena S. Krasnoperova, Engineer of the Student Design Bureau of Intelligent Radio Engineering Systems, Department of Radio Engineering Systems

40, Lenina pr., Tomsk, 634050 


Competing Interests:

The authors declare no conflicts of interest



Alexander S. Tverdokhlebov
Tomsk State University of Control Systems and Radioelectronics
Russian Federation

Alexander S. Tverdokhlebov, Engineer of the Student Design Bureau of Intelligent Radio Engineering Systems, Department of Radio Engineering Systems

40, Lenina pr., Tomsk, 634050 


Competing Interests:

The authors declare no conflicts of interest



Alexey A. Kartashov
Tomsk State University of Control Systems and Radioelectronics
Russian Federation

Alexey A. Kartashov, Engineer of the Student Design Bureau of Intelligent Radio Engineering Systems, Department of Radio Engineering Systems

40, Lenina pr., Tomsk, 634050 


Competing Interests:

The authors declare no conflicts of interest



Vladislav I. Weber
Tomsk State University of Control Systems and Radioelectronics
Russian Federation

Vladislav I. Weber, Postgraduate Student, Assistant, Department of Radio Engineering Systems

40, Lenina pr., Tomsk, 634050 


Competing Interests:

The authors declare no conflicts of interest



Vladimir Y. Kuprits
Tomsk State University of Control Systems and Radioelectronics
Russian Federation

Vladimir Y. Kuprits, Cand. Sci. (Eng.), Associate Professor, Head of the Student Design Bureau of Intelligent Radio Engineering Systems, Department of Radio Engineering Systems

40, Lenina pr., Tomsk, 634050 


Competing Interests:

The authors declare no conflicts of interest




For citations:


Krasnoperova A.S., Tverdokhlebov A.S., Kartashov A.A., Weber V.I., Kuprits V.Y. Efficiency of YOLO neural network models applied for object recognition in radar images. Russian Technological Journal. 2025;13(4):25-36. https://doi.org/10.32362/2500-316X-2025-13-4-25-36. EDN: WVWVCJ



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2782-3210 (Print)
ISSN 2500-316X (Online)