
Russian Technological Journal


Algorithmic support of the system of external observation and routing of autonomous mobile robots

https://doi.org/10.32362/2500-316X-2021-9-3-15-23

Abstract

This article presents the algorithmic support of an external monitoring and routing system for autonomous mobile robots. In many cases, the practical use of mobile robots requires solving navigation problems. In particular, the position of ground robots can be determined with the help of unmanned aerial vehicles. In the proposed approach, the locations of both robots and nearby obstacles are recognized from the video image of an external camera mounted above the robots' working area. The optimal route to the target point of the selected robot is built, and changes in the working area are monitored. Information about the permitted routes of the robot is transmitted to third-party applications via network communication channels. Primary processing of the camera image includes distortion correction, contour extraction, and binarization, which makes it possible to separate image fragments containing robots and obstacles from background surfaces and objects. Recognition of robots in a video frame is based on a SURF detector, which extracts key points in the frame and compares them with the key points of reference robot images. Trajectory planning is implemented using Dijkstra's algorithm. The discreteness of the trajectories produced by the graph-search algorithm can be compensated on board the autonomous mobile robots by spline approximation. Experimental studies have confirmed the efficiency of the proposed approach both in the problem of recognition and localization of mobile robots and in the problem of planning safe trajectories.
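The planning step described above can be illustrated with a minimal sketch: Dijkstra's algorithm run on a 4-connected occupancy grid, where free and occupied cells stand in for the obstacle map recovered from the overhead camera. The grid, cell coordinates, and function name here are illustrative assumptions, not the authors' implementation; the discrete path this produces is exactly the kind of polyline the article proposes to smooth on board with spline approximation.

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle).

    Returns the path as a list of (row, col) cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}          # best known cost to each visited cell
    prev = {}                  # back-pointers for path reconstruction
    pq = [(0, start)]          # min-heap ordered by accumulated cost
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue           # stale heap entry, skip
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1     # unit cost per grid step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Toy working area: a wall with a single gap forces a detour.
grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
path = dijkstra_grid(grid, (0, 0), (2, 0))
print(path)  # → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The resulting path visits cell centers, so its heading changes in 90° jumps; this is the discreteness the article proposes to smooth with spline approximation before execution on the robot.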

About the Authors

M. V. Egortsev
MIREA – Russian Technological University
Russian Federation

Maksim V. Egortsev, Postgraduate Student, Department of Control Problems, Institute of Cybernetics

78, Vernadskogo pr., Moscow, 119454 



S. A. K. Diane
MIREA – Russian Technological University
Russian Federation

Sekou Abdel Kader Diane, Cand. Sci. (Eng.), Associate Professor, Department of Control Problems, Institute of Cybernetics

ResearcherID: T-5560-2017, Scopus Author ID: 57188548666

78, Vernadskogo pr., Moscow, 119454



N. D. Kaz
MIREA – Russian Technological University
Russian Federation

Nikolai D. Kaz, Postgraduate Student, Department of Control Problems, Institute of Cybernetics

78, Vernadskogo pr., Moscow, 119454 





Supplementary files

1. Object detection using the SURF detector
Type: Research tools



For citations:


Egortsev M.V., Diane S.K., Kaz N.D. Algorithmic support of the system of external observation and routing of autonomous mobile robots. Russian Technological Journal. 2021;9(3):15-23. (In Russ.) https://doi.org/10.32362/2500-316X-2021-9-3-15-23



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2782-3210 (Print)
ISSN 2500-316X (Online)