Journal Information-Measuring and Control Systems, № 7, 2015
Article in issue:
Method of navigation for mobile robot based on data fusion of video and onboard sensor data
Authors:
V.F. Filaretov - D.Sc. (Eng.), Professor, Head of the Robotic Systems Laboratory, Institute of Automation and Control Processes FEB RAS; Head of the Department of Automation and Control, Far Eastern Federal University (Vladivostok). E-mail: filaret@pma.ru
D.A. Yukhimets - Ph.D. (Eng.), Senior Research Scientist, Robotic Systems Laboratory, Institute of Automation and Control Processes FEB RAS (Vladivostok). E-mail: undim@iacp.dvo.ru
A.A. Novitsky - Post-graduate Student, Robotic Systems Laboratory, Institute of Automation and Control Processes FEB RAS (Vladivostok). E-mail: alexzander@iacp.dvo.ru
Abstract:
This paper proposes a method for building a navigation system for mobile robots based on the fusion of video data with data from onboard sensors. The method improves navigation accuracy when a global navigation system is unavailable; an ordinary webcam serves as the source of video data. The navigation algorithm operates in two stages: first, information about the robot's motion is extracted from the processed images; second, this information is fused with the data from the onboard sensors. Experimental studies confirm the high efficiency and accuracy of the proposed algorithm. Using only the onboard sensors or only the video camera for navigation gives unsatisfactory results, as does computing the robot's path from the model using the program signals alone: the position error reached 20 cm for the trajectory obtained from the onboard sensors alone and up to 30 cm when using only the camera. The proposed method of fusing the signals received simultaneously from the navigation sensors and from the robot's camera determines the current position with much greater accuracy, with a coordinate error not exceeding 5 cm.
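The abstract does not give the fusion equations themselves; as an illustration only, the sketch below shows the standard minimum-variance linear fusion of two independent position estimates (onboard odometry and camera-based odometry), which is the basic building block behind Kalman-type fusion schemes such as those in refs. 13-14. All variable names and numeric values are hypothetical, not taken from the article.

```python
import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    """Fuse two independent estimates (mean, covariance) of the same state
    using the minimum-variance linear fusion rule:
    K = P1 (P1 + P2)^-1,  x = x1 + K (x2 - x1),  P = (I - K) P1."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    P1, P2 = np.asarray(P1, float), np.asarray(P2, float)
    K = P1 @ np.linalg.inv(P1 + P2)          # weight toward the more certain source
    x = x1 + K @ (x2 - x1)                   # fused position estimate
    P = (np.eye(len(x1)) - K) @ P1           # fused covariance (never worse than P1)
    return x, P

# Hypothetical example: onboard sensors place the robot at (1.00, 2.00) m
# with sigma ~ 0.20 m; the webcam-based estimate is (1.10, 1.90) m with
# sigma ~ 0.30 m. The fused estimate lies between them, closer to the
# more accurate onboard estimate, and has a smaller variance than either.
odo_x, odo_P = np.array([1.0, 2.0]), np.diag([0.20**2, 0.20**2])
cam_x, cam_P = np.array([1.1, 1.9]), np.diag([0.30**2, 0.30**2])
x, P = fuse_estimates(odo_x, odo_P, cam_x, cam_P)
```

This weighted combination is why the fused trajectory can be more accurate than either source alone: each coordinate is pulled toward the sensor with the lower variance, and the fused covariance shrinks below both inputs.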
Pages: 68-75
References
  1. Andreev V.P., Kirsanov K.B., Kostin A.V., Kuvshinov S.V., Marzanov Yu.S., Pankratov D.A., Prysev E.A., Pryanichnikov V.E., Rybak T.N., Kharin K.V., Shipovalov E.A. Mobile technological robots and simulators: integration software for group interaction // Information-Measuring and Control Systems. 2013. V. 11. № 4. P. 74-79 (in Russian).
  2. Pavlovsky V.E., Zabegaev A.N., Kalinichenko A.V., Pavlovsky V.V. A combined navigation system for a mobile robot using beacons and visual landmarks // Mekhatronika, Avtomatizatsiya, Upravlenie. 2011. № 10. P. 66-71 (in Russian).
  3. Kizimov A.T., Berezin D.R., Karabash D.M., Letunov D.A. A strapdown inertial attitude and heading reference system for a light unmanned aerial vehicle // Datchiki i Sistemy (Sensors & Systems). 2011. № 4. P. 37-42 (in Russian).
  4. Li R., Zhu Y.M., Han C.Z. Unified optimal linear estimation fusion // Proceedings of the International Conference on Information Fusion. Paris, France. 2000. P. 486-492.
  5. Maimone M., Cheng Y., Matthies L. Two Years of Visual Odometry on the Mars Exploration Rovers // Journal of Field Robotics, Special Issue on Space Robotics. 2007. V. 24. № 3. P. 169-186.
  6. Dille M., Grocholsky B., Singh S. Outdoor Downward-facing Optical Flow Odometry with Commodity Sensors // Field and Service Robotics: Springer Tracts in Advanced Robotics. 2010. V. 62. P. 183-193.
  7. Brinkworth R., O'Carroll D. Robust Models for Optic Flow Coding in Natural Scenes Inspired by Insect Biology // PLOS Computational Biology. 2009. V. 5. № 11. P. 1-8.
  8. Lebedev I.M., Tyukin A.M., Priorov A.L. Development and study of an indoor navigation system for a mobile robot with obstacle detection // Information-Measuring and Control Systems. 2015. № 1. P. 53-61 (in Russian).
  9. Siciliano B., Khatib O. (eds.) Springer Handbook of Robotics. Berlin, Heidelberg: Springer-Verlag. 2008. 1628 p.
  10. Campbell J., Sukthankar R., Nourbakhsh I., Pahwa A. A Robust Visual Odometry and Precipice Detection System Using Consumer-grade Monocular Vision // Proceedings of the 2005 IEEE International Conference on Robotics and Automation. 2005. P. 3421-3427.
  11. Tomasi C., Kanade T. Detection and Tracking of Point Features // Pattern Recognition. 2004. V. 37. P. 165-168.
  12. Lucas B.D., Kanade T. An Iterative Image Registration Technique with an Application to Stereo Vision // Proceedings of the Imaging Understanding Workshop. 1981. P. 121-130.
  13. Haykin S. Kalman Filtering and Neural Networks. John Wiley & Sons. 2001. 298 p.
  14. Julier S.J., Uhlmann J.K. A new extension of the Kalman filter to nonlinear systems // Proceedings of AeroSense: The 11th International Symposium on Aerospace/Defense Sensing, Simulation and Controls. 1997. P. 182-193.
  15. Pryanichnikov V.E., Andreev V.P., Ivchenko V.D., Kiy K.I., Kirsanov K.B., Levinsky B.M., Marzanov Yu.S., Nikitina T.A., Prysev E.A. Mobile technological robots: a system for compressed description and analysis of color images in real time // Information-Measuring and Control Systems (special issue: Intelligent Adaptive Robots. 2011. V. 6. № 1-2). 2011. V. 9. № 9. P. 45-51 (in Russian).
  16. Pryanichnikov V.E., Andreev V.P., Kiy K.I., Kirsanov K.B., Levinsky B.M., Platonov A.K. Analysis of color images for mobile robots // Extended abstracts of the International Conference "Mobile Robots and Mechatronic Systems", dedicated to the 300th anniversary of the birth of M.V. Lomonosov and the 90th anniversary of Academician D.E. Okhotsimsky / ed. by Prof. Yu.G. Martynenko. Moscow: Moscow University Press. 2011. P. 140-149 (in Russian).