Journal Radioengineering №5, 2019
Article in issue:
Technology of a multispectral video panorama forming
Type of article: scientific article
DOI: 10.18127/j00338486-201905(II)-22
UDC: 004.021
Authors:
I.A. Kudinov – Post-graduate Student, Department of Computer, Ryazan State Radio Engineering University
E-mail: igor.kudinov@mail.ru
M.B. Nikiforov – Ph.D. (Eng.), Associate Professor, Department of Computer, Ryazan State Radio Engineering University
E-mail: nikiforov.m.b@evm.rsreu.ru
I.S. Kholopov – Ph.D. (Eng.), Associate Professor, Department «Radio Systems», Ryazan State Radio Engineering University
E-mail: kholopov.i.s@rsreu.ru

Abstract:

The technology of forming a video image from information supplied by the distributed multispectral cameras of a panoramic optical-electronic vision system is considered. The main problems that reduce the stitching quality of a panoramic frame composed of multispectral camera frames are analyzed. The geometric formulation of the problem is given, together with the main analytical expressions describing the procedure for forming a spherical panorama without estimating point features of the scene and matching them by descriptors. A panorama forming algorithm robust to shooting conditions is presented; it is based on the results of a preliminary photogrammetric calibration of the multispectral cameras using a special test object, the Euler angles of the reference camera provided by a MEMS inertial measurement unit, and the distance to the objects being shot measured by a laser range finder. The main operation modes of a prototype panoramic vision system with television and infrared cameras developed by the authors are considered. It is shown that parallelizing the computations with CUDA technology makes it possible to implement vision enhancement functions (such as image blending and Multiscale Retinex), including fusion of images from sensors of different spectral bands, for two independently controlled 1024×768-pixel regions of interest at a rate of at least 30 Hz.
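The feature-free mapping described in the abstract can be illustrated with a minimal sketch: a pixel is back-projected through the camera's intrinsic matrix, rotated by the reference camera's Euler-angle attitude, and placed on the spherical panorama by its azimuth and elevation. This is an assumption-laden illustration, not the authors' implementation: it assumes an ideal pinhole model (no distortion), a z-y-x (yaw-pitch-roll) Euler sequence, and illustrative function names.

```python
import numpy as np

def euler_to_rotation(yaw, pitch, roll):
    """Rotation matrix from z-y-x (yaw-pitch-roll) Euler angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def pixel_to_sphere(u, v, K, R):
    """Map pixel (u, v) to spherical panorama angles (azimuth, elevation).

    K is the 3x3 pinhole intrinsic matrix; R rotates the camera ray into
    the common panorama frame (e.g. from the IMU attitude of the reference
    camera). No point features or descriptor matching are needed.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project the pixel
    d = R @ ray                                     # rotate into panorama frame
    d = d / np.linalg.norm(d)                       # unit direction on the sphere
    azimuth = np.arctan2(d[0], d[2])
    elevation = np.arcsin(d[1])
    return azimuth, elevation
```

For example, with zero Euler angles the principal point of the camera maps to azimuth 0 and elevation 0, i.e. the center of the panorama; distortion correction and the range-dependent parallax compensation mentioned in the abstract would be applied before and after this step, respectively.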

Pages: 198-204
References
  1. Belskii A., Zhosan N., Brondz D., Gorbachev K., Grebenshchikov V., Kargaev A. Kruglosutochnaya panoramnaya sistema tekhnicheskogo zreniya dlya vertoletov. Fotonika. 2013. T. 38. № 2. S. 80−86.
  2. Kanaeva I.A., Bolotova Yu.A. Metody korrektsii tsveta i yarkosti pri sozdanii panoramnykh izobrazhenii. Kompyuternaya optika. 2018. T. 42. № 5. S. 885−897.
  3. Obrabotka izobrazhenii v aviatsionnykh sistemakh tekhnicheskogo zreniya. Pod red. L.N. Kostyashkina i M.B. Nikiforova. M.: FIZMATLIT. 2016. 234 c.
  4. Efimov A.I., Kostyashkin L.N., Loginov A.A., Muratov E.R., Nikiforov M.B., Novikov A.I. Obrabotka izobrazhenii v mnogospektralnykh sistemakh tekhnicheskogo zreniya. Vestnik Ryazanskogo gosudarstvennogo radiotekhnicheskogo universiteta. 2017. № 60. S. 83−92.
  5. Bondarenko A., Bondarenko M. Apparatno-programmnaya realizatsiya multispektralnoi sistemy uluchshennogo videniya. Sovremennaya elektronika. 2017. № 1. S. 32−37.
  6. Lloid Dzh. Sistemy teplovideniya: per. s angl. Pod red. A.I. Goryacheva. M.: Mir. 1978. 414 s.
  7. Szeliski R. Image alignment and stitching: a tutorial. Foundations and trends in computer graphics and vision. 2006. V. 2. № 1. P. 1−104.
  8. Liu Y.-C., Lin K.-Y., Chen Y.-S. Bird’s-eye view vision system for vehicle surrounding monitoring. Proc. of 2nd Int. Workshop on Robot Vision (RobVis 2008). 18−20 February 2008. Auckland. 2008. P. 207−218.
  9. Lin C.-C., Wang M.-S. A vision based top-view transformation model for a vehicle parking assistant. Sensors. 2012. V. 12. № 4. P. 4431−4446.
  10. Li M., Zhao C., Hou Y., Ren M. A new lane line segmentation and detection method based on inverse perspective mapping. Int. J. of digital content technology and its applications. 2011. V. 5. № 4. P. 230−236.
  11. Zhou Q.-F., Liu J.-H., Wang X., Sun M.-C. Automatic correction of geometric distortion in aerial zoom squint imaging. Optics and precision engineering. 2015. V. 23. № 10. P. 2927−2942.
  12. Laganiere R. Compositing a bird’s eye view mosaic. Proc. of 13th Canadian Conf. on Vision Interface. May 2000. Montreal. 2000. P. 382−387.
  13. Kholopov I.S. Algoritm korrektsii proektivnykh iskazhenii pri malovysotnoi s'emke. Kompyuternaya optika. 2017. T. 41. № 2. S. 284−290.
  14. Gruzman I.S., Kirichuk V.S., Kosykh V.P., Peretyagin G.I., Spektor A.A. Tsifrovaya obrabotka izobrazhenii v informatsionnykh sistemakh: Ucheb. posobie. Novosibirsk: Izd-vo NGTU. 2002. 352 s.
  15. Hartley R., Zisserman A. Multiple view geometry in computer vision. 2nd edition. Cambridge: Cambridge University Press. 2003. 656 p.
  16. Chelnokov Yu.N. Kvaternionnye i bikvaternionnye modeli i metody mekhaniki tverdogo tela i ikh prilozheniya. Geometriya i kinematika dvizheniya. M.: FIZMATLIT. 2006. 512 s.
  17. Kuipers J.B. Quaternions and rotation sequences. A primer with applications to orbits, aerospace, and virtual reality. New Jersey: Princeton University. 1999. 391 p.
  18. St-Laurent L., Mikhnevich M., Bubel A., Prevost D. Passive Calibration Board for Alignment of VIS-NIR, SWIR and LWIR Images. Quantitative InfraRed Thermography Journal. 2017. V. 14. № 2. P. 193−205.
  19. Brown D.C. Close-range camera calibration. Photogrammetric engineering. 1971. V. 37. № 8. P. 855−866.
  20. Luhmann T., Robson S., Kyle S., Boehm J. Close-range photogrammetry and 3D imaging. 2nd edition. Berlin: De Gruyter. 2013. 684 p.
  21. Geiger A., Moosmann F., Car O., Schuster B. Automatic camera and range sensor calibration using a single shot. Proc. of IEEE International Conference on Robotics and Automation (ICRA’2012). Saint Paul (Minnesota, USA). 2012. P. 3936−3943.
  22. Real-time rendering. Ed. by T. Akenine-Möller, E. Haines, N. Hoffman. 3rd edition. Wellesley: A.K. Peters. 2008. 1045 p.
  23. Belokurov V.A. Sistema uglovoi orientatsii na osnove gaussovskogo partsialnogo filtra. Vestnik Ryazanskogo gosudarstvennogo radiotekhnicheskogo universiteta. 2016. № 56. S. 11−16.
  24. Belokurov V.A. Primenenie avtokovariatsionnogo metoda naimenshikh kvadratov v invariantnoi skheme uglovoi orientatsii. Vestnik Ryazanskogo gosudarstvennogo radiotekhnicheskogo universiteta. 2018. № 64. S. 9−16.
  25. Bondarenko M.A., Bondarenko A.V. Formirovanie izobrazhenii v multispektralnykh videosistemakh dlya vizualnogo i avtomaticheskogo nerazrushayushchego kontrolya. Uspekhi prikladnoi fiziki. 2018. T. 6. № 4. S. 325−332.
  26. Kudinov I.A., Pavlov O.V., Kholopov I.S., Khramov M.Yu. Formirovanie videopanoramy po informatsii ot raznospektralnykh kamer. Informatsionnye tekhnologii i nanotekhnologii. Sb. trudov IV Mezhdunar. konf. i molodezhnoi shkoly. Samarskii natsionalnyi issledovatelskii universitet imeni akademika S.P. Koroleva. 2018. S. 568−575.
  27. Sanders J., Kandrot E. CUDA by example. New York: Addison-Wesley. 2010. 290 p.
  28. Jobson D.J., Rahman Z., Woodell G.A. A Multiscale Retinex for bridging the gap between color images and the human observation of scenes. IEEE Trans. on Image Processing. 1997. V. 6. № 7. P. 965−976.
Date of receipt: April 10, 2019