Journal Radioengineering №6, 2015
Article in issue:
Efficiency analysis of information-theoretic measures in image registration
Authors:
S.V. Voronov - Ph.D. (Eng.), Junior Research Scientist, Department «Radio Engineering», Ulyanovsk State Technical University. E-mail: s.voronov@ulstu.ru
I.V. Voronov - Post-graduate Student, Department «Radio Engineering», Ulyanovsk State Technical University. E-mail: ilvo1987@gmail.com
Abstract:
For the problem of estimating image registration parameters, a comparative efficiency analysis of information-theoretic quality criteria (Shannon mutual information, Renyi entropy, and Tsallis entropy) is carried out. Experimental studies were performed on simulated images with a correlation function and a brightness probability density function close to Gaussian; unbiased additive Gaussian noise is used as the interfering factor. The experiments show that, for recursive estimation of image registration parameters, Renyi entropy provides a potentially higher convergence rate of the estimated parameters. Owing to the narrower width of its inflection zone, the same measure also provides a potentially lower estimation error variance. Tsallis entropy is slightly inferior by these indicators, but it has a wider operating range of effective estimation. Shannon mutual information falls behind Renyi and Tsallis entropy in all investigated indicators and depends strongly on the noise level. However, in terms of computational cost, Shannon mutual information is far ahead of the other measures and is therefore often preferred.
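To make the compared criteria concrete, below is a minimal Python sketch of how Shannon mutual information, Renyi entropy, and Tsallis entropy can be estimated from the joint brightness histogram of a reference image and its noisy (or deformed) counterpart. The histogram-based estimator, the function names, the bin count, and the orders alpha and q are illustrative assumptions and are not taken from the article.

```python
import numpy as np

def joint_histogram(img1, img2, bins=64):
    """Joint brightness histogram of two equally sized grayscale images,
    normalized to an estimate of the joint probability distribution."""
    hist, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    return hist / hist.sum()

def shannon_mutual_information(p_joint):
    """I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ), zero bins skipped."""
    px = p_joint.sum(axis=1, keepdims=True)   # marginal of the reference image
    py = p_joint.sum(axis=0, keepdims=True)   # marginal of the second image
    nz = p_joint > 0
    return np.sum(p_joint[nz] * np.log(p_joint[nz] / (px @ py)[nz]))

def renyi_entropy(p_joint, alpha=2.0):
    """H_alpha(p) = 1/(1 - alpha) * log( sum_i p_i^alpha ), alpha != 1."""
    p = p_joint[p_joint > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p_joint, q=2.0):
    """S_q(p) = 1/(q - 1) * (1 - sum_i p_i^q), q != 1."""
    p = p_joint[p_joint > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Usage on synthetic data: a Gaussian reference image distorted by
# unbiased additive Gaussian noise, as in the experimental setup.
rng = np.random.default_rng(0)
ref = rng.normal(size=(128, 128))
noisy = ref + 0.3 * rng.normal(size=(128, 128))
p = joint_histogram(ref, noisy)
print(shannon_mutual_information(p), renyi_entropy(p), tsallis_entropy(p))
```

In a registration loop, mutual information would be maximized over the deformation parameters, while the joint Renyi or Tsallis entropy would typically be minimized; the sketch only shows how the measures themselves are evaluated from image samples.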
Pages: 32-36
References
  1. Goshtasby A.A. Image registration. Principles, tools and methods // Advances in Computer Vision and Pattern Recognition. Springer. 2012. 441 p.
  2. D'Agostino E., Maes F., Vandermeulen D., Suetens P. An information theoretic approach for non-rigid image registration using voxel class probabilities // Med Image Anal. 2006. V. 6(3). P. 413−431.
  3. Viola P., Wells W.M. Alignment by maximization of mutual information // International Journal of Computer Vision. 1997. V. 24. P. 137−154.
  4. Voronov S.V. Ispolzovanie vzaimnojj informacii kak celevojj funkcii kachestva ocenivanija parametrov izobrazhenijj // Radiotekhnika. 2014. № 7. S. 88−94.
  5. Tashlinskijj A.G., KHoreva A.M., Smirnov P.V. Vybor konechnykh raznostejj pri nakhozhdenii psevdogradienta celevojj funkcii v procedurakh ocenivanija mezhkadrovykh deformacijj izobrazhenijj // Radiotekhnika. 2012. № 9. S. 56−60.
  6. Erdogmus D. Information theoretic learning: Renyi's entropy and its applications to adaptive system training. PhD thesis // University of Florida. USA. 2002.
  7. Sevim Y., Atasoy A. Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes // Turkish Journal of Electrical Engineering & Computer Sciences. 2012. V. 20. P. 287−297.
  8. Krasheninnikov V.R. Osnovy teorii obrabotki izobrazhenijj: Uchebnoe posobie // Uljanovsk: UlGTU. 2003. 152 s.
  9. Tashlinskijj A.G., Voronov S.V. Analiz celevykh funkcijj pri rekurrentnom ocenivanii mezhkadrovykh geometricheskikh deformacijj izobrazhenijj // Naukoemkie tekhnologii. 2013. T. 14. № 5. S. 16−21.