Journal Neurocomputers, № 11, 2015
Article in issue:
Aspects of human oculomotor behavior as applied to the construction of eye movement models
Authors:
M.A. Shurupova - Post-graduate Student, Department of Higher Nervous Activity, Biological Faculty, Lomonosov Moscow State University. E-mail: shurupova.marina.msu@gmail.com
V.N. Anisimov - Ph.D. (Biol.), Leading Research Scientist, Department of Higher Nervous Activity, Biological Faculty, Lomonosov Moscow State University. E-mail: victor_anisimov@neurobiology.ru
A.V. Krasnoperov - Ph.D. (Phys.-Math.), Senior Research Scientist, Laboratory of Nuclear Problems, Joint Institute for Nuclear Research (Dubna, Moscow Region). E-mail: alexei.krasnoperov@jinr.ru
A.V. Latanov - Dr.Sc. (Biol.), Professor, Head of the Department of Higher Nervous Activity, Biological Faculty, Lomonosov Moscow State University. E-mail: latanov@neurobiology.ru
Abstract:
There is currently broad interest in building models of eye movements, whether as virtual solutions [11] or as robotic systems [2]. At the same time, many of these solutions are simplifications of real eye movements, which have a highly sophisticated organization. As a result, developers overlook many of the eye movement details that our eyes produce every second while scanning the surrounding environment. Some of these details cannot be captured by expert estimation during direct observation of eye movements and can only be revealed by computing statistics over large matrices of gaze coordinates.
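As a minimal illustration of the kind of processing implied here, the sketch below segments a raw gaze coordinate matrix into fixations and saccades using a simple velocity threshold (the standard I-VT scheme) and extracts fixation durations. The sampling rate, threshold value, and array layout are assumptions made for the example; they are not parameters taken from the study.

```python
import numpy as np

def segment_gaze(gaze_xy_deg, sample_rate_hz=500.0, velocity_threshold_deg_s=30.0):
    """Split a gaze trace into fixations and saccades with a velocity threshold (I-VT).

    gaze_xy_deg: (N, 2) array of gaze coordinates in degrees of visual angle (assumed layout).
    Returns two lists of (start_index, end_index) sample ranges: fixations and saccades.
    """
    dt = 1.0 / sample_rate_hz
    # Point-to-point angular velocity in deg/s.
    velocity = np.linalg.norm(np.diff(gaze_xy_deg, axis=0), axis=1) / dt
    is_saccade = velocity > velocity_threshold_deg_s

    fixations, saccades = [], []
    start = 0
    for i in range(1, len(is_saccade)):
        if is_saccade[i] != is_saccade[i - 1]:
            # Close the current run of samples and start a new one.
            (saccades if is_saccade[i - 1] else fixations).append((start, i))
            start = i
    (saccades if is_saccade[-1] else fixations).append((start, len(is_saccade)))
    return fixations, saccades

def fixation_durations_ms(fixations, sample_rate_hz=500.0):
    """Fixation durations in milliseconds from (start, end) sample ranges."""
    return [(end - start) * 1000.0 / sample_rate_hz for start, end in fixations]
```

From segments like these, distributions of fixation durations and saccade amplitudes can be accumulated over many participants and scenes, which is the kind of statistics discussed above.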
Eye movements observed in real time reflect the processes of visual attention and mental activity. For accurate visual processing, the eyes execute about five (or even more) movements per second, continuously following the region of interest in order to project its image onto the fovea. This is why the eyes alternate saccades and fixations: during fixations the brain analyzes details and encodes the image into memory [6].
Several studies have shown relationships between fixation durations and saccade amplitudes [5, 21, 23]. They revealed a definite function linking the parameters of fixations and saccades: a short fixation is followed by a long saccade and, conversely, after a long fixation the eye executes a low-amplitude saccade [20]. The first case exemplifies the ambient (dynamic) mode of vision, the second the focal (static) mode.
In this work we discuss the results of analyzing eye movement patterns during the viewing of static and dynamic scenes. We found a relationship between eye movement parameters and the visual mode, and we show how the function linking fixations and saccades depends on the cognitive task. The eye movement parameters are related to strongly expressed temporal and spatial features.
When a viewing task is given for static and dynamic scenes, fixation durations decrease compared with free viewing of the same scenes. We found that both fixation durations and saccade amplitudes are higher for dynamic scenes. We also averaged the data of both viewing conditions across all scenes, separately for static and dynamic stimuli, and constructed functions of the amplitude of the next saccade versus the duration of the current fixation. When participants received a task for the current scene and began to examine it, the eye movement parameters varied. We found a separation into ambient and focal modes at 160 ms for both static and dynamic viewing. This separation reflects crucial mechanisms of human eye movement strategy, and the observed changes clearly show the difference in viewing strategy between the early and later periods of scene viewing.
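A schematic version of this analysis, assuming the fixations and the saccades that follow them are already paired as (fixation duration in ms, next-saccade amplitude in degrees), could look like the sketch below. The 160 ms boundary mirrors the separation described above, while the bin width and the data structure are hypothetical choices for the illustration.

```python
from collections import defaultdict

AMBIENT_FOCAL_BOUNDARY_MS = 160.0  # separation reported above for both static and dynamic viewing

def next_saccade_amplitude_by_duration(pairs, bin_width_ms=40.0):
    """Mean amplitude of the next saccade as a function of current fixation duration.

    pairs: iterable of (fixation_duration_ms, next_saccade_amplitude_deg).
    Returns {bin_start_ms: mean_amplitude_deg}, i.e. the empirical function
    relating the two parameters, averaged within duration bins.
    """
    bins = defaultdict(list)
    for duration_ms, amplitude_deg in pairs:
        bin_start = (duration_ms // bin_width_ms) * bin_width_ms
        bins[bin_start].append(amplitude_deg)
    return {b: sum(a) / len(a) for b, a in sorted(bins.items())}

def split_ambient_focal(pairs):
    """Split fixation-saccade pairs into the ambient mode (short fixations,
    large saccades) and the focal mode (long fixations, small saccades)
    at the 160 ms boundary."""
    ambient = [p for p in pairs if p[0] < AMBIENT_FOCAL_BOUNDARY_MS]
    focal = [p for p in pairs if p[0] >= AMBIENT_FOCAL_BOUNDARY_MS]
    return ambient, focal
```

Averaging such functions separately for static and dynamic stimuli, and for free versus task viewing, yields the comparisons discussed in this paragraph.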
The results also reveal competition between two types of visual attention in scene viewing: top-down and bottom-up attention. We were able to establish that human eye movements are executed under a strict relationship between low-level oculomotor factors and high-level cognitive aspects of mental activity. This is a kind of involuntary interaction that is hardly observable through expert estimation, but that nevertheless takes place during routine eye movement activity in scene viewing.
Thus, when viewing static and dynamic scenes, it is possible to obtain a cluster of saccades together with their associated fixations, and this cluster characterizes the processes of global visual attention.
The results and their interpretation are important both from the neurophysiological point of view and for building eye movement models and creating artificial vision systems.
Pages: 48-55
References
- Bartlett M., Movellan J., Sejnowski T. Face recognition by independent component analysis // IEEE Transactions on Neural Networks. 2002. V. 13. № 6. P. 1450-1464.
- Berthouze L., Bakker P., Kuniyoshi Y. Learning of Oculo-Motor Control: a Prelude to Robotic Imitation // Intelligent Robots and Systems. IROS 96. Proceedings of the 1996 IEEE/RSJ International Conference. IEEE. 1996. V. 1. P. 376-381.
- Dorr M., Martinetz T., Gegenfurtner K., Barth E. Variability of eye movements when viewing dynamic natural scenes // Journal of vision. 2010. V. 10. № 10. P. 1-17.
- Falchier A., Kennedy H. Connectivity of areas V1 and V2 in the monkey is profoundly influenced by eccentricity // FENS Abst. 2002. V. 1. P. 51-58.
- Frost D., Pöppel E. Different Programming Modes of Human Saccadic Eye-Movements as a Function of Stimulus Eccentricity - Indications of a Functional Subdivision of Visual-Field // Biological Cybernetics. 1976. V. 23. № 1. P. 39-48.
- Henderson J., Weeks J., Phillip A., Hollingworth A. The effects of semantic consistency on eye movements during complex scene viewing // Journal of Experimental Psychology: Human Perception and Performance. 1999. V. 25. P. 210-228.
- Henderson J. Regarding scenes // Curr. Dir. Psychol. Sci. 2007. V. 16. № 4. P. 219-222.
- Itti L., Koch C. A saliency-based search mechanism for overt and covert shifts of visual attention // Vision Research. 2000. V. 40. № 10. P. 1489-1506.
- Le Meur O., Le Callet P., Barba D., Thoreau D. A coherent computational approach to model the bottom-up visual attention // IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI). 2006. V. 28. № 5. P. 802-817.
- Manor B., Gordon E. Defining the temporal threshold for ocular fixation in free-viewing visuocognitive tasks // Journal of Neuroscience Methods. 2003. V. 128. P. 85-93.
- Sagar M.A., Bullivant D., Mallinson G.D., Hunter P.J. A Virtual Environment and Model of the Eye for Surgical Simulation // Proceedings of the 21st annual conference on Computer graphics and interactive techniques. ACM. 1994. P. 205-212.
- Mills M., Van der Stigchel S., Hollingworth A., Hoffman L., Dodd M. Examining the influence of task-set on eye movements and fixations // Journal of Vision. 2011. V. 11. № 8. P. 1-15.
- Norman J. Two visual systems and two theories of perception // Behavioral and Brain Sciences. 2002. V. 25. № 1. P. 73-144.
- Nuthmann A., Smith T., Engbert R., Henderson J. CRISP: A computational model of fixation durations in scene viewing // Psychological Review. 2010. V. 117. P. 382-405.
- Pannasch S., Helmert J., Roth K., Herbold A., Walter H. Visual fixation durations and saccade amplitudes: Shifting relationship in a variety of conditions // Journal of Eye Movement Research. 2008. V. 2. P. 1-19.
- Petersen S., Posner M. The Attention System of the Human Brain: 20 Years After // Annual Review of Neuroscience. 2012. V. 35. P. 73-89.
- Rayner K. Eye Movements in Reading and Information Processing: 20 Years of Research // Psychological Bulletin. 1998. V. 124. № 3. P. 372-422.
- Smith T., Mital P. Attentional synchrony and the influence of viewing task on gaze behavior in static and dynamic scenes // Journal of Vision. 2013. V. 13. № 8. P. 1-24.
- Tatler B., Vincent B. Systematic tendencies in scene viewing // Journal of Eye Movement Research. 2008. V. 2. № 2. P. 1-18.
- Velichkovsky B., Dornhoefer S., Pannasch S., Unema P. Visual Fixations and Level of Attentional Processing // Eye tracking research and applications / Ed. A. Duchowski. New York: ACM Press. 2000. P. 79-85.
- Velichkovsky B., Rothert A., Kopf M., Dornhoefer S., Joos M. Towards an express diagnostics for level of processing and hazard perception // Transportation Research, Part F. 2002. V. 5. № 2. P. 145-156.
- Velichkovsky B., Joos M., Helmert J., Pannasch S. Two visual systems and their eye movements: Evidence from static and dynamic scene perception // Proceedings of the XXVII Conference of the Cognitive Science Society / Eds. B. Bara, L. Barsalou, M. Bucciarelli. Mahwah, NJ: Lawrence Erlbaum. 2005. P. 2283-2288.
- Unema P., Pannasch S., Joos M., Velichkovsky B. Time course of information processing during scene perception: The relationship between saccade amplitude and fixation duration // Visual Cognition. 2005. V. 12. № 3. P. 473-494.
- Ungerleider L., Mishkin M. Two cortical visual systems // Analysis of visual behavior / Eds. D. Ingle, M. Goodale, R. Mansfield. Cambridge, MA: MIT Press. 1982. P. 549-586.