
Analysis of human eye movements when controlling a self-propelled chassis using a video-oculographic interface system

© 2017 Ya. A. Turovsky, S. D. Kurgalin, A. V. Alekseev

Voronezh State University, 394006, Voronezh, Universitetskaya pl., 1

Received 24 May 2016

The article presents the results of a study of human eye-movement parameters when a video-oculographic interface is used to control a self-propelled chassis. It is shown that over the course of the experiment, from run to run, the time subjects spent on a single run decreased, as did the number of errors and the number of subjects who failed to complete a run successfully. From the first to the third run, the amount of pupil movement in the "neutral" position increased, while the number of "neutral" commands decreased from run to run. The amplitude of eye movements for the "forward" command did not change significantly from run to run, whereas this index decreased when the "back" command was generated. For the indicator "pupil movement distance," two clusters were identified; one of them is characterized by a significantly greater distance travelled by the pupil for the "right," "left," and "back" commands. Over the course of training, the differences between the two clusters disappeared. Analysis of the results by the criterion of successful run completion showed that successful runs contained fewer "forward" and "back" commands, while the number of times the "neutral" position occurred and the time spent in it did not differ significantly.

Key words: eye-tracking, eye-tracking interface, human-computer interface

Cite: Turovsky Ya.A., Kurgalin S.D., Alekseev A.V. Analiz dvizheniya glaz cheloveka pri upravlenii samokhodnym shassi s ispolzovaniem sistemy videookulograficheskogo interfeisa [Analysis of human eye movements when controlling a self-propelled chassis using a video-oculographic interface system]. Sensornye sistemy [Sensory systems]. 2017. V. 31(1). P. 51-58 (in Russian).
