
Frequency-time features of eye movement when using the video-oculographic interface in ergatic system control problems

© 2021 Y. A. Turovski, A. V. Alekseev, I. E. Lesnykh, E. V. Martynenko

Voronezh State University, 394018 Voronezh, Universitetskaya square, 1, Russia
V.A. Trapeznikov Institute of Control Sciences of the Russian Academy of Sciences, 117997 Moscow, Profsoyuznaya street, 65, Russia

Received 15 Jul 2020

The paper develops the concept of the capabilities of video-oculographic interfaces in control tasks. The results of a study of human eye-movement parameters when using a video-oculographic interface to control an object on a plane are presented. It is shown that over the course of the experiment the number of errors and the number of subjects who failed to achieve successful control decreased from attempt to attempt, and that the ability to exercise such control depends on the subject's temperament and working-memory characteristics. At high working-memory scores, users make more sharp, high-amplitude pupil movements with periods of up to 1.6 s, forming a control pattern that ultimately leads to more control errors and fails to achieve the desired result. These effects are associated largely with horizontal rather than vertical eye movements. The results will be useful for creating and applying human-computer interfaces for digital monitoring in the control of ergatic systems, and can serve as a starting point for developing high-speed oculographic interfaces with substantially broader functionality than existing ones.
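The sharp, high-amplitude pupil movements with periods of up to 1.6 s described above can be characterized by simple time-domain analysis of the gaze trace. The sketch below is illustrative only (the paper's actual processing pipeline is not reproduced here): it detects rapid movements in a synthetic horizontal eye-position signal with a velocity threshold and measures the intervals between them. All names, the sampling rate, and the threshold value are assumptions for the example.

```python
import numpy as np

def movement_periods(x, fs, vel_thresh):
    """Detect sharp, high-amplitude movements in a 1-D horizontal
    eye-position trace x (sampled at fs Hz) via a velocity threshold,
    and return the intervals (s) between successive movement onsets."""
    v = np.abs(np.diff(x)) * fs                 # point-to-point speed
    fast = v > vel_thresh                       # samples during rapid movement
    # onset = first sample of each run of above-threshold samples
    onsets = np.flatnonzero(fast & ~np.r_[False, fast[:-1]])
    return np.diff(onsets) / fs                 # seconds between onsets

# Synthetic trace: a gaze position that jumps every 1.6 s (the period
# scale mentioned in the abstract), plus small fixation noise.
fs = 100.0
t = np.arange(0.0, 16.0, 1.0 / fs)
rng = np.random.default_rng(0)
steps = np.zeros(t.size)
steps[np.arange(0, t.size, int(1.6 * fs))] = rng.choice([-5.0, 5.0], size=10)
x = np.cumsum(steps) + rng.normal(scale=0.05, size=t.size)

periods = movement_periods(x, fs, vel_thresh=100.0)
print(np.round(np.median(periods), 2))   # → 1.6
```

A velocity threshold of this kind is a common first pass for separating saccade-like jumps from fixation noise; the frequency-time structure of the resulting pattern can then be examined further, e.g. with the wavelet methods cited in the reference list.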

Key words: oculographic interface, ergatic systems, digital monitoring, human-computer interface

DOI: 10.31857/S0235009221010091

Cite: Turovski Y. A., Alekseev A. V., Lesnykh I. E., Martynenko E. V. Chastotno-vremennye osobennosti dvizheniya glaz pri ispolzovanii videookulograficheskogo interfeisa v zadachakh upravleniya ergaticheskimi sistemami [Frequency-time features of eye movement when using the video-oculographic interface in ergatic system control problems]. Sensornye sistemy [Sensory systems]. 2021. V. 35(1). P. 30–37 (in Russian). doi: 10.31857/S0235009221010091

References:

  • Vojtov V.K., Kosihin V.V., Ushakov D.V. Rabochaya pamyat' kak perspektivnyj konstrukt kognitivnoj psihologii i metody ego izmereniya [Working memory as a promising construct of cognitive psychology and methods of its measurement]. Modelirovanie i analiz dannyh [Data modeling and analysis]. 2015. P. 57–78 (in Russian).
  • Glanc S. Mediko-biologicheskaya statistika [Biomedical statistics]. Moscow. Praktika, 1998. 459 p. (in Russian).
  • Dobeshi I. Desyat’ lekcij po vejvletam [Ten lectures on wavelets]. Moscow. NIC “Regulyarnaya i haoticheskaya dinamika” [SIC “Regular and Chaotic Dynamics”], 2001. 464 p. (in Russian).
  • Runion R. Spravochnik po neparametricheskoj statistike [Handbook of nonparametric statistics]. Sovremennyj podhod [Modern approach]. Moscow. Finansy i statistika [Finance and statistics]. 1982. 198 p. (in Russian)
  • Turovskij Ya.A., Alekseev A.V. Variabel’nost’ serdechnogo ritma pol’zovatelej videookulograficheskim interfejsom v processe obucheniya upravleniem samohodnym shassi [Heart rate variability of users of the video-oculographic interface in the process of learning how to operate a self-propelled chassis]. Vestnik VGU [Bulletin of VSU]. 2017a. № 1. P. 118–124 (in Russian)
  • Turovskij Ya.A., Borzunov S.V., Vahtin A.A., Alekseev A.V., Mamaev A.V. Variabel’nost’ serdechnogo ritma v hode obucheniya pol’zovatelej primeneniyu interfeisov chelovek-komp’yuter [Heart rate variability in the course of training users to use human-computer interfaces]. Vestnik VGU [Bulletin of VSU]. 2018. № 2. P. 255–263 (in Russian)
  • Turovskij Ya.A., Kurgalin S.D., Alekseev A.V. Analiz dvizheniya glaz cheloveka pri upravlenii samohodnym shassi s ispol’zovaniem sistemy videookulograficheskogo interfeisa [Analysis of human eye movements in the management of self-propelled chassis with video eye-tracking interface system]. Sensornye sistemy [Sensory systems]. 2017b. V. 31. № 1. P. 51–58 (in Russian).
  • Bissoli A., Lavino-Junior D., Sime M., Encarnação L., Bastos-Filho T. A human–machine interface based on eye tracking for controlling and monitoring a smart home using the internet of things. Sensors. 2019. V. 19 (4). Article 859. https://doi.org/10.3390/s19040859
  • Martin W.C. Upper Limb Prostheses: A review of the literature with a focus on myoelectric hands. WorkSafeBC Evidence-Based Practice Group. 2011. 90 p.
  • Microsoft. http://www.microsoft.com/en-us/kinectforwindows 2009.
  • Oyekoya O.K., Stentiford F.W.M. Eye tracking as a new interface for image retrieval. BT Technology Journal. 2004. P. 161–169. https://doi.org/10.1023/B:BTTJ.0000047130.98920.2b
  • Tobii. http://www.tobii.com/en/eye-tracking-research/global/products/ 2010.
  • Wolpaw J.R., Birbaumer N., McFarland D.J. Brain–computer interfaces for communication and control. Clinical Neurophysiology. 2002. V. 113. P. 767–791.
  • Zhu D., Bieger J., Molina G. A Survey of Stimulation Methods Used in SSVEP-Based BCIs. Computational Intelligence and Neuroscience. 2010. Article ID 702357.