Eye Gaze System
Eye gaze systems are input devices, typically mounted on the user's head, that can be used to operate a computer by tracking the user's eye gaze. Eye tracking is the process of following the movement of the eyes and determining where the user is looking on the screen, allowing the user to navigate a computer without the use of their hands or voice. These technologies can be very useful for people with limited voluntary motor control, such as those with cerebral palsy, spinal cord or brain injury, or muscular dystrophy; these users do, however, need voluntary control over their eye movements. In this type of system, control keys are displayed on the screen, and by briefly resting their gaze on a control key a user can operate the computer, type words, and navigate the internet, among other typical computer functions.
Research into eye gaze input systems began many years ago, notably with the creation of ERICA in 1989 (Hutchinson, White, Martin, Reichert, & Frey, 1989). By the year 2000, the Eye-gaze Response Interface Computer Aid (ERICA) had been optimized for the Windows operating system, allowing individuals to reliably perform all mouse and keyboard actions with their eye gaze (Lankford, 2000). Over the past two decades, as computer processing power has increased, eye tracking has progressed to the point where these systems can accurately measure eye gaze in real time (Majaranta & Bulling, 2014).
While generally effective, these systems are not always easy to use or perfectly accurate. They require user calibration in order to maximize eye gaze location accuracy. Furthermore, our eyes are constantly making micro-movements, which can produce a 'jittery' cursor that some individuals find bothersome. It is therefore recommended to smooth the cursor movement in the system settings, as a stable cursor is more comfortable to use (Majaranta & Bulling, 2014).
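The cursor smoothing described above is often implemented as a simple low-pass filter over the raw gaze samples. The sketch below is only an illustration of the general idea, using an exponential moving average; the function name, the `alpha` parameter, and its default value are illustrative assumptions, not details from any particular eye gaze product.

```python
def make_smoother(alpha=0.2):
    """Return a function that smooths raw (x, y) gaze samples with an
    exponential moving average. Smaller alpha -> more smoothing (more
    stable, but slower to follow real eye movements)."""
    state = {"x": None, "y": None}

    def smooth(x, y):
        if state["x"] is None:  # first sample: no history to average with
            state["x"], state["y"] = float(x), float(y)
        else:
            # Move the smoothed position a fraction of the way toward
            # the new raw sample, damping micro-movement jitter.
            state["x"] += alpha * (x - state["x"])
            state["y"] += alpha * (y - state["y"])
        return state["x"], state["y"]

    return smooth
```

With `alpha=0.2`, a sudden 10-pixel jump in the raw gaze moves the displayed cursor only 2 pixels per sample, which is why a heavier smoothing setting trades responsiveness for comfort.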
Traditionally, a user makes a selection with these systems by resting their eyes briefly on a point on the screen. As a result, a common complaint of eye-gaze system users is accidental clicks. To address this issue, a dwell time is recommended: by introducing a brief delay, the system can differentiate between simple viewing and a gaze-controlled click, which reduces erroneous clicks. The appropriate duration depends on the person; expert users often prefer shorter times (e.g., 300 ms) while novice users prefer longer times (e.g., 1000 ms; Majaranta, 2012).
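The dwell-time mechanism can be sketched as a small state machine: a click fires only when the gaze stays near one spot for at least the dwell duration. This is a minimal illustration, not any vendor's implementation; the class name, the 40-pixel fixation radius, and the default dwell time are illustrative assumptions (the 300 ms/1000 ms values from the text above would be plugged into `dwell_ms` per user).

```python
class DwellClicker:
    """Emit a 'click' once the gaze has stayed within a small radius of
    one spot for at least dwell_ms milliseconds."""

    def __init__(self, dwell_ms=1000, radius_px=40):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self.anchor = None    # (x, y) where the current fixation began
        self.start_ms = None  # timestamp of that first sample

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return True exactly when a click fires."""
        if self.anchor is None or self._moved(x, y):
            # Gaze left the fixation area (or no fixation yet):
            # start timing a new fixation; mere viewing never clicks.
            self.anchor, self.start_ms = (x, y), t_ms
            return False
        if t_ms - self.start_ms >= self.dwell_ms:
            self.anchor, self.start_ms = None, None  # reset after the click
            return True
        return False

    def _moved(self, x, y):
        ax, ay = self.anchor
        return (x - ax) ** 2 + (y - ay) ** 2 > self.radius_px ** 2
```

Lowering `dwell_ms` for an expert user makes selection faster but narrows the margin between deliberate fixation and ordinary reading, which is exactly the novice/expert trade-off described above.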
The usability of eye-gaze input systems has improved drastically, and in some studies they have even been shown to be faster than a mouse. One study comparing usability across age groups found that eye-gaze input led to faster pointing times than mouse input, especially for older adults (Murata, 2006). The researcher suggested that eye gaze systems may compensate for the declining motor function that affects older adults' use of a mouse (Murata, 2006).
More recently, eye gaze systems have been combined with other input tools in efforts to overcome some of their limitations (accidental clicking, eye strain, etc.). One team combined an eye gaze system with a brain-computer interface. While the details of this system are beyond the scope of this review, it was very effective and capable of dealing with different stimulus complexities (Zander, Gaertner, Kothe, & Vilimek, 2010). Another team combined eye gaze input with a switch (a button used to click) and found that performance speed increased by around 50% over using an eye gaze system alone; however, the authors note that this may not be feasible for many eye gaze system users due to limited mobility (Cecotti, 2016). While current eye gaze systems have been shown to be effective, many researchers are working to further the usability, speed, and accuracy of these systems. These endeavours include projects such as 'silicon retinas' that would greatly increase eye tracking speed across a variety of environmental lighting conditions (Liu & Delbruck, 2010).
Research Rating: Because the information cited in this description comes from experimental research, it can be considered valid and reliable.
Can be used effectively by those with very limited mobility
Can complete all computer functions
Becoming increasingly affordable and mobile
Require comprehensive training in order to use
While efficient alone, may be more efficient when paired with a switch or brain-computer interface.
Software settings will need to be tweaked as the user progresses from novice to expert skills (e.g., dwell click time may need to be reduced)
When choosing an eye tracking system, one should pay attention to the hardware’s gaze tracking features as well as the accompanying software and additional accessories. This process should be completed under the supervision of an Occupational Therapist or other healthcare professional.
Special Consideration: Workflow
Exact prices change frequently, which is why only approximate ranges are listed.
$ - Under $5
$$ - Between $6 and $50
$$$ - Between $51 and $250
$$$$ - Over $250
Cecotti, H. (2016). A multimodal gaze-controlled virtual keyboard. IEEE Transactions on Human-Machine Systems, 46(4), 601-606.
Hutchinson, T. E., White, K. P., Martin, W. N., Reichert, K. C., & Frey, L. A. (1989). Human-computer interaction using eye-gaze input. IEEE Transactions on Systems, Man, and Cybernetics, 19(6), 1527-1534.
Lankford, C. (2000, November). Effective eye-gaze input into Windows. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications (pp. 23-27). ACM.
Liu, S. C., & Delbruck, T. (2010). Neuromorphic sensory systems. Current Opinion in Neurobiology, 20, 1-8.
Majaranta, P. (2012). Communication and text entry by gaze. Gaze interaction and applications of eye tracking: Advances in assistive technologies, 63-77.
Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human-computer interaction. In Advances in Physiological Computing (pp. 39-65). Springer, London.
Murata, A. (2006). Eye-gaze input versus mouse: Cursor control as a function of age. International Journal of Human-Computer Interaction, 21(1), 1-14.
Zander, T. O., Gaertner, M., Kothe, C., & Vilimek, R. (2010). Combining eye gaze input with a brain–computer interface for touchless human–computer interaction. Intl. Journal of Human–Computer Interaction, 27(1), 38-51.
Written by Harrison McNaughtan, Last Revision May 2018