100 years of eye-gaze tracking

Deepak Akkil · Published in Gaze tracking · 7 min read · Jul 19, 2019

The technology of eye-gaze tracking has gone through over a century of research and development. The question of why and how our eyes move has intrigued humans for centuries, and many brilliant minds have devised technological devices to help measure and analyse the characteristics of our eye movements.

Eye/gaze tracking is now on the verge of becoming mainstream technology, with interest from all the major tech companies in the consumer market. In this article, I will trace the history of the technology and its journey towards mainstream maturity.


The earliest eye-movement sensing devices were large, expensive and required a direct physical connection to the eyes (e.g. the devices developed by Lamare and Delabarre [15]). They used eye cups made from plaster of Paris and required the subject's eyes to be anaesthetised (using cocaine!) [15].

In 1901, Dodge and Cline [3] developed the 'Dodge photochronograph', the earliest non-invasive gaze tracker, which recorded the light reflected from the cornea onto a photographic plate. Their system had two major shortcomings: it recorded eye movements only in the horizontal direction, and it required the head to be held as still as possible.

This was followed by the development of the first EOG-based eye-position sensor in the 1920s [12]. Electro-oculography relies on the standing electric potential difference between the cornea and the retina of the eye [12]: the cornea is 0.40 mV to 1.0 mV positively charged relative to the retina. As the eyes move, this electric dipole moves with them, causing a variation in electric potential around the eyes. Skin electrodes strategically placed around the eyes can detect this variation and thereby measure the movement of the eyes relative to the head.
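The principle can be illustrated with a minimal sketch (my own, not from any cited system): within roughly ±30 degrees the electrode voltage varies approximately linearly with horizontal gaze angle, so a two-point calibration is enough to map millivolts to degrees. All function names and the example voltages are assumptions for illustration.

```python
# Illustrative EOG sketch: the horizontal electrode voltage difference is
# roughly linear in gaze angle, so two fixations at known angles give a
# gain and offset that convert any later voltage reading to degrees.

def calibrate(v_left: float, v_right: float,
              angle_left: float = -30.0, angle_right: float = 30.0):
    """Return (gain, offset) from voltages recorded while the subject
    fixates two known horizontal angles (degrees)."""
    gain = (angle_right - angle_left) / (v_right - v_left)
    offset = angle_left - gain * v_left
    return gain, offset

def voltage_to_angle(v: float, gain: float, offset: float) -> float:
    """Convert a horizontal electrode voltage difference (mV) to degrees."""
    return gain * v + offset

# Hypothetical readings: fixating -30 deg gives -0.6 mV, +30 deg gives +0.6 mV.
gain, offset = calibrate(-0.6, 0.6)
print(voltage_to_angle(0.3, gain, offset))  # 15.0 degrees
```

Real EOG signals drift and are noisy, which is why frequent recalibration was (and is) a practical limitation of the method.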

The year 1947 saw an interesting application of eye-movement data. Fitts et al. [4] studied a series of images of pilots looking at cockpit instruments while approaching the landing runway, manually estimating the pilots' point of gaze from the images. The purpose of the study was to understand the problems with the layout of cockpit instrumentation and to make the arrangement easier to read. The study focussed on specific questions, such as whether pilots look at some instruments more often than others, and whether some instruments take more time to read. This marks an important milestone in the field of Human-Computer Interaction, as one of the earliest uses of gaze information to understand and improve human-machine interaction.

The years between 1948 and 1974 witnessed several key developments in gaze-tracking technology and in our understanding of eye movements. The head-mounted gaze tracker [5], the scleral search coil [13] and the remote gaze-tracking system [9, 12] are examples of the technological advancements of this period.

In 1965, the influential work of Alfred Yarbus [15] made a significant contribution to our understanding of eye movements and visual perception. Yarbus showed that the movements of the eyes are influenced not just by the visual scene (e.g. objects in the environment), but also by the information needs of the observer's task.

Eye-movement recordings by Yarbus (image source)

The advent of personal computers in the late 1980s led to increased interest in gaze tracking as a technology to help severely disabled users interact with computing devices. For example, severely disabled users could type by looking at specific letters on an onscreen keyboard. The use of gaze tracking as an assistive technology represents its first use for direct, real-time interaction with computing devices [11].

During the 1990s and early 2000s, many studies focussed on the use of gaze tracking for mainstream HCI. Notable research from this period includes MAGIC pointing [16] and the iDict translation aid [6]. MAGIC pointing combines gaze and mouse input: the user's gaze eliminates large mouse movements, while the mouse enables fine target selection. iDict is a reading aid that monitors a reader's comprehension of foreign-language text and, based on the gaze pattern, provides real-time translation of difficult words.
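The core idea of MAGIC pointing can be sketched in a few lines. This is a hypothetical simplification, not Zhai et al.'s implementation [16]: the cursor warps to the gaze position for coarse travel, and the mouse handles only the final, fine adjustment (compensating for gaze-estimation error). The threshold value and function names are my own assumptions.

```python
# Sketch of the MAGIC pointing idea: warp the cursor to the gaze point
# when it is far away, then let small mouse movements refine the position.

import math

WARP_THRESHOLD_PX = 120  # warp only when gaze is far from the cursor (assumed)

def magic_pointer(cursor, gaze, mouse_delta):
    """Return the new cursor position.

    cursor, gaze: (x, y) positions in pixels
    mouse_delta: (dx, dy) physical mouse movement since the last frame
    """
    distance = math.hypot(gaze[0] - cursor[0], gaze[1] - cursor[1])
    if distance > WARP_THRESHOLD_PX:
        # Coarse phase: jump to the gaze point, skipping the long drag.
        cursor = gaze
    # Fine phase: the (small) mouse movement refines the position.
    return (cursor[0] + mouse_delta[0], cursor[1] + mouse_delta[1])

# Gaze lands on a far target; one small mouse nudge finishes the selection.
print(magic_pointer((0, 0), (800, 600), (5, -3)))  # (805, 597)
```

The threshold matters: warping on every gaze sample would make the cursor jittery, since gaze estimates are noisy and the eyes move constantly.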

More recent technological advancements have enabled significant improvements in gaze tracking, in terms of both tracking accuracy and ergonomics. A glance at the field reveals two noticeable trends: first, a growing shift in the HCI gaze research community towards everyday gaze tracking and mobile gaze-based interaction [2]; second, the growing availability of affordable gaze-tracking devices for mainstream consumers (e.g. Tobii 4C, FOVE, HTC Vive Pro).

Moving beyond stationary use scenarios, researchers are investigating the feasibility and benefits of gaze trackers integrated into mobile phones and into wearable devices such as smartglasses and smartwatches. Wearable gaze trackers enable a user's gaze to be tracked continuously across a wide range of situations, opening opportunities for innovative applications: hassle-free gaze-based interaction with the Internet of Things (IoT), hands-free interaction with smartwatches [1], self-quantification applications such as counting the number of words read [8], and pervasive mental-health monitoring systems [14].

Gaze tracking continues to be used as a research tool in controlled lab environments and as an assistive device. At the same time, newer commercial application domains and scenarios are starting to emerge. The year 2013 marks a significant milestone: Eye Tribe, a Danish start-up, launched the first commercial low-cost remote gaze tracker, a major step towards making the technology accessible to the consumer market. Eye Tribe targeted mainly software developers, who could incorporate gaze tracking into their applications. In 2014, Tobii launched the Tobii EyeX, the first gaze-tracking device designed for gaming in the mainstream consumer market. In the same year, FOVE released the first commercial virtual-reality headset with built-in gaze tracking. In 2017, Microsoft announced eye control for Windows 10, a leap forward in the assistive capabilities of the Windows OS.

Eye tracking is rapidly maturing into mainstream consumer technology. In 2019, it increasingly seems that eye tracking will be a key part of our future augmented-reality and virtual-reality experiences. The Microsoft HoloLens 2, HTC Vive Pro Eye, Magic Leap One and Varjo XR-1 are just some of the AR/VR headsets launched with built-in eye tracking.

Where will eye tracking go from here? Let's keep our eyes on the future!

[1] Akkil, D. et al. (2015). Glance awareness and gaze interaction in smartwatches. Proceedings of CHI 2015.
[2] Bulling, A., & Gellersen, H. (2010). Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing.
[3] Dodge, R., & Cline, T. S. (1901). The angle velocity of eye movements. Psychological Review, 8(2).
[4] Fitts, P. et al. (1950). Eye movements of aircraft pilots during instrument-landing approaches. Aeronautical Engineering Review.
[5] Hartridge, H., & Thomson, L. C. (1948). Methods of investigating eye movements. British Journal of Ophthalmology, 581–591.
[6] Hyrskykari, A. et al. (2000). Design issues of iDICT: a gaze-assisted translation aid. Proceedings of ETRA 2000.
[7] Jacob, R. J. K., & Karn, K. S. (2003). Eye tracking in human-computer interaction and usability research: ready to deliver the promises.
[8] Kunze, K. et al. (2015). How much do you read? Counting the number of words a user reads using electrooculography. Proceedings of Augmented Human 2015.
[9] Lambert, R. H. et al. (1974). High-speed data processing and unobtrusive monitoring of eye movements. Behavior Research Methods & Instrumentation, 6(6), 525–530.
[10] Levy-Schoen, A. (1968). Eye movements and vision. Neuropsychologia, 6(4), 389–390.
[11] Majaranta, P., & Räihä, K. (2002). Twenty years of eye typing: systems and design issues. Proceedings of ETRA 2002.
[12] Merchant, J. et al. (1974). Remote measurement of eye direction allowing subject motion over one cubic foot of space. IEEE Transactions on Biomedical Engineering.
[12] Mowrer, O. et al. (1936). The corneo-retinal potential difference as the basis of the galvanometric method of recording eye movements. American Journal of Physiology.
[13] Robinson, D. (1963). A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Transactions on Bio-Medical Electronics, 10, 137–145.
[14] Vidal, M. et al. (2012). Wearable eye tracking for mental health monitoring. Computer Communications, 35(11), 1306–1311.
[15] Wade, N. J. (2015). How were eye movements recorded before Yarbus? Perception, 44(8–9), 851–883.
[16] Zhai, S. et al. (1999). Manual and gaze input cascaded (MAGIC) pointing. Proceedings of CHI 1999.
