A researcher at Stanford University has created an application that allows users to interact with a computer simply by looking at the screen and tapping a single key.
Manu Kumar, a 32-year-old doctoral student at Stanford, developed the EyePoint software as part of the Gaze-enhanced User Interface Design (Guide) project.
The research aims to produce standard eye-tracking hardware that allows users to perform basic mouse operations with a combination of gaze and hotkeys.
If a user wants to follow a hyperlink, for example, they hold down a predefined hotkey and the part of the screen they are looking at is magnified. The user then focuses on the desired link within the magnified view and releases the key to ‘click’ it.
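The press-magnify-release interaction can be sketched as a small state machine. This is an illustrative reconstruction, not the actual EyePoint code: the class and method names, the zoom factor, and the coordinate mapping are all assumptions about how such an interaction could work.

```python
# Hypothetical sketch of the gaze-plus-hotkey "look and click" interaction:
# pressing the hotkey magnifies the region around the gaze point, and
# releasing it maps the refined gaze back to screen coordinates to click.
# All names here (GazeClicker, Point) are illustrative, not from EyePoint.

from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


class GazeClicker:
    """Key press magnifies around the gaze point; key release clicks
    at the gaze point mapped back out of the magnified view."""

    def __init__(self, zoom: float = 4.0):
        self.zoom = zoom
        self.center = None  # magnification center while the key is held

    def key_down(self, gaze: Point) -> Point:
        # Magnify the region the user is currently looking at.
        self.center = gaze
        return self.center

    def key_up(self, gaze: Point) -> Point:
        # Undo the magnification: a gaze point in the zoomed view
        # corresponds to a point zoom-times closer to the center.
        assert self.center is not None, "key_up without key_down"
        cx, cy = self.center.x, self.center.y
        target = Point(cx + (gaze.x - cx) / self.zoom,
                       cy + (gaze.y - cy) / self.zoom)
        self.center = None  # reset for the next interaction
        return target
```

With a 4x zoom, a gaze that drifts 4 pixels in the magnified view refines the click target by only 1 pixel on screen, which is why the magnification step compensates for the limited accuracy of eye tracking.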
“Eye-tracking technology was developed for disabled users, but we are trying to get it to a point where it becomes more useful for able-bodied users,” said Kumar.
Combining hand and eye input may be a more natural way of doing things than relying on gaze alone, according to Kumar. His system shows no visible cursor or other feedback; studies suggest such feedback distracts users and diminishes performance as they constantly attempt to control the cursor’s location.
Current eye-tracking hardware uses a high resolution camera and a collection of infrared LEDs to pick up the movement of the pupil and the reflection of the infrared light from the cornea.
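The pupil-and-corneal-reflection approach described above works because the infrared glint on the cornea stays roughly fixed while the pupil moves with the eye, so the pupil-to-glint vector indicates gaze direction. A minimal sketch of that idea follows; the linear calibration constants are invented for illustration, and real trackers fit higher-order per-user mappings.

```python
# Hedged sketch of pupil-center/corneal-reflection gaze estimation.
# The pupil-to-glint vector (in camera coordinates) is mapped to a
# screen point via a calibration. The linear gain/offset values below
# are illustrative assumptions, not real calibration data.

def gaze_point(pupil, glint, gain=(800.0, 600.0), offset=(512.0, 384.0)):
    """Map the pupil-glint vector to an on-screen point.

    pupil, glint: (x, y) feature positions detected in the camera image.
    gain, offset: hypothetical per-user calibration constants.
    """
    dx = pupil[0] - glint[0]  # horizontal pupil displacement from the glint
    dy = pupil[1] - glint[1]  # vertical displacement
    return (offset[0] + gain[0] * dx, offset[1] + gain[1] * dy)
```

When the pupil center coincides with the glint, the estimate falls at the calibrated screen center; as the pupil moves relative to the glint, the estimate shifts proportionally.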
This hardware is expensive, but as many of today’s monitors come with built-in cameras, the technology could soon become cheap enough to be widely available.