We use an eye tracker - a device for measuring eye positions and eye movements, and thus detecting the user's gaze direction - to design new interfaces.
Interfaces operated through the eyes are of great help to people with severe disabilities, allowing them to use their gaze to select, or even move, objects on the screen, as well as to write. Eye tracking, however, is also studied and applied in several contexts beyond that of an input channel for interfaces. In our research we consider it both for implementing explicit and implicit interfaces and as a helpful means for evaluating web sites, usability issues, information presentation modes and visual interactions in general - useful, for example, for more engaging and interactive museum experiences. We are also studying the effectiveness of existing and new RSVP (Rapid Serial Visual Presentation) image visualization methods, which involve intense eye activity. Soft biometrics, automotive, assistive and persuasive technologies are other application fields.
In summary, we use eye tracking:
- as an assistive technology or an additional input channel (e.g., to write, surf the web, play music, etc.);
- to study driver performance, using a wearable eye tracker, in collaboration with the Automotive Safety Centre;
- to study the gaze in order to analyze and understand the user's behavior and cognitive state while he or she interacts with different kinds of visual stimuli;
- to identify or verify the identity of people from the way they look at specific stimuli (e.g., faces, shapes) for gaze-based soft biometrics.
We have developed the following Eye Tracking Applications:
- Eye-S, a system that allows input to be provided to the computer through a pure eye-based approach;
- Netytar, a gaze-based Virtual Digital Musical Instrument (Virtual DMI), usable by both motor-impaired and able-bodied people, controlled through an eye tracker and a "switch", to play music with the eyes;
- Gaze-Based Web Browser;
- e5Learning, an e-learning environment where eye tracking is used to observe user behavior, so as to adapt content presentation in real-time.
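Gaze-driven applications like those above must turn a noisy stream of gaze coordinates into discrete commands. A common approach is dwell-time selection: a target is activated when the gaze rests on it long enough. The sketch below is a generic, hypothetical illustration of this idea (it is not the actual Eye-S or Netytar algorithm); the target layout, the 800 ms threshold, and all names are assumptions for the example.

```python
# Minimal dwell-time selection sketch (hypothetical illustration, not the
# actual algorithm used by Eye-S or Netytar).
# A target is "selected" when the gaze stays inside its rectangle for at
# least `dwell_ms` milliseconds.

class DwellSelector:
    def __init__(self, targets, dwell_ms=800):
        # targets: dict mapping target name -> (x, y, width, height) rectangle
        self.targets = targets
        self.dwell_ms = dwell_ms          # assumed dwell threshold
        self.current = None               # target currently under the gaze
        self.enter_time = None            # timestamp when the gaze entered it

    def _hit(self, x, y):
        # Return the name of the target containing (x, y), or None.
        for name, (tx, ty, w, h) in self.targets.items():
            if tx <= x < tx + w and ty <= y < ty + h:
                return name
        return None

    def update(self, x, y, t_ms):
        """Feed one gaze sample; return a target name on selection, else None."""
        hit = self._hit(x, y)
        if hit != self.current:
            # Gaze moved to a different target (or off all targets): restart timer.
            self.current, self.enter_time = hit, t_ms
            return None
        if hit is not None and t_ms - self.enter_time >= self.dwell_ms:
            self.enter_time = t_ms        # re-arm to avoid repeated triggers
            return hit
        return None


# Example: one 100x100 "play" button at the screen origin.
selector = DwellSelector({"play": (0, 0, 100, 100)})
selector.update(50, 50, 0)      # gaze enters the button -> None
selector.update(60, 60, 400)    # still dwelling -> None
selector.update(55, 55, 800)    # dwell threshold reached -> "play"
```

Real systems typically add gaze-sample smoothing and visual feedback (e.g., a shrinking circle) during the dwell, and tune the threshold per user to balance speed against unwanted activations (the "Midas touch" problem).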
Our latest journal publications on eye tracking:
- Piercarlo Dondi, Samuel Sapuppo, Marco Porta (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed, International Journal of Human-Computer Studies, Volume 184, 103204, ISSN 1071-5819, DOI:10.1016/j.ijhcs.2023.103204.
- Piercarlo Dondi, Marco Porta (2023). Gaze-Based Human–Computer Interaction for Museums and Exhibitions: Technologies, Applications and Future Perspectives, in Electronics, 12(14):3064, DOI:10.3390/electronics12143064.
- Marco Porta, Piercarlo Dondi, Alice Pianetta, Virginio Cantoni (2022). SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit, in IEEE Transactions on Human-Machine Systems, vol. 52, no. 2, pp. 312-323, April 2022, DOI:10.1109/THMS.2021.3123202.
- Piercarlo Dondi, Marco Porta, Angelo Donvito, Giovanni Volpe (2022). A gaze-based interactive system to explore artwork imagery, in Journal on Multimodal User Interfaces, 16, pp. 55-67, Springer, DOI:10.1007/s12193-021-00373-z.
- Marco Porta, Piercarlo Dondi, Nicola Zangrandi, Luca Lombardi (2022). Gaze-Based Biometrics From Free Observation of Moving Elements, in IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 4, no. 1, pp. 85-96, DOI:10.1109/TBIOM.2021.3130798.
See also the full list of our Publications.