CVMLab’s studies on Human-Computer Interaction (HCI) exploit different devices to design new interfaces that satisfy users’ needs. In particular, this activity currently focuses on gaze interaction through an eye tracker, a device that measures eye positions and eye movements and thus detects the user’s gaze direction.
Interfaces operated through the eyes are of great help for people with severe disabilities, allowing them to use their gaze to identify, or even move, objects on the screen, as well as to write. However, eye tracking is also studied and applied in several contexts beyond its role as an input means for interfaces. In our research we consider it both for implementing explicit/implicit interfaces and as a helpful means for evaluating web sites, usability issues, information presentation modes and visual interactions in general (useful, for example, for interactive and more engaging museum experiences). We are also studying the effectiveness of existing and new RSVP (Rapid Serial Visual Presentation) image visualization methods, which involve intense eye activity. Soft Biometrics, Automotive, Assistive and Persuasive Technologies are further fields of application.
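As a minimal illustration of what an RSVP presentation involves, the sketch below builds a fixed-rate schedule of image onsets, each image replacing the previous one at the same screen location. The function name, rate and data format are illustrative assumptions, not the methods under study.

```python
# Hypothetical sketch of an RSVP timing schedule: images are presented
# serially, each for a fixed interval. Rate and names are assumptions.

def rsvp_schedule(images, rate_hz=10.0):
    """Return (image, onset_seconds) pairs for a serial presentation
    at the given rate (images per second)."""
    interval = 1.0 / rate_hz
    return [(img, k * interval) for k, img in enumerate(images)]
```

At 10 Hz, for instance, each image would appear 100 ms after the previous one, which is what makes RSVP demanding for the visual system.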
Summarizing, we are using eye tracking:
- As an assistive technology or an additional input channel (e.g., to write, surf the web, play music, etc.)
- To study driver performance with a wearable eye tracker, in collaboration with the Automotive Safety Centre
- To study the gaze in order to analyze and understand the user’s behavior and cognitive state while interacting with different kinds of visual stimuli
- To identify or verify the identity of people from the way they look at specific stimuli (e.g., faces, shapes) for gaze-based soft biometrics
We have developed the following eye tracking applications:
- Eye-S, a system that allows input to be provided to the computer through a pure eye-based approach
- Netytar, a gaze-based Virtual Digital Musical Instrument (Virtual DMI), usable by both motor-impaired and able-bodied people, controlled through an eye tracker and a “switch”, to play music with the eyes
- A Gaze-Based Web Browser
- e5Learning, an e-learning environment where eye tracking is used to observe user behavior, so as to adapt content presentation in real-time
We have also developed a gaze-controlled system for hands-free interaction with artworks for a satellite event of Expo 2015, the temporary exhibition “1525-2015. Pavia, the Battle, the Future. Nothing was the same again”, held at the Visconti Castle in Pavia. Without mouse or keyboard, using only their eyes, visitors could explore seven famous tapestries depicting the Battle of Pavia of 1525: they could zoom and scroll, and view information on specific subjects of each tapestry simply by looking at them. At the end of the exploration, visitors could also watch their gaze replay, a movie showing the sequence of fixations on the areas of the tapestry where their eyes had focused.
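A gaze replay like the one above is essentially a sequence of detected fixations. As an illustration of how fixations can be extracted from raw gaze samples, here is a minimal sketch of the classic dispersion-threshold (I-DT) approach; the sample format, threshold values and function name are assumptions for illustration, not the application’s actual implementation.

```python
# Minimal sketch of dispersion-based fixation detection (I-DT).
# A fixation is a run of consecutive gaze samples whose spatial
# dispersion stays below a threshold for a minimum duration.
# Thresholds and data format are illustrative assumptions.

def detect_fixations(samples, max_dispersion=25.0, min_duration=0.1):
    """samples: list of (x, y, t) gaze points, t in seconds.
    Returns fixations as (centroid_x, centroid_y, duration)."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while points stay within the dispersion limit.
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion:
                break
            j += 1
        duration = samples[j][2] - samples[i][2]
        if duration >= min_duration and j > i:
            window = samples[i:j + 1]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((cx, cy, duration))
            i = j + 1  # continue after the fixation
        else:
            i += 1  # too short: treat as saccade/noise and move on
    return fixations
```

Playing back the returned centroids in order, each held for its duration, yields exactly the kind of fixation movie the visitors could review.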
Latest publications
- Piercarlo Dondi, Samuel Sapuppo, Marco Porta (2024). Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed, International Journal of Human-Computer Studies, Volume 184, 103204, ISSN 1071-5819, DOI:10.1016/j.ijhcs.2023.103204.
- Riccardo Crescenti, Piercarlo Dondi, Marco Porta, Cristiano Resta, Gianfranco Rotondo (2023). An Eye Tracking Based Evaluation Protocol and Method for In-Vehicle Infotainment Systems, IEEE 28th International Conference on Emerging Technologies and Factory Automation (ETFA), Sinaia, Romania, 2023, pp. 1-4, DOI:10.1109/ETFA54631.2023.10275542.
- Le Hoang Nam, Marco Porta (2023). A Study on Eye Tracking for Mobile Devices Using Deep Learning, in Proceedings of the 24th International Conference on Computer Systems and Technologies - CompSysTech’23, University of Ruse (Bulgaria), 16-17 June 2023, Session B: Artificial Intelligence and Machine Learning, pp. 65-69, Association for Computing Machinery, New York, NY, United States, DOI:10.1145/3606305.3606326.
- Aleksandra Klasnja-Milicevic, Mirjana Ivanovic, Marco Porta (2023). A Gaze-Based Intelligent Textbook Manager, in S.C. Mukhopadhyay, S.N.A. Senanayake, P.C. Withana (eds), Proceedings of CITISIA 2022 (7th International Conference on Innovative Technologies in Intelligent Systems and Industrial Applications), online, 16-18 November 2022, Lecture Notes in Electrical Engineering, vol 1029, 2023, Springer, Cham, DOI:10.1007/978-3-031-29078-7_31.
See also a full list of our Publications.