A detailed description of this dataset, and of the experimental procedure used to obtain it, is provided in the following paper.
Please note that this dataset may only be used for research, educational, and non-commercial purposes. If you use it, please cite:
Dondi, P., Lê, H.N., Gentilini, R., Porta, M., Gaze-based biometric authentication using symmetric dynamic stimuli. Multimedia Tools and Applications, 85, 393 (2026), https://doi.org/10.1007/s11042-026-21622-x.
Six animations were used as stimuli (Vertical Single, Vertical Multiple, Horizontal Single,
Horizontal Multiple, Diagonal, and Radial), each lasting 10 seconds.
Thirty-six participants were involved; each performed the experimental test nine times,
subdivided into three sessions (each session held on a different day).
In each session, the participant repeated the test three times ("trials").
In each trial, the six animations were displayed in random order, one after the other.
Before each animation, a small white cross was displayed in the center of the screen for
two seconds.
No eye tracker calibration procedure was performed.
A GazePoint GP3 HD eye tracker (150 Hz sampling rate) was used to record gaze data. The meaning of the columns in the dataset's .csv files is described in the document OPEN GAZE API BY GAZEPOINT.
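As a minimal sketch, one trial's recording can be loaded with pandas. The column names used here (TIME, FPOGX, FPOGY, FPOGV) follow the Open Gaze API convention but are an assumption; consult the OPEN GAZE API BY GAZEPOINT document for the exact set of columns present in these files.

```python
import io
import pandas as pd

# Synthetic stand-in for one of the dataset's .csv files; in practice,
# pass the path of a trial file to pd.read_csv instead.
sample_csv = io.StringIO(
    "TIME,FPOGX,FPOGY,FPOGV\n"
    "0.000,0.512,0.498,1\n"
    "0.007,0.515,0.501,1\n"
    "0.013,0.000,0.000,0\n"   # FPOGV == 0 flags an invalid gaze sample
)

df = pd.read_csv(sample_csv)

# Keep only samples flagged as valid before any analysis.
valid = df[df["FPOGV"] == 1]
print(len(valid))  # number of valid gaze samples
```

Filtering on the validity flag before analysis is a common first step with Gazepoint recordings, since samples lost to blinks or tracking dropouts are still written to the file.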
The dataset contains one file per trial and per animation, for a total of 1,944 files
(36 participants × 3 sessions × 3 trials per session × 6 stimuli).
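The file count stated above follows directly from the experimental design; as a sanity check, the combinations can be enumerated (the identifier ranges and stimulus labels below are illustrative, not the dataset's actual file naming):

```python
from itertools import product

participants = range(1, 37)   # 36 participants
sessions = range(1, 4)        # 3 sessions per participant
trials = range(1, 4)          # 3 trials per session
stimuli = ["VerticalSingle", "VerticalMultiple", "HorizontalSingle",
           "HorizontalMultiple", "Diagonal", "Radial"]

# One file exists for each (participant, session, trial, stimulus) tuple.
combinations = list(product(participants, sessions, trials, stimuli))
print(len(combinations))  # → 1944
```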
The file name structure is pptttestoanimation.csv, where:
>>> Download the dataset (154 MB)