This record contains the eye and scene camera video files from our gaze tracking experiments, performed in Autumn 2016.
Nineteen subjects participated. Each subject sat in a chair in a dimly lit room and viewed a display while wearing our self-made gaze tracking glasses, whose camera streams were recorded. The subjects were asked to sit relaxed and to hold their head still during the measurements. Each subject viewed three different displays at three viewing distances (24” monitor at 60 cm, 46” HD TV at 1.2 m, and projector screen at 3.0 m), chosen so that the stimulus resolution was adequate at each distance and the viewed stimuli spanned a similar visual angle. A test phase was performed at each of the three viewing distances. In addition, one of the viewing distances began with a calibration phase. The presentation order of the displays and the choice of calibration distance were permuted between the participants, and each calibration distance was used equally often.
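As a quick sanity check of the similar-visual-angle claim, the Python sketch below computes the horizontal angle subtended by the two smaller displays. The 16:9 aspect ratio is an assumption; panel aspect ratios and the projector screen size are not given in this record.

    import math

    def display_width_cm(diagonal_inch: float, aspect: float = 16 / 9) -> float:
        """Horizontal width of a display given its diagonal, assuming a 16:9 panel."""
        diagonal_cm = diagonal_inch * 2.54
        return diagonal_cm * aspect / math.hypot(aspect, 1.0)

    def visual_angle_deg(width_cm: float, distance_cm: float) -> float:
        """Full horizontal visual angle subtended by a flat surface of the given width."""
        return math.degrees(2 * math.atan2(width_cm / 2, distance_cm))

    # 24" monitor at 60 cm and 46" TV at 1.2 m (projector screen size not given)
    for name, diag, dist in [('24" monitor', 24, 60), ('46" TV', 46, 120)]:
        w = display_width_cm(diag)
        print(f"{name}: width {w:.1f} cm, visual angle {visual_angle_deg(w, dist):.1f} deg")

Under these assumptions the two displays subtend roughly 48 and 46 degrees, consistent with the similar-visual-angle design.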
The calibration procedure cycled a stimulus dot through nine different locations on the display at the selected calibration distance. The duration of each fixation stimulus was jittered between 2–3 s to prevent anticipatory gaze shifts. To reduce blink artifacts in the evaluation data, after every third location the dot changed from black to gray for three seconds, signaling the subject to blink freely; blinking was to be avoided during the rest of the sequence. The calibration phase also included a 20-second free-viewing task, which was performed first.
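For illustration, here is a minimal Python sketch of the calibration schedule described above. The 3 × 3 arrangement of the nine locations is an assumption; only their count is stated in this record.

    import random

    # Nine calibration locations in normalized coordinates (3 x 3 layout assumed).
    GRID = [(x, y) for y in (-1, 0, 1) for x in (-1, 0, 1)]

    def calibration_schedule(seed: int = 0):
        """Yield (position, color, duration_s) events for the nine-point calibration."""
        rng = random.Random(seed)
        for i, pos in enumerate(GRID, start=1):
            # Fixation duration jittered between 2 and 3 s against anticipatory shifts.
            yield pos, "black", rng.uniform(2.0, 3.0)
            if i % 3 == 0:
                # After every third location the dot turns gray for 3 s: blink freely.
                yield pos, "gray", 3.0

    for event in calibration_schedule():
        print(event)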
The test phase comprised two tasks: a saccade task and a smooth pursuit task. The saccade task included 25 stimulus locations forming a regular 5 × 5 grid, presented in a random order that was identical across subjects. Again, blinking was discouraged except during a pause after every third stimulus location, and the sequence ended with a blink pause. The smooth pursuit task presented a dot moving at a constant speed of 3.0 degrees per second; at each corner of its path, the dot stopped for a blink pause.
In each phase and at each distance, the dot was one degree of visual angle in size, and the dot grid spanned 24 degrees in both directions. Note that the scene camera image is upside down due to the geometry of the glasses frames used.
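The numbers above imply the following stimulus geometry, sketched in Python; the rectangular pursuit path around the grid area is an assumption inferred from the corner pauses.

    # Stimulus geometry in degrees of visual angle. The 5 x 5 saccade grid spans
    # 24 deg in both directions, i.e. 6-deg spacing from -12 to +12.
    saccade_grid = [(x, y) for y in range(-12, 13, 6) for x in range(-12, 13, 6)]
    assert len(saccade_grid) == 25

    # If the pursuit dot traces the perimeter of the same 24 x 24 deg area (an
    # assumption; this record only says it stops in each corner), one lap at
    # 3.0 deg/s takes 4 * 24 / 3.0 = 32 s of motion, excluding the blink pauses.
    lap_duration_s = 4 * 24 / 3.0
    print(len(saccade_grid), "grid points;", lap_duration_s, "s per pursuit lap")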
Our gaze tracking algorithm and our results on this dataset are described in the Journal of Eye Movement Research (2017), where you will also find a more detailed description of the experimental setup. If you publish results based on this dataset, please cite that reference.
The files are named as follows: sg[N_subject]_phase[N_phase]_cam[N_cam].mp4, where [N_subject] is the number of the subject (1, ..., 19), [N_phase] is the number of the phase (1 for monitor, 2 for TV, and 3 for projector screen), and [N_cam] is the number of the camera (1 and 2 for the left and right eye cameras, 3 for the scene camera). Due to human error, six files are missing: sg11_phase2_cam*.mp4 and sg13_phase1_cam*.mp4 (all three camera files for each of these two recordings).
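For convenience, the following Python sketch enumerates the expected file names and lists any that are absent from a local copy of the record. The two-digit zero-padding of the subject number is inferred from the file names in this record, and the data directory path is a placeholder.

    from pathlib import Path

    DATA_DIR = Path("data")  # placeholder: location of the downloaded record

    expected = [
        f"sg{subject:02d}_phase{phase}_cam{cam}.mp4"
        for subject in range(1, 20)   # subjects 1..19
        for phase in (1, 2, 3)        # 1 = monitor, 2 = TV, 3 = projector screen
        for cam in (1, 2, 3)          # 1/2 = left/right eye cameras, 3 = scene camera
    ]

    missing = [name for name in expected if not (DATA_DIR / name).exists()]
    print(f"{len(missing)} of {len(expected)} files missing:", *missing, sep="\n")

On a complete copy, this should report exactly the six files noted above.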
The record also includes the gaze videos for participant no. 1 as processed by our gaze tracking algorithm. These video files are named ooga_gazevideo_sg01_phase*.mp4.