VIP: A unifying framework for eye-gaze research

Title: VIP: A unifying framework for eye-gaze research
Publication Type: Dataset
Year of Publication: 2013
Authors: Ma, K-T, Sim, T, Kankanhalli, M

We have collected the first fixation dataset that captures all three VIP factors.

The images were selected from the NUSEF dataset, which contains both neutral and affective images. Out of the 758 NUSEF images, 150 were randomly selected. 75 subjects were recruited from a mix of undergraduates, postgraduates and working adults. Male and female subjects were recruited separately to ensure an even gender distribution. They were tasked to view the 150 images in either a free-viewing (i.e., without an assigned task) or an anomaly-detection setting. Each image was displayed for 5 seconds, followed by 2 seconds of a gray screen. The images were displayed in random order. Eye-gaze data was recorded at 120 Hz with a binocular infra-red remote eye-tracking device (SMI RED 250). The subjects were seated 50 centimeters from a 22-inch LCD monitor with a resolution of 1680x1050. This setup is similar to others used in eye-gaze research.
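From the figures above (22-inch diagonal, 1680x1050 pixels, 50 cm viewing distance) the angular resolution of the setup can be estimated. The sketch below assumes the quoted diagonal is the full visible display area and that pixels are square; these assumptions are ours, not stated in the dataset description.

```python
import math

# Approximate angular resolution of the described setup:
# 22-inch (diagonal) monitor, 1680x1050 pixels, viewed from 50 cm.
# Assumes the quoted diagonal spans the visible area and pixels are square.
diag_inch = 22.0
res_x, res_y = 1680, 1050
view_dist_cm = 50.0

diag_px = math.hypot(res_x, res_y)                  # diagonal in pixels
screen_w_cm = diag_inch * 2.54 * res_x / diag_px    # physical screen width
px_per_cm = res_x / screen_w_cm
cm_per_deg = 2 * view_dist_cm * math.tan(math.radians(0.5))
px_per_deg = px_per_cm * cm_per_deg                 # roughly 31 px per degree
```

At this distance one degree of visual angle covers about 31 pixels, which is typical for remote eye-tracking studies.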

Before the start of the viewing experiment, the subjects also provided their demographic data: gender, age group, ethnicity, religion, field of study/work, highest education qualification, income group, expenditure group and nationality. Three personality-type questions were posed, based on Jung's psychological types. The recorded eye-gaze data was preprocessed with the SMI SDK to extract fixations from the preferred eye as chosen by each subject.
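The fixation extraction here relies on the proprietary SMI SDK. For readers without access to it, a common alternative is a dispersion-threshold (I-DT) detector; the minimal sketch below is our own illustration, not the SDK's algorithm, and its thresholds (dispersion limit, minimum duration) are generic defaults. At the 120 Hz sampling rate used here, a 100 ms minimum duration corresponds to 12 samples.

```python
# Minimal dispersion-threshold (I-DT) fixation detector -- a sketch only,
# not the SMI SDK's event detection. Thresholds are illustrative defaults.

def _dispersion(window):
    """Dispersion of a window of (x, y) points: x-range plus y-range."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion, min_samples):
    """samples: list of (x, y) gaze points in a common unit (e.g. pixels).
    Returns a list of (centroid_x, centroid_y, start_idx, end_idx)."""
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while j < n and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), i, j - 1))
            i = j
        else:
            i += 1
    return fixations
```

With 120 Hz data, `min_samples=12` enforces the 100 ms minimum; the dispersion limit would typically be set to about one degree of visual angle in pixels.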

Download and copyright

The VIP dataset can be downloaded as a single zip file. The dataset is available for research purposes only. By downloading or using the dataset, you are deemed to agree to its terms and conditions.

If you are using this dataset, please cite:

A Unifying Framework for Computational Eye-Gaze Research.
Keng-Teck Ma, Terence Sim and Mohan Kankanhalli.
4th International Workshop on Human Behavior Understanding. Barcelona, Spain, 2013. [pdf]

Citation Key: 437
