Saliency3D: A 3D Saliency Dataset Collected on Screen (Dataset and Experiment Application)

While visual saliency has recently been studied in 3D, the experimental setup for collecting 3D saliency data can be expensive and cumbersome. To address this challenge, we propose a novel experimental design that uses an eye tracker on a screen to collect 3D saliency data, reducing the cost and complexity of 3D saliency dataset collection. We first collect gaze data on a screen and then map it to 3D saliency data through a perspective transformation. Using this method, we collect a 3D saliency dataset (49,276 fixations) of 10 participants viewing 16 objects. Moreover, we examine viewing preferences for the objects and discuss our findings. Our results indicate potential preferred viewing directions and a correlation between salient features and the variation in viewing directions. The files of this dataset are documented in README.md.
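The core step described above, mapping gaze recorded on a 2D screen to 3D saliency through a perspective transformation, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the camera parameters, function names, and the unit sphere standing in for a displayed object are assumptions made here for illustration only.

```python
# Minimal sketch (not the dataset's exact pipeline): map a 2D on-screen gaze
# sample to a 3D point on the displayed object by inverting the perspective
# projection and casting a ray from the virtual camera.
# The camera setup and the unit-sphere object stand-in are illustrative assumptions.
import numpy as np

def unproject_gaze(u, v, width, height, fov_y_deg, cam_pos, cam_rot):
    """Turn a screen-space gaze sample (u, v) in pixels into a world-space ray."""
    # Normalized device coordinates in [-1, 1]; screen y grows downward, so flip it.
    ndc_x = 2.0 * u / width - 1.0
    ndc_y = 1.0 - 2.0 * v / height
    # Half-extent of the image plane at depth 1 for a perspective camera.
    tan_half_fov = np.tan(np.radians(fov_y_deg) / 2.0)
    aspect = width / height
    dir_cam = np.array([ndc_x * aspect * tan_half_fov,
                        ndc_y * tan_half_fov,
                        -1.0])                      # camera looks down -z
    dir_world = cam_rot @ dir_cam                   # rotate ray into world space
    return cam_pos, dir_world / np.linalg.norm(dir_world)

def ray_sphere_hit(origin, direction, center, radius):
    """Closest intersection of the gaze ray with a sphere (object stand-in)."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - c
    if disc < 0:
        return None                                 # gaze missed the object
    t = -b - np.sqrt(disc)
    return origin + t * direction if t > 0 else None

# Example: a fixation at pixel (960, 540) on a 1920x1080 screen; the camera sits
# 2 units back on +z looking at a unit sphere at the origin.
cam_pos = np.array([0.0, 0.0, 2.0])
cam_rot = np.eye(3)                                 # identity: camera already faces -z
origin, direction = unproject_gaze(960, 540, 1920, 1080, 45.0, cam_pos, cam_rot)
hit = ray_sphere_hit(origin, direction, center=np.zeros(3), radius=1.0)
print(hit)                                          # ~[0, 0, 1]: the 3D fixation point
```

In an actual setup the same unprojection would be applied per rendered view, and the gaze ray would be intersected with the object's triangle mesh (e.g., via a ray-mesh intersection routine) rather than an analytic sphere.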

Identifier
DOI https://doi.org/10.18419/darus-4101
Related Identifier IsCitedBy https://doi.org/10.1145/3649902.3653350
Metadata Access https://darus.uni-stuttgart.de/oai?verb=GetRecord&metadataPrefix=oai_datacite&identifier=doi:10.18419/darus-4101
Provenance
Creator Wang, Yao; Bulling, Andreas
Publisher DaRUS
Contributor Bulling, Andreas; Dai, Qi; Wang, Yao
Publication Year 2024
Funding Reference DFG 251654672
Rights info:eu-repo/semantics/openAccess
OpenAccess true
Contact Bulling, Andreas (Universität Stuttgart)
Representation
Resource Type eye gaze; Dataset
Format application/zip; text/markdown
Size 426652847 bytes; 1804 bytes
Version 1.0
Discipline Other