%0 Generic %D 2016 %T Measuring and modeling the perception of natural and unconstrained gaze in humans and machines %A Daniel Harari %A Tao Gao %A Nancy Kanwisher %A Joshua B. Tenenbaum %A Shimon Ullman %K computational evaluation %K computational modeling %K Computer vision %K empirical evaluation %K estimation of gaze direction %K Gaze perception %K joint attention %K Machine Learning %X

Humans are remarkably adept at interpreting the gaze direction of other individuals in their surroundings. This skill is at the core of the ability to engage in joint visual attention, which is essential for establishing social interactions. How accurate are humans in determining the gaze direction of others in lifelike scenes, when they can move their heads and eyes freely, and what are the sources of information for the underlying perceptual processes? These questions pose a challenge from both empirical and computational perspectives, due to the complexity of the visual input in real-life situations. Here we empirically measure human accuracy in perceiving the gaze direction of others in lifelike scenes, and computationally study the sources of information and representations underlying this cognitive capacity. We show that humans perform better in face-to-face conditions than in recorded conditions, and that this advantage is not due to the availability of input dynamics. We further show that humans still perform well when only the eye region, rather than the whole face, is visible. We develop a computational model that replicates the pattern of human performance, including the finding that the eye region on its own contains the information required for estimating both head orientation and gaze direction. Consistent with neurophysiological findings on task-specific face regions in the brain, the learned computational representations reproduce perceptual effects such as the Wollaston illusion when trained to estimate gaze direction, but not when trained to recognize objects or faces.

%8 11/2016 %1

arXiv:1611.09819

%2

http://hdl.handle.net/1721.1/105477