PHASE: PHysically-grounded Abstract Social Events for Machine Social Perception


Date Posted:  December 16, 2020
Date Recorded:  December 12, 2020
CBMM Speaker(s):  Aviv Netanyahu
  • SVRHM Workshop 2020
Description: 

Project website: https://www.tshu.io/PHASE/

The ability to perceive and reason about social interactions in the context of physical environments is core to human social intelligence and human-machine cooperation. However, no prior dataset or benchmark has systematically evaluated physically grounded perception of complex social interactions that go beyond short actions, such as high-fiving, or simple group activities, such as gathering. In this work, we create PHASE, a dataset of physically grounded abstract social events that resemble a wide range of real-life social interactions by including social concepts such as helping another agent. As a baseline model, we introduce a Bayesian inverse planning approach, which outperforms state-of-the-art feed-forward neural networks. We hope that PHASE can serve as a challenging benchmark for developing models that recognize complex social interactions.
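
To give a rough sense of what Bayesian inverse planning means here, the sketch below infers an agent's goal from an observed trajectory in a toy 1D grid world: each candidate goal is scored by how likely a noisily rational planner pursuing that goal would have produced the observed steps, and Bayes' rule turns those scores into a posterior over goals. This is only an illustrative sketch, not the PHASE baseline implementation; the grid world, the candidate goals, and the rationality parameter BETA are assumptions made for the example.

    # Toy Bayesian inverse planning: infer which goal an agent is pursuing
    # from its observed positions on a 10-cell line (illustrative only).
    import numpy as np

    BETA = 2.0                   # assumed Boltzmann rationality (higher = more rational agent)
    CANDIDATE_GOALS = [0, 9]     # hypothetical goal positions

    def step_log_likelihood(pos, next_pos, goal):
        """Log-probability of one observed step under a noisily rational agent
        that softmax-prefers actions reducing its distance to `goal`."""
        actions = [-1, 0, 1]
        # Utility of each action: negative resulting distance to the goal.
        utils = np.array([-abs(np.clip(pos + a, 0, 9) - goal) for a in actions])
        log_probs = BETA * utils - np.log(np.sum(np.exp(BETA * utils)))
        taken = actions.index(int(np.clip(next_pos, 0, 9) - pos))
        return log_probs[taken]

    def infer_goal(trajectory):
        """Posterior over candidate goals given observed positions,
        assuming a uniform prior over goals."""
        log_post = np.zeros(len(CANDIDATE_GOALS))
        for g_idx, goal in enumerate(CANDIDATE_GOALS):
            for pos, nxt in zip(trajectory[:-1], trajectory[1:]):
                log_post[g_idx] += step_log_likelihood(pos, nxt, goal)
        log_post -= np.max(log_post)          # numerical stability
        post = np.exp(log_post)
        return post / post.sum()

    # An agent starting at 5 and moving rightward is inferred to want goal 9:
    print(infer_goal([5, 6, 7, 8]))           # posterior heavily favors goal 9

The PHASE baseline applies the same idea to far richer physical scenes and social goals (e.g. helping or hindering another agent), where the "planner" must account for physics and multi-agent interaction rather than a single agent on a line.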