PHASE: PHysically-grounded Abstract Social Events for Machine Social Perception

Title: PHASE: PHysically-grounded Abstract Social Events for Machine Social Perception
Publication Type: CBMM Memos
Year of Publication: 2021
Authors: Netanyahu, A, Shu, T, Katz, B, Barbu, A, Tenenbaum, JB
Number: 123
Date Published: 03/2021
Abstract

The ability to perceive and reason about social interactions in the context of physical environments
is core to human social intelligence and human-machine cooperation. However, no prior dataset or
benchmark has systematically evaluated physically grounded perception of complex social interactions
that go beyond short actions, such as high-fiving, or simple group activities, such as gathering. In this
work, we create a dataset of physically-grounded abstract social events, PHASE, that resemble a wide
range of real-life social interactions by including social concepts such as helping another agent. PHASE
consists of 2D animations of pairs of agents moving in a continuous space, generated procedurally
using a physics engine and a hierarchical planner. Agents have a limited field of view and can interact
with multiple objects in an environment that contains multiple landmarks and obstacles. Using PHASE,
we design a social recognition task and a social prediction task. PHASE is validated with human
experiments demonstrating that humans perceive rich interactions in the social events, and that the
simulated agents behave similarly to humans. As a baseline model, we introduce a Bayesian inverse
planning approach, SIMPLE (SIMulation, Planning and Local Estimation), which outperforms state-of-
the-art feedforward neural networks. We hope that PHASE can serve as a difficult new challenge for
developing models that can recognize complex social interactions.
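
The SIMPLE baseline frames social perception as Bayesian inverse planning: a candidate goal is scored by how likely a noisily rational agent pursuing that goal would be to produce the observed trajectory, and Bayes' rule turns those likelihoods into a posterior over goals. The sketch below illustrates the idea in its simplest form, for a single agent and point goals; the 16-direction candidate set, the Boltzmann step likelihood, and the rationality parameter beta are illustrative assumptions for this sketch, not the actual SIMPLE implementation, which couples a physics simulator with the hierarchical planner used to generate PHASE.

    # Minimal sketch of Bayesian inverse planning for goal recognition.
    # All modeling choices here (candidate step directions, Boltzmann
    # step likelihood, beta) are simplifying assumptions for illustration.
    import numpy as np
    from scipy.special import logsumexp

    def step_log_likelihood(pos, next_pos, goal, beta=2.0):
        """Log-likelihood of one observed step under a Boltzmann-rational
        agent that prefers unit steps reducing its distance to `goal`."""
        angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
        # Candidate unit steps the agent could have taken from `pos`.
        candidates = pos + np.stack([np.cos(angles), np.sin(angles)], axis=1)
        utilities = -np.linalg.norm(candidates - goal, axis=1)  # closer is better
        log_Z = logsumexp(beta * utilities)                     # normalizer over candidates
        observed_utility = -np.linalg.norm(next_pos - goal)
        return beta * observed_utility - log_Z

    def goal_posterior(trajectory, goals, prior=None, beta=2.0):
        """Posterior P(goal | trajectory) ∝ P(trajectory | goal) P(goal)."""
        goals = np.asarray(goals, dtype=float)
        prior = np.full(len(goals), 1.0 / len(goals)) if prior is None else np.asarray(prior, float)
        log_post = np.log(prior)
        for pos, next_pos in zip(trajectory[:-1], trajectory[1:]):
            log_post += np.array([
                step_log_likelihood(np.asarray(pos, float), np.asarray(next_pos, float), g, beta)
                for g in goals
            ])
        log_post -= logsumexp(log_post)
        return np.exp(log_post)

    if __name__ == "__main__":
        # Two hypothetical landmarks; the agent walks roughly toward the first.
        goals = [(10.0, 0.0), (0.0, 10.0)]
        trajectory = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.1), (3.0, 0.3), (4.0, 0.2)]
        print(goal_posterior(trajectory, goals))  # mass concentrates on goal 0

In the full benchmark the hypotheses would range over joint goals and social relations between the two agents rather than single target points, but the inference principle is the same.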

DSpace@MIT: https://hdl.handle.net/1721.1/141341

Download: CBMM-Memo-123.pdf
CBMM Memo No: 123


CBMM Relationship: 

  • CBMM Funded