PHASE: PHysically-grounded Abstract Social Events for Machine Social Perception

Title: PHASE: PHysically-grounded Abstract Social Events for Machine Social Perception
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Netanyahu, A, Shu, T, Katz, B, Barbu, A, Tenenbaum, JB
Conference Name: Shared Visual Representations in Human and Machine Intelligence (SVRHM) workshop at NeurIPS 2020
Date Published: 12/2020
Abstract

The ability to perceive and reason about social interactions in the context of physical environments is core to human social intelligence and human-machine cooperation. However, no prior dataset or benchmark has systematically evaluated physically grounded perception of complex social interactions that go beyond short actions, such as high-fiving, or simple group activities, such as gathering. In this work, we create a dataset of physically-grounded abstract social events, PHASE, that resemble a wide range of real-life social interactions by including social concepts such as helping another agent. PHASE consists of 2D animations of pairs of agents moving in a continuous space generated procedurally using a physics engine and a hierarchical planner. Agents have a limited field of view, and can interact with multiple objects, in an environment that has multiple landmarks and obstacles. Using PHASE, we design a social recognition task and a social prediction task. PHASE is validated with human experiments demonstrating that humans perceive rich interactions in the social events, and that the simulated agents behave similarly to humans. As a baseline model, we introduce a Bayesian inverse planning approach, SIMPLE (SIMulation, Planning and Local Estimation), which outperforms state-of-the-art feed-forward neural networks. We hope that PHASE can serve as a difficult new challenge for developing new models that can recognize complex social interactions.

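The abstract describes SIMPLE as a Bayesian inverse planning baseline. As a rough illustration of the general idea behind Bayesian inverse planning (not the paper's actual implementation), the Python sketch below infers a posterior over an agent's candidate social goals by comparing an observed trajectory against planner rollouts under a noisy-rational likelihood. The goal list, the simulate_planner stand-in, and the rationality parameter beta are hypothetical placeholders, not taken from the paper.

    import numpy as np

    # Hypothetical goal set for illustration; the paper's actual goal space,
    # hierarchical planner, and physics simulator are not reproduced here.
    GOALS = ["help_other", "hinder_other", "reach_landmark", "avoid_other"]

    def trajectory_likelihood(observed_traj, goal, simulate_planner, beta=2.0):
        # Noisy-rational (Boltzmann) likelihood: the closer the observed trajectory
        # is to the planner's rollout for this goal, the more likely the goal.
        predicted_traj = simulate_planner(goal)                     # stand-in planner rollout
        deviation = np.linalg.norm(observed_traj - predicted_traj)  # distance from prediction
        return np.exp(-beta * deviation)

    def infer_goals(observed_traj, simulate_planner, prior=None):
        # Bayesian inverse planning: P(goal | trajectory) is proportional to
        # P(trajectory | goal) * P(goal), normalized over the candidate goals.
        prior = prior or {g: 1.0 / len(GOALS) for g in GOALS}
        scores = {g: trajectory_likelihood(observed_traj, g, simulate_planner) * prior[g]
                  for g in GOALS}
        total = sum(scores.values())
        return {g: s / total for g, s in scores.items()}

Used as infer_goals(observed_traj, rollout_fn), the returned dictionary ranks candidate social goals by posterior probability; the paper's SIMPLE model couples this style of inference with simulation and local estimation in the physics-grounded PHASE environment.
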
URL: https://openreview.net/forum?id=_bokm801zhx

Associated Module: 

CBMM Relationship: 

  • CBMM Funded