%0 Journal Article
%J Cognition
%D 2017
%T Predicting actions from subtle preparatory movements
%A Maryam Vaziri-Pashkam
%A Sarah Cormiea
%A Ken Nakayama
%K Action prediction
%K Action reading
%K Biological motion
%K Competitive interaction
%K Motor interaction
%X

To study how people anticipate others’ actions, we designed a competitive reaching task. Subjects faced each other, separated by a Plexiglas screen, and their finger movements in 3D space were recorded with sensors. The first subject (Attacker) was instructed to touch one of two horizontally arranged targets on the screen. The other subject (Blocker) had to touch the same target as quickly as possible. Average finger reaction times (fRTs) were fast, much faster than reactions to a dot moving on the screen in the same manner as the Attacker’s finger. This suggests the presence of subtle preparatory cues in other parts of the Attacker’s body. We also recorded videos of Attackers’ movements and had Blockers play against unedited videos as well as videos in which all preparatory cues had been removed by editing out the frames before the Attacker’s finger movement started. Blockers’ fRTs in response to the edited videos were significantly slower (∼90 ms). Also, reversing the preparatory movements in the videos tricked Blockers into choosing the incorrect target at the beginning of their movement. Next, we occluded various body parts of the Attacker and showed that fRTs slowed down only when most of the Attacker’s body was occluded. These results indicate that informative cues are widely distributed over the body and that Blockers can use any piece from a set of redundant cues for action prediction. Reaction times in each condition remained constant over the duration of the testing sessions, indicating a lack of learning during the experiment. These results suggest that during a dynamic two-person interaction, human subjects possess a remarkable, built-in action-reading capacity that allows them to predict others’ goals and respond efficiently in this competitive setting.

%B Cognition
%V 168
%P 65-75
%8 01/2017
%G eng
%U http://www.sciencedirect.com/science/article/pii/S0010027717301762
%R 10.1016/j.cognition.2017.06.014