%0 Journal Article
%J Cognition
%D 2017
%T Predicting actions from subtle preparatory movements
%A Maryam Vaziri-Pashkam
%A Sarah Cormiea
%A Ken Nakayama
%K Action prediction
%K Action reading
%K Biological motion
%K Competitive interaction
%K Motor interaction
%X

To study how people anticipate others’ actions, we designed a competitive reaching task. Subjects faced each other separated by a Plexiglas screen, and their finger movements in 3D space were recorded with sensors. The first subject (Attacker) was instructed to touch one of two horizontally arranged targets on the screen. The other subject (Blocker) touched the same target as quickly as possible. Average finger reaction times (fRTs) were fast, much faster than reactions to a dot moving on the screen in the same manner as the Attacker’s finger. This suggests the presence of subtle preparatory cues in other parts of the Attacker’s body. We also recorded videos of Attackers’ movements and had Blockers play against unedited videos as well as videos with all preparatory cues removed by editing out the frames before the Attacker’s finger movement started. Blockers’ fRTs in response to the edited videos were significantly slower (∼90 ms). Also, reversing the preparatory movements in the videos tricked the Blockers into choosing the incorrect target at the beginning of their movement. Next, we occluded various body parts of the Attacker and showed that fRTs slow down only when most of the Attacker’s body is occluded. These results indicate that informative cues are widely distributed over the body and that Blockers can use any piece from a set of redundant cues for action prediction. Reaction times in each condition remained constant over the duration of the testing sessions, indicating a lack of learning during the experiment. These results suggest that during a dynamic two-person interaction, human subjects possess a remarkable, built-in action-reading capacity allowing them to predict others’ goals and respond efficiently in this competitive setting.

%B Cognition
%V 168
%P 65-75
%8 01/2017
%G eng
%U http://www.sciencedirect.com/science/article/pii/S0010027717301762
%R 10.1016/j.cognition.2017.06.014
%0 Generic
%D 2015
%T Predicting actions before they occur
%A Maryam Vaziri-Pashkam
%A Sarah Cormiea
%A Ken Nakayama
%K Action anticipation
%K Action reading
%K Biological motion
%K Social interaction
%X
Humans are experts at reading others’ actions in social contexts. They efficiently process others’ movements in real time to predict intended goals. Here we designed a two-person reaching task to investigate real-time body reading in a naturalistic setting. Two subjects faced each other separated by a Plexiglas screen. One (Attacker) was instructed to tap one of two targets on the screen, and the other (Blocker) was told to tap the same target as quickly as possible. Reaction times were fast, much faster than reaction times to a dot projected on the screen moving in the same manner. This suggests Blockers use subtle preparatory movements of Attackers to predict their goal. Next, using video recordings of an Attacker, we showed that removing the preparatory cues slows reaction times and that altering them could trick the Blockers into choosing the wrong target. We then occluded various body parts of the Attacker and showed that reaction times slow down only when most of the Attacker’s body is occluded. This suggests that preparatory cues are distributed over the Attacker’s body. We saw no evidence of learning during the experiment, as reaction times remained constant over the duration of the session. Taken together, these results suggest that in social contexts humans are able to use their knowledge of the biomechanical constraints on the human body to efficiently process preparatory cues from the body of their interaction partner and predict their intentions well before movement begins.
%8 10/27/2015
%2 http://hdl.handle.net/1721.1/100202

%0 Generic
%D 2015
%T Unconscious perception of an opponent's goal
%A Sarah Cormiea
%A Maryam Vaziri-Pashkam
%A Ken Nakayama
%X

Humans are experts at reading others’ actions. They effortlessly navigate a crowded street or reach for a handshake without grabbing an elbow. This suggests real-time, efficient processing of others’ movements and the ability to predict intended future movements. We designed a competitive reaching task in which two subjects faced each other separated by a Plexiglas screen. Fingertip positions were recorded with magnetic sensors. One subject (Attacker) was instructed via headphones to tap one of two targets on the screen, and the other subject (Blocker) was told to try to reach the same target as quickly as possible. Reaction times, measured as the difference in initial finger movement (finger launch) between Attacker and Blocker, were fast (~150 ms): much faster than reaction times to a moving dot projected on the screen (~250 ms). This suggests Blockers use preparatory actions of Attackers to predict their goal before finger launch. Next, we videotaped an Attacker and projected the video onto the transparent screen. Blockers’ reaction times to the videos matched those to a real Attacker. In half the blocks we cut the preparatory information from the video. Blockers were ~120 ms slower responding to cut videos, suggesting preparatory information is predictive ~120 ms before finger launch. Finally, we played the videos from the start to various times relative to finger launch and asked subjects to report the Attacker’s goal with button presses. Surprisingly, when videos were cut at ~120 ms before finger launch, subjects’ accuracy was ~70%: significantly lower than the accuracy of arm movements in response to full videos (~98%). This suggests that in the arm movement task subjects utilize implicit information that is not consciously accessible in the button press task. Taken together, these results suggest participants in a competitive interaction have implicit or unconscious knowledge of the intentions of their partner before movement begins.

%B Vision Sciences Society Annual Meeting (VSS 2015)
%8 09/2015
%U http://jov.arvojournals.org/article.aspx?articleid=2433081
%R 10.1167/15.12.43