%0 Journal Article %J Developmental Science %D 2023 %T Preliminary evidence for selective cortical responses to music in one-month-old infants %A Kosakowski, Heather L. %A Norman-Haignere, Samuel %A Mynick, Anna %A Takahashi, Atsushi %A Saxe, Rebecca %A Kanwisher, Nancy %K auditory cortex %K fMRI %K infants %K music %K speech %X

Prior studies have observed selective neural responses in the adult human auditory cortex to music and speech that cannot be explained by the differing lower-level acoustic properties of these stimuli. Does infant cortex exhibit similarly selective responses to music and speech shortly after birth? To answer this question, we attempted to collect functional magnetic resonance imaging (fMRI) data from 45 sleeping infants (2.0 to 11.9 weeks old) while they listened to monophonic instrumental lullabies and infant-directed speech produced by a mother. To match acoustic variation between music and speech sounds, we (1) recorded music from instruments with a spectral range similar to that of female infant-directed speech, (2) used a novel excitation-matching algorithm to match the cochleagrams of music and speech stimuli, and (3) synthesized “model-matched” stimuli that were matched in spectrotemporal modulation statistics to (yet perceptually distinct from) music or speech. Of the 36 infants from whom we collected usable data, 19 showed significant activations to sounds overall compared to scanner noise. In these infants, we observed a set of voxels in non-primary auditory cortex (NPAC), but not in Heschl’s Gyrus, that responded significantly more to music than to each of the other three stimulus types (but not significantly more strongly than to the background scanner noise). In contrast, our planned analyses did not reveal voxels in NPAC that responded more to speech than to model-matched speech, although other unplanned analyses did. These preliminary findings suggest that music selectivity arises within the first month of life.

%B Developmental Science %8 03/2023 %G eng %U https://onlinelibrary.wiley.com/doi/10.1111/desc.13387 %! Developmental Science %R 10.1111/desc.13387
%0 Journal Article %J NeuroImage %D 2020 %T The speed of human social interaction perception %A Isik, Leyla %A Mynick, Anna %A Pantazis, Dimitrios %A Kanwisher, Nancy %X

The ability to perceive others’ social interactions, here defined as the directed contingent actions between two or more people, is a fundamental part of human experience that develops early in infancy and is shared with other primates. However, the neural computations underlying this ability remain largely unknown. Is social interaction recognition a rapid feedforward process or a slower post-perceptual inference? Here we used magnetoencephalography (MEG) decoding to address this question. Subjects in the MEG scanner viewed snapshots of visually matched real-world scenes containing a pair of people who were either engaged in a social interaction or acting independently. The presence versus absence of a social interaction could be read out from subjects’ MEG data spontaneously, even while subjects performed an orthogonal task. This readout generalized across different people and scenes, revealing abstract representations of social interactions in the human brain. These representations, however, did not come online until quite late, at 300 ms after image onset, well after feedforward visual processes. In a second experiment, we found that social interaction readout still occurred at this same late latency even when subjects performed an explicit task detecting social interactions. We further showed that MEG responses distinguished between different types of social interactions (mutual gaze vs. joint attention) even later, around 500 ms after image onset. Taken together, these results suggest that the human brain spontaneously extracts information about others’ social interactions, but does so slowly, likely relying on iterative top-down computations.

%B NeuroImage %P 116844 %8 04/2020 %G eng %U https://www.ncbi.nlm.nih.gov/pubmed/32302763 %! NeuroImage %R 10.1016/j.neuroimage.2020.116844
%0 Journal Article %J Current Biology %D 2016 %T Neural Representations Integrate the Current Field of View with the Remembered 360° Panorama %A Robertson, Caroline E. %A Hermann, Katherine %A Mynick, Anna %A Kravitz, Dwight J. %A Kanwisher, Nancy %X

We experience our visual environment as a seamless, immersive panorama. Yet, each view is discrete and fleeting, separated by expansive eye movements and discontinuous views of our spatial surroundings. How are discrete views of a panoramic environment knit together into a broad, unified memory representation? Regions of the brain’s “scene network” are well poised to integrate retinal input and memory [1]: they are visually driven [2, 3] but also densely interconnected with memory structures in the medial temporal lobe [4]. Further, these regions harbor memory signals relevant for navigation [5–8] and adapt across overlapping shifts in scene viewpoint [9, 10]. However, it is unknown whether regions of the scene network support visual memory for the panoramic environment outside of the current field of view and, further, how memory for the surrounding environment influences ongoing perception. Here, we demonstrate that specific regions of the scene network—the retrosplenial complex (RSC) and occipital place area (OPA)—unite discrete views of a 360° panoramic environment, both current and out of sight, in a common representational space. Further, individual scene views prime associated representations of the panoramic environment in behavior, facilitating subsequent perceptual judgments. We propose that this dynamic interplay between memory and perception plays an important role in weaving the fabric of continuous visual experience.

%B Current Biology %8 09/08/2016 %G eng %U http://www.cell.com/current-biology/abstract/S0960-9822(16)30753-9 %R 10.1016/j.cub.2016.07.002