Varying Timescales of Stimulus Integration Unite Neural Adaptation and Prototype Formation.
ABSTRACT: Human visual perception is both stable and adaptive. Perception of complex objects, such as faces, is shaped by the long-term average of experience as well as immediate, comparative context. Measurements of brain activity have demonstrated corresponding neural mechanisms, including norm-based responses reflective of stored prototype representations, and adaptation induced by the immediately preceding stimulus. Here, we consider the possibility that these apparently separate phenomena can arise from a single mechanism of sensory integration operating over varying timescales. We used fMRI to measure neural responses from the fusiform gyrus while subjects observed a rapid stream of face stimuli. Neural activity at this cortical site was best explained by the integration of sensory experience over multiple sequential stimuli, following a decaying-exponential weighting function. Although this neural activity could be mistaken for immediate neural adaptation or long-term, norm-based responses, it in fact reflected a timescale of integration intermediate to both. We then examined the timescale of sensory integration across the cortex. We found a gradient that ranged from rapid sensory integration in early visual areas, to long-term, stable representations in higher-level, ventral-temporal cortex. These findings were replicated with a new set of face stimuli and subjects. Our results suggest that a cascade of visual areas integrate sensory experience, transforming highly adaptable responses at early stages to stable representations at higher levels.
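The decaying-exponential weighting described above can be sketched numerically. This is an illustrative sketch, not the authors' analysis code; the function name, the stimulus values, and the integration-timescale parameter `tau` are hypothetical. A small `tau` mimics adaptation to the immediately preceding stimulus, while a large `tau` approaches a long-term, norm-like average.

```python
import numpy as np

def integrate_stream(stimuli, tau):
    """Exponentially weighted average over a stimulus history.

    The stimulus k steps back in time receives weight exp(-k / tau).
    """
    stimuli = np.asarray(stimuli, dtype=float)
    lags = np.arange(len(stimuli))[::-1]  # most recent stimulus has lag 0
    w = np.exp(-lags / tau)
    return np.sum(w * stimuli) / np.sum(w)

stream = [0.2, 0.8, 0.4, 1.0]            # hypothetical stimulus values
fast = integrate_stream(stream, tau=0.5)  # dominated by the last stimulus
slow = integrate_stream(stream, tau=50.0) # close to the long-term mean (0.6)
```

Intermediate values of `tau` would produce the kind of multi-stimulus integration the abstract describes, which can be mistaken for either pure adaptation or a stored norm.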
Project description: Brains are optimized for processing ethologically relevant sensory signals. However, few studies have characterized the neural coding mechanisms that underlie the transformation from natural sensory information to behavior. Here, we focus on acoustic communication in Drosophila melanogaster and use computational modeling to link natural courtship song, neuronal codes, and female behavioral responses to song. We show that melanogaster females are sensitive to long-timescale song structure (on the order of tens of seconds). From intracellular recordings, we generate models that recapitulate neural responses to acoustic stimuli. We link these neural codes with female behavior by generating model neural responses to natural courtship song. Using a simple decoder, we predict female behavioral responses to the same song stimuli with high accuracy. Our modeling approach reveals how long-timescale song features are represented by the Drosophila brain and how neural representations can be decoded to generate behavioral selectivity for acoustic communication signals.
Project description: Emotion is thought to cause focal enhancement or distortion of certain components of memory, indicating that emotional modulation of memory is complex rather than a simple enhancement. However, the neural basis for this detailed modulation of emotional memory contents has remained unclear. Here, using functional magnetic resonance imaging (fMRI), we show that information processing in the prefrontal cortex affects sensory representations differently during the experience of emotional compared with neutral information. We found that during perception of emotional pictures, information representation in primary visual cortex (V1) corresponded significantly with representations in dorsolateral prefrontal cortex (dlPFC). This correspondence was not observed for neutral pictures. Furthermore, participants with greater correspondence between visual and prefrontal representations showed better memory for high-level semantic components, but not for low-level visual components, of emotional stimuli. These results suggest that sensory representation during the experience of emotional stimuli, compared with neutral stimuli, is more directly influenced by internally generated higher-order information from the prefrontal cortex.
Project description: Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli. However, it is not well understood how these interactions are mediated or at what level of the processing hierarchy they occur. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices in mice. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons that are responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be communicated by long-range cortical connections and that, with learning, these cross-modal connections function to suppress responses to predictable input.
Project description: Vision often requires attending to, and integrating information from, distant parts of the visual field. However, the neural basis for such long-range integration is not clearly understood. Here, we demonstrate a specific neural signature of attentional integration between stimuli in different parts of the visual field. Using functional MRI, we found that a task requiring the integration of information between two attended but spatially separated stimuli actively modulated the degree of functional integration (in terms of effective connectivity) between their retinotopic representations in visual cortical areas V1, V2, and V4. Spatial attention enhanced long-distance coupling between distinct neuronal populations that represented the attended visual stimuli, even at the earliest stages of cortical processing. In contrast, unattended stimulus representations were decoupled both from attended representations and particularly strongly from each other. Furthermore, enhanced functional integration between cortical representations was associated with enhanced behavioral performance. Attention may thus serve to "bind" together cortical loci at multiple levels of the visual hierarchy that are commonly involved in processing attended stimuli, promoting integration between otherwise functionally isolated cortical loci.
Project description: A growing body of literature has demonstrated that primary sensory cortices are not exclusively unimodal but can respond to stimuli of different sensory modalities. However, several questions concerning the neural representation of cross-modal stimuli remain open. In particular, it is poorly understood whether cross-modal stimuli evoke unique or overlapping representations in a primary sensory cortex, and whether learning can modulate these representations. Here we recorded single-unit responses to auditory, visual, somatosensory, and olfactory stimuli in the gustatory cortex (GC) of alert rats before and after associative learning. We found that, in untrained rats, the majority of GC neurons were modulated by a single modality. Upon learning, both the prevalence of cross-modal responsive neurons and their breadth of tuning increased, leading to a greater overlap of representations. Altogether, our results show that the gustatory cortex represents cross-modal stimuli according to their sensory identity, and that learning changes the overlap of cross-modal representations.
Project description: Attention enhances the neural representations of behaviorally relevant stimuli, typically through a push-pull increase of neuronal response gain to attended versus unattended stimuli. This selectively improves perception and, consequently, behavioral performance. However, to enhance the detectability of stimulus changes, attention might also distort neural representations, compromising accurate stimulus representation. We test this hypothesis by recording neural responses in the visual cortex of rhesus monkeys during a motion-direction change-detection task. We find that attention indeed amplifies the neural representation of direction changes, beyond a similar effect of adaptation. We further show that humans overestimate such direction changes, providing a perceptual correlate of our neurophysiological observations. Our results demonstrate that attention distorts the neural representation of abrupt sensory changes and consequently compromises perceptual accuracy. This likely reflects an evolutionarily adaptive mechanism that allows sensory systems to flexibly forgo accurate representation of stimulus features in order to improve the encoding of stimulus change.
Project description: When a behaviorally relevant stimulus has been previously associated with reward, behavioral responses are faster and more accurate compared to equally relevant but less valuable stimuli. Conversely, task-irrelevant stimuli that were previously associated with a high reward can capture attention and distract processing away from relevant stimuli (e.g., seeing a chocolate bar in the pantry when you are looking for a nice, healthy apple). Although increasing the value of task-relevant stimuli systematically up-regulates neural responses in early visual cortex to facilitate information processing, it is not clear whether the value of task-irrelevant distractors influences behavior via competition in early visual cortex or via competition at later stages of decision-making and response selection. Here, we measured functional magnetic resonance imaging (fMRI) in human visual cortex while subjects performed a value-based learning task, and we applied a multivariate inverted encoding model (IEM) to assess the fidelity of distractor representations in early visual cortex. We found that the fidelity of neural representations related to task-irrelevant distractors increased when the distractors were previously associated with a high reward. This finding suggests that value-driven attentional capture begins with sensory modulations of distractor representations in early areas of visual cortex.
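In general form, an inverted encoding model (IEM) of the kind mentioned above is a linear model fit from hypothesized feature channels to voxels, then inverted to reconstruct channel responses from held-out voxel patterns. The sketch below illustrates that generic recipe on simulated data; the dimensions, the random channel responses, and all variable names are assumptions, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_voxels = 200, 8, 50

# Hypothesized per-trial responses of feature channels (e.g., stimulus
# features), and a simulated channel-to-voxel weight matrix.
C = rng.random((n_trials, n_channels))
W_true = rng.random((n_channels, n_voxels))
B = C @ W_true + 0.1 * rng.standard_normal((n_trials, n_voxels))  # voxel data

# 1) Training: least-squares estimate of channel-to-voxel weights.
W_hat = np.linalg.lstsq(C, B, rcond=None)[0]          # (channels x voxels)

# 2) Inversion: reconstruct channel responses from voxel patterns.
B_test = C[:10] @ W_true
C_hat = B_test @ W_hat.T @ np.linalg.inv(W_hat @ W_hat.T)

# Agreement between reconstructed and true channel responses is one
# way to quantify the "fidelity" of a representation.
fidelity = np.corrcoef(C_hat.ravel(), C[:10].ravel())[0, 1]
```

Under this scheme, a higher-fidelity reconstruction for reward-associated distractors would correspond to the effect the abstract reports.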
Project description: Color is special among basic visual features in that it can form a defining part of objects that are engrained in our memory. Whereas most neuroimaging research on human color vision has focused on responses related to external stimulation, the present study investigated how sensory-driven color vision is linked to subjective color perception induced by object imagery. We recorded fMRI activity in male and female volunteers during viewing of abstract color stimuli that were red, green, or yellow in half of the runs. In the other half we asked them to produce mental images of colored, meaningful objects (such as tomato, grapes, banana) corresponding to the same three color categories. Although physically presented color could be decoded from all retinotopically mapped visual areas, only hV4 allowed predicting the colors of imagined objects when classifiers were trained on responses to physical colors. Importantly, only the neural signal in hV4 was predictive of behavioral performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of sensory-driven and imagined object color, and the behavioral link to neural representations in hV4, identifies area hV4 as a perceptual hub linking externally triggered color vision with color in self-generated object imagery. SIGNIFICANCE STATEMENT: Humans experience color not only when visually exploring the outside world, but also in the absence of visual input, for example when remembering, dreaming, and during imagery. It is not known where neural codes for sensory-driven and internally generated hue converge. In the current study we evoked matching subjective color percepts, one driven by physically presented color stimuli, the other by internally generated color imagery. This allowed us to identify area hV4 as the only site where neural codes of corresponding subjective color perception converged regardless of their origin. Color codes in hV4 also predicted behavioral performance in an imagery task, suggesting that hV4 forms a perceptual hub for color perception.
Project description: A central goal in the study of any sensory system is to predict neural responses to complex inputs, especially those encountered during natural stimulation. Nowhere is the transformation from stimulus to response better understood than in the vertebrate retina. Nevertheless, descriptions of retinal computation are largely based on stimulation using artificial visual stimuli, and it is unclear how these descriptions map onto the encoding of natural stimuli. We demonstrate that nonlinear spatial integration, a common feature of retinal ganglion cell (RGC) processing, shapes neural responses to natural visual stimuli in primate Off parasol RGCs, whereas On parasol RGCs exhibit surprisingly linear spatial integration. Despite this asymmetry, both cell types show strong nonlinear integration when presented with artificial stimuli. We show that nonlinear integration of natural stimuli is a consequence of rectified excitatory synaptic input and that accounting for nonlinear spatial integration substantially improves models that predict RGC responses to natural images.
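The contrast between linear and rectified (nonlinear) spatial integration can be illustrated with a toy subunit model. This is a minimal sketch under our own assumptions, not the authors' model: each value stands for the drive to one receptive-field subunit, and a contrast-reversing grating drives neighboring subunits with opposite signs.

```python
import numpy as np

def linear_rgc(subunit_inputs):
    # Linear spatial integration: sum subunit drives directly.
    return np.sum(subunit_inputs)

def nonlinear_rgc(subunit_inputs):
    # Nonlinear integration: rectify each subunit (as with rectified
    # excitatory synaptic input), then sum.
    return np.sum(np.maximum(subunit_inputs, 0.0))

# A contrast-reversing grating: neighboring subunits see opposite contrast.
grating = np.array([1.0, -1.0, 1.0, -1.0])
lin = linear_rgc(grating)       # opposite signs cancel: ~0 response
nonlin = nonlinear_rgc(grating) # rectified subunits do not cancel
```

The cancellation in the linear cell and the residual response in the rectified cell capture why artificial stimuli of this kind reveal nonlinear spatial integration.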
Project description: The brain is not a passive sensory-motor analyzer driven by environmental stimuli, but actively maintains ongoing representations that may be involved in the coding of expected sensory stimuli, prospective motor responses, and prior experience. Spontaneous cortical activity has been proposed to play an important part in maintaining these ongoing, internal representations, although its functional role is not well understood. One spontaneous signal being intensely investigated in the human brain is the interregional temporal correlation of the blood-oxygen-level-dependent (BOLD) signal recorded at rest by functional MRI (functional connectivity by MRI, fcMRI, or BOLD connectivity). This signal is intrinsic and coherent within a number of distributed networks whose topography closely resembles that of functional networks recruited during tasks. While it is apparent that fcMRI networks reflect anatomical connectivity, it is less clear whether they have any dynamic functional importance. Here, we demonstrate that visual perceptual learning, an example of adult neural plasticity, modifies the resting covariance structure of spontaneous activity between networks engaged by the task. Specifically, after intense training on a shape-identification task constrained to one visual quadrant, resting BOLD functional connectivity and directed mutual interaction between trained visual cortex and frontal-parietal areas involved in the control of spatial attention were significantly modified. Critically, these changes correlated with the degree of perceptual learning. We conclude that functional connectivity serves a dynamic role in brain function, supporting the consolidation of previous experience.