ABSTRACT: All motile organisms use spatially distributed chemical features of their surroundings to guide their behaviors, but the neural mechanisms underlying such behaviors in mammals have been difficult to study, largely due to the technical challenges of controlling chemical concentrations in space and time during behavioral experiments. To overcome these challenges, we introduce a system to control and maintain an olfactory virtual landscape. This system uses rapid flow controllers and an online predictive algorithm to deliver precise odorant distributions to head-fixed mice as they explore a virtual environment. We establish an odor-guided virtual navigation behavior that engages hippocampal CA1 "place cells" whose properties are similar to those previously reported for real and visual virtual environments, demonstrating that navigation based on different sensory modalities recruits a similar cognitive map. This method opens new possibilities for studying the neural mechanisms of olfactory-driven behaviors, multisensory integration, innate valence, and low-dimensional sensory-spatial processing.
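The predictive element of the delivery system above can be illustrated with a minimal sketch: if the command-to-nose latency of the flow controllers is known, the controller can extrapolate the animal's virtual position over that dead time and command the concentration mapped to the predicted position. The latency value, the Gaussian odor landscape, and the constant-velocity extrapolation below are all illustrative assumptions, not the published algorithm.

```python
import numpy as np

# Hypothetical sketch of latency-compensated odor delivery. This assumes a
# constant-velocity extrapolation over an assumed delivery dead time.

DELIVERY_LATENCY_S = 0.2  # assumed valve-to-nose dead time (illustrative)

def target_concentration(pos_cm):
    """Desired concentration at a virtual position: an example Gaussian
    odor landscape centered at 50 cm."""
    return np.exp(-0.5 * ((pos_cm - 50.0) / 10.0) ** 2)

def command_flow(pos_cm, vel_cm_s):
    """Command the concentration for the position the animal is predicted
    to occupy when the odor actually arrives at the nose."""
    predicted = pos_cm + vel_cm_s * DELIVERY_LATENCY_S
    return target_concentration(predicted)

# Example: animal at 45 cm, running forward at 20 cm/s
print(command_flow(45.0, 20.0))  # concentration for the predicted 49 cm
```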
Project description: Understanding adaptive behavior requires precisely controlled presentation of multisensory stimuli combined with simultaneous measurement of multiple behavioral modalities. Hence, we developed a virtual reality apparatus that allows simultaneous measurement of reward checking, a commonly used measure in associative learning paradigms, and navigational behavior, along with precisely controlled presentation of visual, auditory, and reward stimuli. Rats performed a virtual spatial navigation task analogous to the Morris maze in which only distal visual or auditory cues provided spatial information. Spatial navigation and reward-checking maps showed experience-dependent learning and were in register for distal visual cues. However, they showed a dissociation, whereby distal auditory cues failed to support spatial navigation but did support spatially localized reward checking. These findings indicate that rats can navigate in virtual space using only distal visual cues, without significant vestibular or other sensory inputs. Furthermore, they reveal a dissociation between two simultaneously measured reward-driven behaviors.
Project description: Sensory systems relay information about the world to the brain, which enacts behaviors through motor outputs. To maximize information transmission, sensory systems discard redundant information through adaptation to the mean and variance of the environment. The behavioral consequences of sensory adaptation to environmental variance have been largely unexplored. Here, we study how larval fruit flies adapt sensory-motor computations underlying navigation to changes in the variance of visual and olfactory inputs. We show that variance adaptation can be characterized by rescaling of the sensory input and that, for both visual and olfactory inputs, the temporal dynamics of adaptation are consistent with optimal variance estimation. In multisensory contexts, larvae adapt independently to variance in each sense, and portions of the navigational pathway encoding mixed odor and light signals are also capable of variance adaptation. Our results suggest multiplication as a mechanism for odor-light integration.
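A common way to formalize variance adaptation by input rescaling, consistent with the description above, is divisive normalization by a running variance estimate. The exponential-moving-average estimator and time constant in the sketch below are generic assumptions, not the study's fitted dynamics.

```python
import numpy as np

def adapt_to_variance(signal, tau=50.0):
    """Rescale a stimulus by running estimates of its mean and standard
    deviation (exponential moving averages with time constant tau samples)."""
    alpha = 1.0 / tau
    mean, var = 0.0, 1.0
    out = np.empty(len(signal))
    for i, s in enumerate(signal):
        mean += alpha * (s - mean)               # running mean estimate
        var += alpha * ((s - mean) ** 2 - var)   # running variance estimate
        out[i] = (s - mean) / np.sqrt(var)       # divisive rescaling
    return out

# Example: the stimulus variance steps up 25-fold halfway through,
# but the rescaled output settles back toward unit variance.
rng = np.random.default_rng(0)
stim = np.concatenate([rng.standard_normal(500), 5 * rng.standard_normal(500)])
out = adapt_to_variance(stim)
print(out[100:500].std(), out[700:].std())  # both close to 1 after adaptation
```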
Project description: Odor attraction in walking Drosophila melanogaster is commonly used to relate neural function to behavior, but the algorithms underlying attraction are unclear. Here, we develop a high-throughput assay to measure olfactory behavior in response to well-controlled sensory stimuli. We show that odor evokes two behaviors: an upwind run during odor (ON response), and a local search at odor offset (OFF response). Wind orientation requires antennal mechanoreceptors, but search is driven solely by odor. Using dynamic odor stimuli, we measure the dependence of these two behaviors on odor intensity and history. Based on these data, we develop a navigation model that recapitulates the behavior of flies in our apparatus, and generates realistic trajectories when run in a turbulent boundary layer plume. The ability to parse olfactory navigation into quantifiable elementary sensori-motor transformations provides a foundation for dissecting neural circuits that govern olfactory behavior.
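A toy agent implementing the two behaviors described above (upwind run during odor, local search after offset) might look like the following sketch; the gains, noise levels, and 5 s search window are illustrative placeholders, not the paper's fitted model parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
UPWIND = np.pi / 2  # assume wind from +y, so the upwind heading is +y

def step(heading, odor_on, time_since_offset):
    """Update heading: steer upwind while odor is on (ON response);
    turn sharply for a few seconds after odor offset (OFF response)."""
    if odor_on:
        heading += 0.5 * np.sin(UPWIND - heading) + 0.1 * rng.standard_normal()
    elif time_since_offset < 5.0:
        heading += 1.0 * rng.standard_normal()   # local search: large turns
    else:
        heading += 0.2 * rng.standard_normal()   # baseline wandering
    return heading

# Example: a 5 s odor pulse followed by 5 s of search (0.1 s time steps)
heading = 0.0
for t in range(100):
    heading = step(heading, odor_on=t < 50, time_since_offset=(t - 50) * 0.1)
print(heading)
```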
Project description: Accurately encoding time is one of the fundamental challenges faced by the nervous system in mediating behavior. We recently reported that some animals have a specialized population of rhythmically active neurons in their olfactory organs with the potential to peripherally encode temporal information about odor encounters. If these neurons do indeed encode the timing of odor arrivals, it should be possible to demonstrate that this capacity has some functional significance. Here we show how this sensory input can profoundly influence an animal's ability to locate the source of odor cues in realistic turbulent environments, a common task faced by species that rely on olfactory cues for navigation. Using detailed data from a turbulent plume created in the laboratory, we reconstruct the spatiotemporal behavior of a real odor field. We use recurrence theory to show that information about position relative to the source of the odor plume is embedded in the timing between odor pulses. Then, using a parameterized computational model, we show how an animal can use populations of rhythmically active neurons to capture and encode this temporal information in real time, and use it to efficiently navigate to an odor source. Our results demonstrate that the capacity to accurately encode temporal information about sensory cues may be crucial for efficient olfactory navigation. More generally, our results suggest a mechanism for extracting and encoding temporal information from the sensory environment that could have broad utility for neural information processing.
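The core measurement, extracting the timing between successive odor pulses at a fixed location, can be sketched as below. The threshold and the synthetic time series are placeholders; the study itself uses measured plume data and recurrence analysis rather than simple interval statistics.

```python
import numpy as np

def pulse_intervals(odor, dt, threshold=1.0):
    """Return the intervals (in seconds) between successive odor pulse
    onsets, i.e., upward threshold crossings of the concentration trace."""
    above = odor > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return np.diff(onsets) * dt

# Example with a synthetic trace sampled at 100 Hz; interval statistics
# such as the mean are the kind of quantity that can carry information
# about position relative to the plume source.
odor = np.abs(np.random.default_rng(1).standard_normal(10_000))
print(pulse_intervals(odor, dt=0.01).mean())
```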
Project description: The human capacity to integrate sensory signals has been investigated with respect to different sensory modalities. A common denominator of the neural network underlying the integration of sensory cues has yet to be identified. Additionally, existing brain imaging data from patients with autism spectrum disorder (ASD) do not account for disparities in neuronal sensory processing. In this fMRI study, we compared the underlying neural networks of both olfactory-visual and auditory-visual integration in patients with ASD and a group of matched healthy participants. The aim was to disentangle sensory-specific networks so as to derive a potential (amodal) common source of multisensory integration (MSI) and to investigate differences in brain networks with sensory processing in individuals with ASD. In both groups, similar neural networks were found to be involved in the olfactory-visual and auditory-visual integration processes, including the primary visual cortex, the inferior parietal sulcus (IPS), and the medial and inferior frontal cortices. Amygdala activation was observed specifically during olfactory-visual integration, whereas superior temporal activation was observed during auditory-visual integration. A dynamic causal modeling analysis revealed a nonlinear top-down IPS modulation of the connection between the respective primary sensory regions in both experimental conditions and in both groups. Thus, we demonstrate that MSI has shared neural sources across olfactory-visual and auditory-visual stimulation in patients and controls. The enhanced recruitment of the IPS to modulate changes between areas is relevant to sensory perception. Our results also indicate that, with respect to MSI processing, adults with ASD do not significantly differ from their healthy counterparts.
Project description: The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selection have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show that integration of multiple sensory sources can be performed effectively in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance, and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show that the simulated insect combines visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence among different courses of action. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation.
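One computation attributed above to the ellipsoid body, combining several directional cues into a single heading estimate for action selection, can be sketched as a cosine-tuned ring with a population-vector readout. The cell count, tuning curves, and decoding scheme below are assumptions for illustration, not the model's actual connectome-based implementation.

```python
import numpy as np

N_RING = 16  # number of ring neurons (hypothetical)
prefs = np.linspace(0, 2 * np.pi, N_RING, endpoint=False)

def integrate_cues(cue_dirs, cue_weights):
    """Sum cosine-tuned responses to each directional cue on the ring,
    rectify, and decode a single heading via the population vector."""
    activity = np.zeros(N_RING)
    for d, w in zip(cue_dirs, cue_weights):
        activity += w * np.cos(prefs - d)
    activity = np.maximum(activity, 0.0)        # rectification
    vec = activity @ np.exp(1j * prefs)         # population-vector readout
    return np.angle(vec)

# Example: a strong visual cue at 0 rad plus a weaker posture-derived cue
# at pi/4 yields a heading between the two, biased toward the stronger cue.
print(integrate_cues([0.0, np.pi / 4], [1.0, 0.4]))
```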
Project description: Multisensory interactions are essential to make sense of the environment by transforming the mosaic of sensory inputs received by the organism into a unified perception. Brain rhythms allow coherent processing within areas or between distant brain regions and could thus be instrumental in functionally connecting remote brain areas in the context of multisensory interactions. Still, odor and sound processing relate to two sensory systems with specific anatomofunctional characteristics. How does the brain handle their association? Rats were challenged to discriminate between unisensory stimulation (odor or sound) and the multisensory combination of both. During learning, we observed the progressive establishment of high-power beta oscillations (15-35 Hz) spanning the olfactory bulb, the piriform cortex, and the perirhinal cortex, but not the primary auditory cortex. In the piriform cortex, beta oscillation power was higher in the multisensory condition than during presentation of the odor alone. Furthermore, in the olfactory structures, the sound alone was able to elicit a beta oscillatory response. These findings emphasize the functional differences between olfactory and auditory cortices and reveal that beta oscillations contribute to the memory formation of the multisensory association.
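For readers unfamiliar with the measurement, beta-band (15-35 Hz) power of a local field potential can be quantified with a band-pass filter and Hilbert envelope, as in the generic sketch below; the filter order and design are standard choices, not this study's analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_power(lfp, fs):
    """Band-pass the LFP to 15-35 Hz and return the mean squared
    Hilbert envelope (a simple beta-power estimate)."""
    b, a = butter(4, [15, 35], btype="bandpass", fs=fs)
    beta = filtfilt(b, a, lfp)
    return np.mean(np.abs(hilbert(beta)) ** 2)

# Example: a 25 Hz oscillation buried in noise yields high beta power
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
lfp = np.sin(2 * np.pi * 25 * t) \
    + 0.5 * np.random.default_rng(2).standard_normal(t.size)
print(beta_power(lfp, fs))
```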
Project description: Tools enabling closed-loop experiments are crucial to delineate causal relationships between the activity of genetically labeled neurons and specific behaviors. We developed the Raspberry Pi Virtual Reality (PiVR) system to conduct closed-loop optogenetic stimulation of neural functions in unrestrained animals. PiVR is an experimental platform that operates at high temporal resolution (70 Hz) with low latencies (<30 milliseconds), while being affordable (<US$500) and easy to build (<6 hours). Through extensive documentation, this tool was designed to be accessible to a wide public, from high school students to professional researchers studying systems neuroscience. We illustrate the functionality of PiVR by focusing on sensory navigation in response to gradients of chemicals (chemotaxis) and light (phototaxis). We show how Drosophila adult flies perform negative chemotaxis by modulating their locomotor speed to avoid locations associated with optogenetically evoked bitter taste. In Drosophila larvae, we use innate positive chemotaxis to compare behavior elicited by real- and virtual-odor gradients. Finally, we examine how positive phototaxis emerges in zebrafish larvae from the modulation of turning maneuvers to orient in virtual white-light gradients. Besides its application to study chemotaxis and phototaxis, PiVR is a versatile tool designed to bolster efforts to map and to functionally characterize neural circuits.
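The closed-loop logic described above can be summarized in a few lines: detect the animal's position each frame, evaluate the virtual gradient at that position, and set the stimulus intensity accordingly at the 70 Hz frame rate. The tracker and LED callables in this sketch are hypothetical placeholders for the camera and GPIO drivers, and the Gaussian gradient is an illustrative choice.

```python
import time
import numpy as np

FRAME_PERIOD_S = 1 / 70.0  # the 70 Hz update rate cited above

def virtual_gradient(x, y, peak=(320.0, 240.0), sigma=100.0):
    """Example Gaussian virtual gradient over the camera image (pixels)."""
    return np.exp(-((x - peak[0]) ** 2 + (y - peak[1]) ** 2) / (2 * sigma ** 2))

def closed_loop_step(get_position, set_led):
    """One closed-loop cycle: locate the animal, evaluate the virtual
    gradient at its position, and update the stimulus."""
    x, y = get_position()              # stand-in for the camera tracker
    set_led(virtual_gradient(x, y))    # stand-in for the GPIO LED driver
    time.sleep(FRAME_PERIOD_S)         # pace the loop at the frame rate

# Example with placeholder callables (no hardware attached)
closed_loop_step(lambda: (300.0, 250.0),
                 lambda v: print(f"LED intensity: {v:.2f}"))
```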
Project description: During migratory journeys, birds may become displaced from their normal migratory route. Experimental evidence has shown that adult birds can correct for such displacements and return to their goal. However, the nature of the cues used by migratory birds to perform long-distance navigation is still debated. In this experiment, we subjected adult lesser black-backed gulls migrating from their Finnish/Russian breeding grounds (from >60°N) to Africa (to <5°N) to sensory manipulation, to determine the sensory systems required for navigation. We translocated birds westward (1080 km) or eastward (885 km) to simulate natural navigational challenges. When translocated westward, outside their migratory corridor, birds with olfactory nerve section kept a clear southerly directional preference but were unable to compensate for the displacement, while intact birds and gulls with the ophthalmic branch of the trigeminal nerve sectioned oriented toward their population-specific migratory corridor. Thus, airborne olfactory information seems to be important for migrating gulls to navigate successfully in some circumstances.
Project description: Although a standard reinforcement learning model can capture many aspects of reward-seeking behaviors, it may not be practical for modeling natural human behaviors because of the richness of dynamic environments and limitations in cognitive resources. We propose a modular reinforcement learning model that addresses these factors. Based on this model, a modular inverse reinforcement learning algorithm is developed to estimate both the rewards and discount factors from human behavioral data, which allows predictions of human navigation behaviors in virtual reality with high accuracy across different subjects and tasks. Complex human navigation trajectories in novel environments can be reproduced by an artificial agent based on the modular model. This model provides a strategy for estimating the subjective value of actions and how they influence sensory-motor decisions in natural behavior.
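The modular value combination underlying the model above can be sketched as follows: each module maintains its own Q-table with its own reward channel and discount factor, and actions are chosen by maximizing the summed module values. The module definitions, learning rule, and equal weighting below are illustrative assumptions, not the published algorithm.

```python
import numpy as np

class Module:
    """One module: its own Q-table, reward channel, and discount factor."""
    def __init__(self, n_states, n_actions, gamma, lr=0.1):
        self.Q = np.zeros((n_states, n_actions))
        self.gamma, self.lr = gamma, lr

    def update(self, s, a, r, s_next):
        """One Q-learning step using this module's own reward and discount."""
        target = r + self.gamma * self.Q[s_next].max()
        self.Q[s, a] += self.lr * (target - self.Q[s, a])

def choose_action(modules, s):
    """Select the action that maximizes the summed value across modules."""
    return int(np.argmax(sum(m.Q[s] for m in modules)))

# Example: two modules with different discount factors (e.g., long-horizon
# goal seeking vs. short-horizon obstacle avoidance), 10 states, 4 actions
mods = [Module(10, 4, gamma=0.95), Module(10, 4, gamma=0.5)]
mods[0].update(s=0, a=1, r=1.0, s_next=1)
print(choose_action(mods, s=0))  # action 1 now has the highest summed value
```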