Accumulation of Inertial Sensory Information in the Perception of Whole Body Yaw Rotation.
ABSTRACT: While moving through the environment, our central nervous system accumulates sensory information over time to provide an estimate of our self-motion, allowing us to complete crucial tasks such as maintaining balance. However, little is known about how the duration of a motion stimulus influences performance in a self-motion discrimination task. Here we study the human ability to discriminate intensities of sinusoidal (0.5 Hz) self-rotations around the vertical axis (yaw) for four stimulus durations (1, 2, 3, and 5 s) in darkness. In a typical trial, participants experienced two consecutive rotations of equal duration and different peak amplitude, and reported the one perceived as stronger. For each stimulus duration, we determined the smallest detectable change in stimulus intensity (differential threshold) for a reference velocity of 15 deg/s. Results indicate that differential thresholds decrease with stimulus duration and asymptotically converge to a constant, positive value. This suggests that the central nervous system accumulates sensory information on self-motion over time, resulting in improved discrimination performance. The observed trends in differential thresholds are consistent with predictions based on a drift diffusion model with leaky integration of sensory evidence.
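The qualitative prediction above, thresholds that fall with duration and then plateau, falls out of any leaky accumulator. The sketch below is a minimal illustration, not the authors' fitted model: the leak rate and noise level are arbitrary, and the differential threshold is taken as proportional to the noise-to-signal ratio of the accumulated evidence.

```python
import math

def leaky_ddm_threshold(duration, leak=1.0, noise=1.0):
    """Predicted differential threshold for a leaky integrator of sensory
    evidence (illustrative parameters, not fitted to the study's data)."""
    # mean accumulated evidence per unit of stimulus intensity
    mean_gain = (1 - math.exp(-leak * duration)) / leak
    # variance of the accumulated internal noise
    var = noise ** 2 * (1 - math.exp(-2 * leak * duration)) / (2 * leak)
    # threshold is inversely proportional to sensitivity d' = mean / sd
    return math.sqrt(var) / mean_gain

# thresholds for the four stimulus durations used in the study (in seconds)
thresholds = {T: leaky_ddm_threshold(T) for T in (1, 2, 3, 5)}
```

With a leak, the signal gain and the noise variance both saturate, so the predicted threshold decreases monotonically with duration and converges to a positive constant rather than to zero, matching the reported trend.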
Project description:Perceptual learning, the ability to improve the sensitivity of sensory perception through training, has been shown to exist in all sensory systems but the vestibular system. A previous study found no improvement of passive self-motion thresholds in the dark after intense direction discrimination training of either yaw rotations (stimulating semicircular canals) or y-translation (stimulating otoliths). The goal of the present study was to investigate whether perceptual learning of self-motion in the dark would occur when there is a simultaneous otolith and semicircular canal input, as is the case with roll tilt motion stimuli. Blindfolded subjects (n = 10) trained on a direction discrimination task with 0.2-Hz roll tilt motion stimuli (9 h of training, 1,800 trials). Before and after training, motion thresholds were measured in the dark for the trained motion and for three transfer conditions. We found that roll tilt sensitivity in the 0.2-Hz roll tilt condition was increased (i.e., thresholds decreased) after training but not for controls who were not exposed to training. This is the first demonstration of perceptual learning of passive self-motion direction discrimination in the dark. The results have potential therapeutic relevance as 0.2-Hz roll thresholds have been associated with poor performance on a clinical balance test that has been linked to more than a fivefold increase in falls.
Project description:A central function of sensory systems is the gathering of information about dynamic interactions with the environment during self-motion. Determining whether modulation of a sensory cue was externally caused or resulted from self-motion is fundamental to perceptual invariance and requires continuous updating of sensory processing based on recent movements. This process is highly context-dependent and crucial for perceptual performance such as decision-making and sensory object formation. Yet despite its fundamental ecological role, voluntary self-motion is rarely incorporated in perceptual or neurophysiological investigations of sensory processing in animals. Here, we present the Sensory Island Task (SIT), a new freely moving search paradigm to study sensory processing and perception. In SIT, animals explore an open-field arena to find a sensory target relying solely on changes in the presented stimulus, which is controlled by closed-loop position tracking in real time. Within a few sessions, animals are trained via positive reinforcement to search for a particular area in the arena (“target island”), which triggers the presentation of the target stimulus. The location of the target island is randomized across trials, making the modulated stimulus feature the only informative cue for task completion. Animals report detection of the target stimulus by remaining within the island for a defined time (“sit-time”). Multiple “non-target” islands can be incorporated to test psychometric discrimination and identification performance. We exemplify the suitability of SIT for rodents (Mongolian gerbil, Meriones unguiculatus) and small primates (mouse lemur, Microcebus murinus) and for studying various sensory perceptual performances (auditory frequency discrimination, sound source localization, visual orientation discrimination).
Furthermore, we show that pairing SIT with chronic electrophysiological recordings makes it possible to reveal neuronal signatures of sensory processing under ecologically relevant conditions during goal-oriented behavior. In conclusion, SIT is a flexible and easily implementable behavioral paradigm for mammals that combines self-motion and natural exploratory behavior to study sensory sensitivity, decision-making, and their underlying neuronal processing.
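The closed-loop core of SIT reduces to a per-frame position test plus a sit-time counter. The sketch below is a toy illustration of that logic; the island geometry, frame rate, and sit-time are invented placeholders, not the authors' actual tracking code.

```python
import math

def sit_trial_step(pos, island_center, radius, inside_frames, fps=30, sit_time=2.0):
    """One closed-loop tracking update: is the animal inside the target
    island, and has it stayed long enough ("sit-time") to report detection?
    All parameters are illustrative assumptions."""
    inside = math.dist(pos, island_center) <= radius
    inside_frames = inside_frames + 1 if inside else 0  # leaving resets the clock
    detected = inside_frames >= fps * sit_time
    return inside, inside_frames, detected

# toy trial: the animal parks near the island center for 90 video frames
frames, detected = 0, False
for pos in [(0.1, 0.1)] * 90:
    _, frames, detected = sit_trial_step(pos, (0.0, 0.0), 0.5, frames)
    if detected:
        break
```

At 30 fps and a 2 s sit-time, detection is reported on the 60th consecutive in-island frame; any departure from the island resets the counter, which is what makes dwelling the behavioral report.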
Project description:There is accumulating evidence that the brain's neural coding strategies are constrained by natural stimulus statistics. Here we investigated the statistics of the time-varying envelope (i.e., a second-order stimulus attribute related to variance) of rotational and translational self-motion signals experienced by human subjects during everyday activities. We found that envelopes can reach large values across all six motion dimensions (~450 deg/s for rotations and ~4 G for translations). Unlike results obtained in other sensory modalities, the spectral power of envelope signals decreased slowly for low (<2 Hz) and more sharply for high (>2 Hz) temporal frequencies and thus was not well fit by a power law. We next compared the spectral properties of envelope signals resulting from active and passive self-motion, as well as those resulting from signals obtained when the subject is absent (i.e., external stimuli). Our data suggest that different mechanisms underlie deviation from scale invariance in rotational and translational self-motion envelopes. Specifically, active self-motion and filtering by the human body cause deviation from scale invariance primarily for translational and rotational envelope signals, respectively. Finally, we used well-established models to predict the responses of peripheral vestibular afferents to natural envelope stimuli. We found that irregular afferents responded more strongly to envelopes than their regular counterparts. Our findings have important consequences for understanding the coding strategies used by the vestibular system to process natural second-order self-motion signals.
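An envelope analysis of this kind can be approximated by rectifying the motion signal, smoothing it, and taking a periodogram. The sketch below runs on a synthetic head-velocity trace (the carrier and modulation frequencies, window length, and noise level are all invented for illustration, not the recorded dataset).

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T = 100.0, 200.0                      # sampling rate (Hz) and duration (s), assumed
t = np.arange(0, T, 1 / fs)
# toy head-velocity trace: a 2 Hz carrier whose amplitude (the envelope)
# is modulated slowly at 0.1 Hz, plus a little sensor noise
envelope_true = 1.5 + np.sin(2 * np.pi * 0.1 * t)
signal = envelope_true * np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)

# envelope estimate: rectify, then smooth with a 1 s moving-average window
win = int(fs)
env_est = np.convolve(np.abs(signal), np.ones(win) / win, mode="same")

# spectral power of the envelope via a simple periodogram
psd = np.abs(np.fft.rfft(env_est - env_est.mean())) ** 2 / env_est.size
freqs = np.fft.rfftfreq(env_est.size, 1 / fs)
peak_freq = freqs[np.argmax(psd[1:]) + 1]  # skip the DC bin
```

The 1 s moving average removes the carrier's rectification harmonics while passing the slow modulation, so the envelope spectrum peaks at the 0.1 Hz modulation frequency; on real recordings the same pipeline yields the envelope power spectra whose shape is discussed above.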
Project description:Brainstem and cerebellar neurons implement an internal model to accurately estimate self-motion during externally generated ('passive') movements. However, these neurons show reduced responses during self-generated ('active') movements, indicating that predicted sensory consequences of motor commands cancel sensory signals. Remarkably, the computational processes underlying sensory prediction during active motion and their relationship to internal model computations during passive movements remain unknown. We construct a Kalman filter that incorporates motor commands into a previously established model of optimal passive self-motion estimation. The simulated sensory error and feedback signals match experimentally measured neuronal responses during active and passive head and trunk rotations and translations. We conclude that a single sensory internal model can combine motor commands with vestibular and proprioceptive signals optimally. Thus, although neurons carrying sensory prediction error or feedback signals show attenuated modulation, the sensory cues and internal model are both engaged and critically important for accurate self-motion estimation during active head movements.
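The central idea, a forward model driven by the motor command whose prediction is subtracted from the vestibular signal, can be shown in one dimension. This is a minimal sketch of the mechanism, not the paper's full multisensory Kalman filter; the dynamics and noise parameters are illustrative.

```python
import numpy as np

def kalman_self_motion(commands, measurements, a=0.9, q=0.04, r=0.25):
    """1-D Kalman filter for head velocity: the efference copy of the motor
    command drives the prediction, and the noisy vestibular measurement
    corrects it. Parameters a, q, r are illustrative assumptions."""
    x, p = 0.0, 1.0
    estimates, innovations = [], []
    for u, z in zip(commands, measurements):
        # predict next head velocity from the motor command (forward model)
        x, p = a * x + u, a * a * p + q
        innov = z - x                 # sensory prediction error: reafference cancelled
        k = p / (p + r)               # Kalman gain
        x, p = x + k * innov, (1 - k) * p
        estimates.append(x)
        innovations.append(innov)
    return np.array(estimates), np.array(innovations)

rng = np.random.default_rng(3)
n = 300
commands = 0.5 * rng.standard_normal(n)   # self-generated ("active") movement
true = np.empty(n)
true[0] = commands[0]
for i in range(1, n):
    true[i] = 0.9 * true[i - 1] + commands[i]
measurements = true + 0.5 * rng.standard_normal(n)  # noisy vestibular afference
est, innov = kalman_self_motion(commands, measurements)
```

Because the motor command predicts the sensory consequence of the active movement, the innovation carries only noise and unexpected (passive) perturbations, echoing the attenuated responses of prediction-error neurons during active motion, while the state estimate still tracks true head velocity better than the raw vestibular signal.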
Project description:Vestibular information about self-motion is combined with other sensory signals. Previous research described both visuo-vestibular and vestibular-tactile bilateral interactions, but the simultaneous interaction between all three sensory modalities has not been explored. Here we exploit a previously reported visuo-vestibular integration to investigate multisensory effects on tactile sensitivity in humans. Tactile sensitivity was measured during passive whole body rotations alone or in conjunction with optic flow, creating either purely vestibular or visuo-vestibular sensations of self-motion. Our results demonstrate that tactile sensitivity is modulated by perceived self-motion, as provided by a combined visuo-vestibular percept, and not by the visual and vestibular cues independently. We propose a hierarchical multisensory interaction that underpins somatosensory modulation: visual and vestibular cues are first combined to produce a multisensory self-motion percept. Somatosensory processing is then enhanced according to the degree of perceived self-motion.
Project description:A visual stimulus rotating globally along an observer's line of sight can induce the illusory perception of self-rotation in the opposite direction (roll vection). Psychophysical experiments were conducted to examine the effects of local rotations of visual elements of the stimulus that were manipulated independently of the global rotation. The results indicated that the addition of local rotations inconsistent with the global rotation (assumed to be the primary inducer of roll vection), generally decreased the strength of perceived self-rotation. The uniformity of orientation of the elements composing the global visual pattern and the visual polarities assigned to each visual element, i.e., intrinsic directionality concerning up and down, were observed to function as modulators of the effects of the local rotation. These results suggested that local motion signals arising from independent rotations assigned to each element of a visual object cannot be ignored in the perceptual mechanism underlying roll vection.
Project description:The accurate representation of self-motion requires the efficient processing of sensory input by the vestibular system. Conventional wisdom is that vestibular information is exclusively transmitted through changes in firing rate, yet under this assumption vestibular neurons display relatively poor detection and information transmission. Here, we carry out an analysis of the system's coding capabilities by recording neuronal responses to repeated presentations of naturalistic stimuli. We find that afferents with greater intrinsic variability reliably discriminate between different stimulus waveforms through differential patterns of precise (∼6 ms) spike timing, while those with minimal intrinsic variability do not. A simple mathematical model provides an explanation for this result. Postsynaptic central neurons also demonstrate precise spike timing, suggesting that higher brain areas also represent self-motion using temporally precise firing. These findings demonstrate that two distinct sensory channels represent vestibular information: one using rate coding and the other that takes advantage of precise spike timing.
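The rate-versus-timing distinction can be made concrete with a toy decoder: two stimuli evoke the same spike count, so a rate code is at chance, but their spike times differ by more than the timing jitter, so a nearest-template match on spike times discriminates them. The spike times, jitter, and distance measure below are invented for illustration and are not the study's analysis.

```python
import random

random.seed(7)
# two stimulus "templates" (spike times in ms): identical spike COUNT,
# so firing rate is uninformative, but different spike TIMES
template_a = [10, 25, 40, 70, 95]
template_b = [15, 30, 55, 80, 90]

def respond(template, jitter=2.0):
    """Simulated afferent response: template spike times plus Gaussian jitter."""
    return sorted(t + random.gauss(0, jitter) for t in template)

def timing_dist(resp, template):
    """Summed absolute spike-time difference to a template (toy metric)."""
    return sum(abs(r - t) for r, t in zip(resp, template))

trials, correct = 200, 0
for i in range(trials):
    stim, template = ("a", template_a) if i % 2 == 0 else ("b", template_b)
    resp = respond(template)
    guess = "a" if timing_dist(resp, template_a) < timing_dist(resp, template_b) else "b"
    correct += guess == stim
accuracy = correct / trials
```

With ~2 ms jitter against 5-15 ms template differences, the timing decoder is near perfect even though a spike-count decoder would sit at 50%, which is the sense in which precise spike timing forms a second information channel.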
Project description:The neural mechanism underlying simple perceptual decision-making in monkeys has been recently conceptualized as an integrative process in which sensory evidence supporting different response options accumulates gradually over time. For example, intraparietal neurons accumulate motion information in favor of a specific oculomotor choice over time. It is unclear, however, whether this mechanism generalizes to more complex decisions that are based on arbitrary stimulus-response associations. In a task requiring arbitrary association of visual stimuli (faces or places) with different actions (eye or hand-pointing movements), we found that activity of effector-specific regions in human posterior parietal cortex reflected the 'strength' of the sensory evidence in favor of the preferred response. These regions did not respond to sensory stimuli per se but integrated sensory evidence toward the decision outcome. We conclude that even arbitrary decisions can be mediated by sensory-motor mechanisms that are completely triggered by contextual stimulus-response associations.
Project description:The impaired ability to discriminate the motion direction of a large, high-contrast stimulus, or to detect a stimulus surrounded by another, is called visual suppression and results from the normal function of our visual inhibitory mechanisms. Recently, Melnick et al. (2013), using a motion discrimination task, showed that intelligence strongly correlates with visual suppression (r = 0.71). Cook et al. (2016) also showed a strong link between contrast surround suppression and IQ (r = 0.87), this time using a contrast matching task. Our aim is to test this link using two different visual suppression tasks: a motion discrimination task and a contrast detection task. Fifty volunteers took part in the experiments. Using Bayesian staircases, we measured duration thresholds in the motion experiment and contrast thresholds in the spatial experiment. Although we found a much weaker effect, our results from the motion experiment still replicate previous results supporting the link between motion surround suppression and IQ (r = 0.43). However, our results from the spatial experiment do not support the link between contrast surround suppression and IQ (r = -0.09). Methodological differences between this study and previous studies that could explain these discrepancies are discussed.
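The study used Bayesian staircases; as a simpler illustration of the general idea of adaptive threshold estimation, here is a deterministic 3-down/1-up staircase run against a toy step-function observer. Every parameter (starting level, step factor, true threshold) is an invented placeholder, not the study's procedure.

```python
def staircase_threshold(true_threshold=20.0, start=50.0, step=0.9, n_trials=100):
    """3-down/1-up adaptive staircase on stimulus level (e.g., duration in ms).
    Toy observer: correct iff level >= true_threshold. The estimate is the
    mean of the last six reversal levels. All parameters are illustrative."""
    level, streak, direction, reversals = start, 0, 0, []
    for _ in range(n_trials):
        if level >= true_threshold:          # observer responds correctly
            streak += 1
            if streak == 3:                  # 3 correct in a row: make it harder
                streak = 0
                if direction == +1:
                    reversals.append(level)  # direction flipped: log a reversal
                direction = -1
                level *= step
        else:                                # any error: make it easier
            streak = 0
            if direction == -1:
                reversals.append(level)
            direction = +1
            level /= step
    return sum(reversals[-6:]) / len(reversals[-6:])

estimate = staircase_threshold()
```

The staircase descends from the easy starting level and then oscillates around the observer's threshold; averaging the final reversals recovers it to within one step, which is the basic logic that Bayesian staircases refine by placing trials at the most informative levels.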
Project description:The subject-specific nature of perceptual decision making is important for understanding how the brain translates its interpretation of sensory information into behavior. In particular, a number of studies have reported substantial variation in decision behavior across observers, which may reflect different profiles of evidence accumulation in each individual. However, a detailed profile of perceptual integration has not yet been verified from human behavioral data. To address this issue, we precisely measured the time course of sensory integration, the "sensory integration kernel" of each subject, using a coherence-varying motion discrimination task. We found that each subject has a distinct profile of sensory integration. We observed that kernel size (the maximum sensory integration interval) is consistent within subjects, independent of external stimulus conditions. The observed kernel could accurately predict subject-specific perceptual behaviors and explain the inter-individual variation of observed behaviors. Surprisingly, the performance of most subjects did not improve in proportion to increased stimulus duration, but was maximized when the stimulus duration matched their kernel size. We also found that the observed kernel size was strongly correlated with subject-specific perceptual characteristics for illusory motion. Our results suggest that perceptual decisions arise from intrinsic decision dynamics, and on individual timescales of sensory integration.
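A performance peak at kernel-matched durations is what a limited-window integrator predicts if internal noise keeps accumulating over the whole stimulus while evidence is only collected within the kernel. The sketch below is a toy model of that intuition, not the paper's kernel-estimation method; the kernel size, drift, and noise are illustrative.

```python
import math

def dprime(duration, kernel=0.8, drift=2.0, noise=1.0):
    """Toy observer sensitivity: evidence is integrated only within the
    kernel window, but decision noise accumulates over the full stimulus,
    so d' peaks when duration matches kernel size (illustrative parameters)."""
    return drift * min(duration, kernel) / (noise * math.sqrt(duration))

durations = [0.2, 0.4, 0.8, 1.6, 3.2]   # stimulus durations in seconds, assumed
sens = [dprime(T) for T in durations]
best = durations[max(range(len(sens)), key=sens.__getitem__)]
```

Below the kernel size, longer stimuli add signal faster than noise and d' rises; beyond it, only noise keeps growing and d' falls as 1/sqrt(duration), so the best duration equals the kernel, mirroring the behavioral result described above.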