Learning to use an invisible visual signal for perception.
ABSTRACT: How does the brain construct a percept from sensory signals? One approach to this fundamental question is to investigate perceptual learning as induced by exposure to statistical regularities in sensory signals [1-7]. Recent studies showed that exposure to novel correlations between sensory signals can cause a signal to have new perceptual effects [2, 3]. In those studies, however, the signals were clearly visible, so the automaticity of the learning was difficult to determine. Here we investigate whether learning of this sort, which causes new effects on appearance, can be low level and automatic by employing a visual signal whose perceptual consequences were made invisible: a vertical disparity gradient masked by other depth cues. This approach excluded high-level influences such as attention or consciousness. Our stimulus for probing perceptual appearance was a rotating cylinder. During exposure, we introduced a new contingency between the invisible signal and the rotation direction of the cylinder. When we subsequently presented an ambiguously rotating version of the cylinder, we found that the invisible signal influenced the perceived rotation direction. This demonstrates that perception can rapidly undergo "structure learning" by automatically picking up novel contingencies between sensory signals, thus automatically recruiting signals for novel uses during the construction of a percept.
Project description: Slowing of the rate at which a rivalrous percept switches from one configuration to another has been suggested as a potential trait marker for bipolar disorder. We measured perceptual alternations for a bistable, rotating, structure-from-motion cylinder in bipolar and control participants. In a control task, binocular depth rendered the direction of cylinder rotation unambiguous to monitor participants' performance and attention during the experimental task. A particular direction of rotation was perceptually stable, on average, for 33.5 s in participants without psychiatric diagnosis. Euthymic bipolar participants showed a slightly slower rate of switching between the two percepts (percept duration 42.3 s). Under a parametric analysis of the best-fitting model for individual participants, this difference was statistically significant. However, the variability within groups was high, so this difference in average switch rates was not large enough to serve as a trait marker for bipolar disorder. We also found that low-level visual capacities, such as stereo threshold, influence perceptual switch rates. We suggest that there is no single brain location responsible for perceptual switching in all different ambiguous figures and that perceptual switching is generated by the actions of local cortical circuitry.
Project description: Although challenging, adults can learn non-native phonetic contrasts with extensive training [1, 2], indicative of perceptual learning beyond an early sensitivity period [3, 4]. Training can alter low-level sensory encoding of newly acquired speech sound patterns; however, the time-course, behavioral relevance, and long-term retention of such sensory plasticity are unclear. Some theories argue that sensory plasticity underlying signal enhancement is immediate and critical to perceptual learning [6, 7]. Others, like the reverse hierarchy theory (RHT), posit a slower time-course for sensory plasticity. RHT proposes that higher-level categorical representations guide immediate, novice learning, while lower-level sensory changes do not emerge until expert stages of learning. We trained 20 English-speaking adults to categorize a non-native phonetic contrast (Mandarin lexical tones) using a criterion-dependent sound-to-category training paradigm. Sensory and perceptual indices were assayed across operationally defined learning phases (novice, experienced, over-trained, and 8-week retention) by measuring the frequency-following response, a neurophonic potential that reflects the fidelity of sensory encoding, and the perceptual identification of a tone continuum. Our results demonstrate that while robust changes in sensory encoding and perceptual identification of Mandarin tones emerged with training and were retained, such changes followed different timescales. Sensory changes were evident, and related to behavioral performance, only when participants were over-trained. In contrast, changes in perceptual identification reflecting improvement in the categorical percept emerged relatively earlier. Individual differences in perceptual identification, and not sensory encoding, related to faster learning. Our findings support the RHT: sensory plasticity accompanies, rather than drives, expert levels of non-native speech learning.
Project description: Certain visual stimuli can give rise to contradictory perceptions. In this paper we examine the temporal dynamics of perceptual reversals experienced with biological motion, comparing these dynamics to those observed with other ambiguous structure-from-motion (SFM) stimuli. In our first experiment, naïve observers monitored perceptual alternations with an ambiguous rotating walker, a figure that randomly alternates between walking in clockwise (CW) and counter-clockwise (CCW) directions. While the number of reported reversals varied between observers, the observed dynamics (distribution of dominance durations, CW/CCW proportions) were comparable to those experienced with an ambiguous kinetic-depth cylinder. In a second experiment, we compared reversal profiles with rotating and standard (i.e. non-rotating) point-light walkers. Over multiple test repetitions, three out of four observers experienced consistently shorter mean percept durations with the rotating walker, suggesting that the added rotational component may speed up reversal rates with biological motion. For both stimuli, the drift in alternation rate across trials and across repetitions was minimal. In our final experiment, we investigated whether reversals with the rotating walker and a non-biological object with similar global dimensions (a rotating cuboid) occur at random phases of the rotation cycle. We found evidence that some observers experience peaks in the distribution of response locations that are relatively stable across sessions. Using control data, we discuss the role of eye movements in the development of these reversal patterns, and the related role of exogenous stimulus characteristics. In summary, we have demonstrated that the temporal dynamics of reversal with biological motion are similar to those of other forms of ambiguous SFM. We conclude that perceptual switching with biological motion is a robust bistable phenomenon.
Project description: The visual system can learn to use information in new ways to construct appearance. Thus, signals such as the location or translation direction of an ambiguously rotating wire frame cube, which are normally uninformative, can be learned as cues to determine the rotation direction. This perceptual learning occurs when the formerly uninformative signal is statistically associated with long-trusted visual cues (such as binocular disparity) that disambiguate appearance during training. In previous demonstrations, the newly learned cue was intrinsic to the perceived object, in that the signal was conveyed by the same image elements as the object itself. Here we used extrinsic new signals and observed no learning. We correlated three new signals with long-trusted cues in the rotating cube paradigm: one crossmodal (an auditory signal) and two within modality (visual). Cue recruitment did not occur in any of these conditions, either in single sessions or in ten sessions across as many days. These results suggest that the intrinsic/extrinsic distinction is important for the perceptual system in determining whether it can learn and use new information from the environment to construct appearance. Extrinsic cues do have perceptual effects (e.g. the "bounce-pass" illusion and McGurk effect), so we speculate that extrinsic signals must be recruited for perception, but only if certain conditions are met. These conditions might specify the age of the observer, the strength of the long-trusted cues, or the amount of exposure to the correlation.
Project description: An input (e.g., airplane takeoff sound) to a sensory modality can suppress the percept of another input (e.g., talking voices of neighbors) in the same modality. This perceptual suppression effect is evidence that neural responses to different inputs interact closely with each other in the brain. While recent studies suggest that close interactions also occur across sensory modalities, a crossmodal perceptual suppression effect has not yet been reported. Here, we demonstrate that tactile stimulation can suppress the percept of visual stimuli: visual orientation discrimination performance was degraded when a tactile vibration was applied to the index finger of the observer's hand. We also demonstrated that this tactile suppression effect on visual perception occurred primarily when the tactile and visual information were spatially and temporally consistent. These findings indicate that neural signals can interact closely and directly with each other, even across sensory modalities, sufficiently to induce a perceptual suppression effect.
Project description: To form a more reliable percept of the environment, the brain needs to estimate its own sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. We evaluated this assumption in four psychophysical experiments, in which human observers localized auditory signals that were presented synchronously with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, either continuously or with intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory uncertainty estimates that combine information from past and current signals, consistent with an optimal Bayesian learner that can be approximated by exponential discounting. Our results challenge leading models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experiences with new incoming sensory signals.
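The exponential-discounting approximation described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' actual model: it assumes Gaussian cues combined by inverse-variance weighting, and the function names and the time constant `tau` are ours.

```python
import numpy as np

def exp_discount_variance(samples, tau=5.0):
    """Estimate sensory variance from a stream of noise samples,
    weighting recent samples more heavily (exponential discounting)."""
    n = len(samples)
    # Each sample's weight decays exponentially with its age.
    weights = np.exp(-(n - 1 - np.arange(n)) / tau)
    weights /= weights.sum()
    mean = np.sum(weights * samples)
    return np.sum(weights * (samples - mean) ** 2)

def fuse(aud_loc, aud_var, vis_loc, vis_var):
    """Reliability-weighted (maximum-likelihood) audiovisual fusion:
    each cue is weighted by its inverse variance."""
    w_aud = (1.0 / aud_var) / (1.0 / aud_var + 1.0 / vis_var)
    return w_aud * aud_loc + (1.0 - w_aud) * vis_loc

# Equally reliable cues are averaged; an unreliable visual cue is discounted.
print(fuse(0.0, 1.0, 10.0, 1.0))    # 5.0
print(fuse(0.0, 1.0, 10.0, 100.0))  # close to the auditory location
```

Because the variance estimate pools discounted past samples, it lags briefly behind an abrupt jump in visual noise, mirroring the finding that observers' uncertainty estimates combine past and current signals rather than tracking the current stimulus alone.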
Project description: Although motor actions can profoundly affect the perceptual interpretation of sensory inputs, it is not known whether the combination of sensory and movement signals occurs only for sensory surfaces undergoing movement or whether it is a more general phenomenon. In the haptic modality, the independent movement of multiple sensory surfaces poses a challenge to the nervous system when combining the tactile and kinesthetic signals into a coherent percept. When exploring a stationary object, the tactile and kinesthetic signals come from the same hand. Here we probe the internal structure of haptic combination by directing the two signal streams to separate hands: one hand moves but receives no tactile stimulation, while the other hand feels the consequences of the first hand's movement but remains still. We find that both discrete and continuous tactile and kinesthetic signals are combined as if they came from the same hand. This combination proceeds by direct coupling or transfer of the kinesthetic signal from the moving to the feeling hand, rather than assuming the displacement of a mediating object. The combination of signals is due to perception rather than inference, because a small temporal offset between the signals significantly degrades performance. These results suggest that the brain simplifies the complex coordinate transformation task of remapping sensory inputs to take into account the movements of multiple body parts in haptic perception, and they show that the effects of action are not limited to moving sensors.
Project description: Task-Irrelevant Perceptual Learning (TIPL) shows that the brain's discriminative capacity can also improve for invisible and unattended visual stimuli. It has been hypothesized that this form of "unconscious" neural plasticity is mediated by an endogenous reward mechanism triggered by correct task performance. Although this result has challenged the mandatory role of attention in perceptual learning, no direct evidence exists for the hypothesized link between target recognition, reward, and TIPL. Here, we manipulated the reward value associated with a target to demonstrate the involvement of reinforcement mechanisms in sensory plasticity for invisible inputs. Participants were trained in a central task associated with either high or low monetary incentives, provided only at the end of the experiment, while subliminal stimuli were presented peripherally. Our results showed that high incentive-value targets induced a greater degree of perceptual improvement for the subliminal stimuli, supporting the role of reinforcement mechanisms in TIPL.
Project description: The pupil is primarily regulated by prevailing light levels but is also modulated by perceptual and attentional factors. We measured pupil size in typical adult humans viewing a bistable rotating cylinder, constructed so that the luminance of the front surface changes with the perceived direction of rotation. In some participants, pupil diameter oscillated in phase with the ambiguous perception, more dilated when the black surface was in front. Importantly, the magnitude of oscillation predicted the autistic traits of participants, assessed by the Autism-Spectrum Quotient (AQ). Further experiments suggest that these results are driven by differences in perceptual styles: high-AQ participants focus on the front surface of the rotating cylinder, while those with low AQ distribute attention to both surfaces in a more global, holistic style. This is the first evidence that pupillometry reliably tracks inter-individual differences in perceptual styles; it does so quickly and objectively, without interfering with spontaneous perceptual strategies.
Project description: Perception is an inferential process, which becomes immediately evident when sensory information is conflicting or ambiguous and thus allows for more than one perceptual interpretation. Taking the idea of perception as inference to its logical conclusion blurs the boundary between perception and action selection, since perceptual inference implies that constructing a percept is an active process. Here we therefore asked whether perception shares a key characteristic of action selection, namely that it is shaped by reinforcement learning. In two behavioral experiments, we used binocular rivalry to examine whether perceptual inference can be influenced by the association of perceptual outcomes with reward or punishment, in analogy to instrumental conditioning. Binocular rivalry was evoked by two orthogonal grating stimuli presented to the two eyes, resulting in perceptual alternations between the two gratings. Perception was tracked indirectly and objectively through a target detection task, which allowed us to preclude potential reporting biases. Monetary rewards or punishments were given repeatedly during perception of only one of the two rivaling stimuli. We found an increase in dominance durations for the percept associated with reward, relative to the non-rewarded percept. In contrast, punishment led to a relative increase of the non-punished percept and a decrease of the punished one. Our results show that perception shares key characteristics with action selection, in that it is influenced by reward and punishment in opposite directions, thus narrowing the gap between the conceptually separated domains of perception and action selection. We conclude that perceptual inference is an adaptive process that is shaped by its consequences.