Language regions of brain are operative in color perception.
ABSTRACT: The effect of language on the categorical perception of color is stronger for stimuli in the right visual field (RVF) than in the left visual field, but the neural correlates of the behavioral RVF advantage are unknown. Here we present brain activation maps revealing how language is differentially engaged in the discrimination of colored stimuli presented in either visual hemifield. In a rapid, event-related functional MRI study, we measured subjects' brain activity while they performed a visual search task. Compared with colors from the same lexical category, discrimination of colors from different linguistic categories provoked stronger and faster responses in the left hemisphere language regions, particularly when the colors were presented in the RVF. In addition, activation of visual areas 2/3, responsible for color perception, was much stronger for RVF stimuli from different linguistic categories than for stimuli from the same linguistic category. Notably, the enhanced activity of visual areas 2/3 coincided with the enhanced activity of the left posterior temporoparietal language region, suggesting that this language region may serve as a top-down control source that modulates the activation of the visual cortex. These findings shed light on the brain mechanisms that underlie the hemifield-dependent effect of language on visual perception.
Project description: Grapheme-color synesthesia, the idiosyncratic, arbitrary association of colors to letters or numbers, develops in childhood once reading is mastered. Because language processing is strongly left-lateralized in most individuals, we hypothesized that grapheme-color synesthesia could be left-lateralized as well. We used synesthetic versions of the Stroop test with colored letters and numbers presented either in the right or the left visual field of thirty-four synesthetes. Interference by synesthetic colors was stronger for stimuli in the right hemifield (first experiment, color naming task). Synesthetes were also faster in the right hemifield when naming the synesthetic color of graphemes (second experiment). Overall, the lateralization effect was 7 ms (the 95% confidence interval was [1.5, 12] ms), a delay compatible with an additional callosal transfer for stimuli presented in the left hemifield. Though weak, this effect suggests that the association of synesthetic colors to graphemes may be preferentially processed in the left hemisphere. We speculate that this left-lateralization could be a landmark of synesthetic grapheme-color associations, if not found for color associations learnt by non-synesthete adults.
Project description: Well over half a century ago, Benjamin Lee Whorf [Carroll JB (1956) Language, Thought, and Reality: Selected Writings of Benjamin Lee Whorf (MIT Press, Cambridge, MA)] proposed that language affects perception and thought and is used to segment nature, a hypothesis that has since been tested by linguistic and behavioral studies. Although clear Whorfian effects have been found, it has not yet been demonstrated that language influences brain activity associated with perception and/or immediate postperceptual processes (referred to hereafter as "perceptual decision"). Here, by using functional magnetic resonance imaging, we show that brain regions mediating language processes participate in neural networks activated by perceptual decision. When subjects performed a perceptual discrimination task on easy-to-name and hard-to-name colored squares, largely overlapping cortical regions were identified, which included areas of the occipital cortex critical for color vision and regions in the bilateral frontal gyrus. Crucially, however, in comparison with hard-to-name colored squares, perceptual discrimination of easy-to-name colors evoked stronger activation in the left posterior superior temporal gyrus and inferior parietal lobule, two regions responsible for word-finding processes, as demonstrated by a localizer experiment that used an explicit color patch naming task. This finding suggests that the language-processing areas of the brain are directly involved in visual perceptual decision, thus providing neuroimaging support for the Whorf hypothesis.
Project description: Previous studies have shown that language can modulate visual perception, by biasing and/or enhancing perceptual performance. However, it is still debated where in the brain visual and linguistic information are integrated, and whether the effects of language on perception are automatic and persist even in the absence of awareness of the linguistic material. Here, we aimed to explore the automaticity of language-perception interactions and the neural loci of these interactions in an fMRI study. Participants engaged in a visual motion discrimination task (upward or downward moving dots). Before each trial, a word prime was briefly presented that implied upward or downward motion (e.g., "rise", "fall"). These word primes strongly influenced behavior: congruent motion words sped up reaction times and improved performance relative to incongruent motion words. Neural congruency effects were only observed in the left middle temporal gyrus, showing higher activity for congruent compared to incongruent conditions. This suggests that higher-level conceptual areas rather than sensory areas are the locus of language-perception interactions. When motion words were masked so that participants remained unaware of them, they still affected visual motion perception, suggesting that language-perception interactions may rely on automatic feed-forward integration of perceptual and semantic material in language areas of the brain.
Project description: The human brain integrates hemifield-split visual information via interhemispheric transfer. The degree to which neural circuits involved in this process behave differently during word recognition as compared to object recognition is not known. Evidence from neuroimaging (fMRI) suggests that interhemispheric transfer during word viewing converges in the left hemisphere, in two distinct brain areas, an "occipital word form area" (OWFA) and a more anterior occipitotemporal "visual word form area" (VWFA). We used a novel fMRI half-field repetition technique to test whether or not these areas also integrate nonverbal hemifield-split string stimuli of similar visual complexity. We found that the fMRI responses of both the OWFA and VWFA while viewing nonverbal stimuli were strikingly different from those measured during word viewing, especially with respect to half-stimulus changes restricted to a single hemifield. We conclude that normal reading relies on left-lateralized neural mechanisms, which integrate hemifield-split visual information for words but not for nonverbal stimuli.
Project description: Verbal stimuli often induce right-hemispheric activation in patients with aphasia after left-hemispheric stroke. This right-hemispheric activation is commonly attributed to functional reorganization within the language system. Yet previous evidence suggests that functional activation in right-hemispheric homologues of classic left-hemispheric language areas may partly be due to processing nonlinguistic perceptual features of verbal stimuli. We used functional MRI (fMRI) to clarify the role of the right hemisphere in the perception of nonlinguistic word features in healthy individuals. Participants made perceptual, semantic, or phonological decisions on the same set of auditorily and visually presented word stimuli. Perceptual decisions required judgements about stimulus-inherent changes in font size (visual modality) or fundamental frequency contour (auditory modality). The semantic judgement required subjects to decide whether a stimulus was natural or man-made; the phonological decision required a decision on whether a stimulus contained two or three syllables. Compared to phonological or semantic decisions, nonlinguistic perceptual decisions resulted in stronger right-hemispheric activation. Specifically, the right inferior frontal gyrus (IFG), an area previously suggested to support language recovery after left-hemispheric stroke, displayed modality-independent activation during perceptual processing of word stimuli. Our findings indicate that activation of the right hemisphere during language tasks may, in some instances, be driven by a "nonlinguistic perceptual processing" mode that focuses on nonlinguistic word features. This raises the possibility that stronger activation of right inferior frontal areas during language tasks in aphasic patients with left-hemispheric stroke may at least partially reflect increased attentional focus on nonlinguistic perceptual aspects of language.
Project description: Patients with striate cortex damage and clinical blindness retain the ability to process certain visual properties of stimuli that they are not aware of seeing. Here we investigated the neural correlates of residual visual perception for dynamic whole-body emotional actions. Angry and neutral emotional whole-body actions were presented in the intact and blind visual hemifield of a cortically blind patient with unilateral destruction of striate cortex. Comparisons of angry vs. neutral actions performed separately in the blind and intact visual hemifield showed in both cases increased activation in primary somatosensory, motor, and premotor cortices. Activations selective for intact hemifield presentation of angry compared with neutral actions were located subcortically in the right lateral geniculate nucleus and cortically in the superior temporal sulcus, prefrontal cortex, precuneus, and intraparietal sulcus. Activations specific for blind hemifield presentation of angry compared with neutral actions were found in the bilateral superior colliculus, pulvinar nucleus of the thalamus, amygdala, and right fusiform gyrus. Direct comparison of emotional modulation in the blind vs. intact visual hemifield revealed selective activity in the right superior colliculus and bilateral pulvinar for angry expressions, thereby showing a selective involvement of these subcortical structures in nonconscious visual emotion perception.
Project description: Recent findings have re-examined the linguistic influence on cognition and perception, while identifying evidence that supports the Whorfian hypothesis. We examine how English and Japanese speakers perceive similarity of pairs of objects, by using two sets of stimuli: one in which two distinct linguistic categories apply to respective object images in English, but only one linguistic category applies in Japanese; and another in which two distinct linguistic categories apply to respective object images in Japanese, but only one applies in English. We conducted four studies and tested different groups of participants in each of them. In Study 1, we asked participants to name the two objects before engaging in the similarity judgment task. Here, we expected a strong linguistic effect. In Study 2, we asked participants to engage in the same task without naming, where we assumed that the condition is close enough to our daily visual information processing, in which language is not necessarily prompted. We further explored whether language still influences similarity perception by asking participants to engage in the same task based on visual similarity (Study 3) and functional similarity (Study 4). The results overall indicated that English and Japanese speakers perceived the two objects to be more similar when they were in the same linguistic categories than when they were in different linguistic categories in their respective languages. Implications for research testing the Whorfian hypothesis and the requirement for methodological development beyond behavioral measures are discussed.
Project description: BACKGROUND: Visual neglect is an attentional deficit typically resulting from parietal cortex lesion and sometimes frontal lesion. Patients fail to attend to objects and events in the visual hemifield contralateral to their lesion during visual search. METHODOLOGY/PRINCIPAL FINDINGS: The aim of this work was to examine the effects of parietal and frontal lesion in an existing computational model of visual attention and search and simulate visual search behaviour under lesion conditions. We find that unilateral parietal lesion in this model leads to symptoms of visual neglect in simulated search scan paths, including an inhibition of return (IOR) deficit, while frontal lesion leads to milder neglect and to more severe deficits in IOR and perseveration in the scan path. During simulations of search under unilateral parietal lesion, the model's extrastriate ventral stream area exhibits lower activity for stimuli in the neglected hemifield compared to that for stimuli in the normally perceived hemifield. This could represent a computational correlate of differences observed in neuroimaging for unconscious versus conscious perception following parietal lesion. CONCLUSIONS/SIGNIFICANCE: Our results lead to the prediction, supported by effective connectivity evidence, that connections between the dorsal and ventral visual streams may be an important factor in the explanation of perceptual deficits in parietal lesion patients and of conscious perception in general.
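The general idea behind lesion simulations of this kind can be illustrated with a minimal sketch (not the authors' model): visual search is driven by a saliency map, a unilateral parietal lesion is approximated as an attenuation of attention signals for the contralesional hemifield, and inhibition of return (IOR) tags previously visited locations. All names and parameters here are hypothetical, chosen only to show the mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_search(lesion_gain=1.0, n_fixations=20, size=10, ior_strength=5.0):
    """Return a scan path over a size x size random saliency map.

    lesion_gain < 1 attenuates the left hemifield (columns < size//2),
    a toy stand-in for right-parietal damage; setting ior_strength = 0
    would remove IOR and produce perseverative revisits instead.
    """
    saliency = rng.random((size, size))
    gain = np.ones((size, size))
    gain[:, : size // 2] *= lesion_gain      # contralesional attenuation
    inhibition = np.zeros((size, size))
    path = []
    for _ in range(n_fixations):
        priority = saliency * gain - inhibition
        y, x = np.unravel_index(np.argmax(priority), priority.shape)
        path.append((y, x))
        inhibition[y, x] += ior_strength     # IOR tag on the visited location
    return path

def left_field_fraction(path, size=10):
    """Fraction of fixations landing in the left (lesioned) hemifield."""
    return sum(1 for _, x in path if x < size // 2) / len(path)

healthy = left_field_fraction(simulate_search(lesion_gain=1.0))
lesioned = left_field_fraction(simulate_search(lesion_gain=0.3))
print(healthy, lesioned)  # the lesioned fraction should be markedly lower
```

Even this crude sketch reproduces the neglect signature described above: attenuating one hemifield's attention gain biases the entire scan path toward the intact side, without any change to the underlying saliency of the neglected stimuli.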
Project description: Since perceptual and neural face sensitivity is associated with a foveal bias, and neural place sensitivity is associated with a peripheral bias (integration over space), we hypothesized that face perception ability will decline more with eccentricity than place perception ability. We also wanted to examine whether face perception ability would show a left visual field (LeVF) bias due to earlier reports suggesting right hemisphere dominance for faces, or would show an upper or lower visual field bias. Participants performed foveal and parafoveal face and house discrimination tasks for upright or inverted stimuli (±4°) while their eye movements were monitored. Low-level visual tasks were also measured. The eccentricity-related accuracy reductions were evident for all categories. Through detailed analyses we found (i) a robust face inversion effect across the parafovea, while for houses an opposite effect was found, (ii) higher eccentricity-related sensitivity for face performance than for house performance (via inverted vs. upright within-category eccentricity-driven reductions), (iii) within-category but not across-category performance associations across eccentricities, and (iv) no hemifield biases. Our central to parafoveal investigations suggest that high-level vision processing may be reflected in behavioural performance.
Project description: Unilateral damage to the primary visual cortex (V1) leads to clinical blindness in the opposite visual hemifield, yet nonconscious ability to transform unseen visual input into motor output can be retained, a condition known as "blindsight." Here we combined psychophysics, functional magnetic resonance imaging, and tractography to investigate the functional and structural properties that enable the developing brain to partly overcome the effects of early V1 lesion in one blindsight patient. Visual stimuli appeared in either the intact or blind hemifield and simple responses were given with either the left or right hand, thereby creating conditions where visual input and motor output involve the same or opposite hemisphere. When the V1-damaged hemisphere was challenged by incoming visual stimuli, or controlled manual responses to these unseen stimuli, the corpus callosum (CC) dynamically recruited areas in the visual dorsal stream and premotor cortex of the intact hemisphere to compensate for altered visuomotor functions. These compensatory changes in functional brain activity were paralleled by increased connections in posterior regions of the CC, where fibers connecting homologous areas of the parietal cortex course.