A gravity-based three-dimensional compass in the mouse brain.
ABSTRACT: Gravity sensing provides a robust verticality signal for three-dimensional navigation. Head direction cells in the mammalian limbic system implement an allocentric neuronal compass. Here we show that head direction cells in the rodent thalamus, retrosplenial cortex and cingulum fiber bundle are tuned to conjunctive combinations of azimuth and tilt, i.e., pitch or roll. Pitch and roll orientation tuning is anchored to gravity and independent of visual landmarks. When the head tilts, azimuth tuning is affixed to the head-horizontal plane, but also uses gravity to remain anchored to allocentric bearings in the earth-horizontal plane. Collectively, these results demonstrate that a three-dimensional, gravity-based neural compass is likely a ubiquitous property of mammalian species, including ground-dwelling animals.
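The gravity-anchored azimuth described above has a simple geometric reading: the head's forward vector can be projected onto the earth-horizontal plane (the plane orthogonal to gravity) to recover an allocentric bearing that is invariant to head tilt. A minimal sketch of that projection, assuming standard 3D vector conventions (function name and coordinate choices are illustrative, not from the paper):

```python
import numpy as np

def earth_horizontal_azimuth(forward, gravity):
    """Project the head's forward vector onto the plane orthogonal to
    gravity and return its allocentric bearing (radians, from +x)."""
    g = np.asarray(gravity, float)
    g = g / np.linalg.norm(g)
    f = np.asarray(forward, float)
    # Remove the component of the forward vector along gravity.
    f_h = f - np.dot(f, g) * g
    return np.arctan2(f_h[1], f_h[0])
```

With gravity along -z, a head facing +x but pitched up 45 degrees still yields azimuth 0: the earth-horizontal bearing is unchanged by the tilt.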
Project description:Head direction (HD) cells fire when the animal faces that cell's preferred firing direction (PFD) in the horizontal plane. The PFD response when the animal is oriented outside the earth-horizontal plane could reflect either a direction within the plane of locomotion or a three-dimensional (3D), globally referenced direction anchored to gravity. To investigate these possibilities, anterodorsal thalamic HD cells were recorded from restrained rats while they were passively positioned in various 3D orientations. Cell responses were unaffected by pitch or roll up to ~90° from the horizontal plane. Firing was disrupted once the animal was oriented >90° away from the horizontal plane and during inversion. When the animal was rolled around the earth-vertical axis, cells were active when its ventral surface faced the cell's PFD. However, with the rat rolled 90° in an ear-down orientation, pitching the rat and rotating it around the vertical axis did not produce directionally tuned responses. Complex movements involving combinations of yaw-roll, but usually not yaw-pitch, resulted in reduced directional tuning even at the final upright orientation, when the rat had a full visual view of its environment and was facing the cell's PFD. Directional firing was restored when the rat's head was moved back-and-forth. There was limited evidence that cells fired conjunctively with pitch or roll position. These findings suggest that the brain's representation of directional heading is derived primarily from horizontal canal information and that the HD signal is a 3D gravity-referenced signal anchored to a direction in the horizontal plane. NEW & NOTEWORTHY This study monitored head direction cell responses from rats in three dimensions using a series of manipulations that involved yaw, pitch, roll, or a combination of these rotations.
Results showed that head direction responses are consistent with the use of two reference frames simultaneously: one defined by the surrounding environment using primarily visual landmarks and a second defined by the earth's gravity vector.
Project description:Gravity may provide a ubiquitous allocentric reference to the brain's spatial orientation circuits. Here we describe neurons in the macaque anterior thalamus tuned to pitch and roll orientation relative to gravity, independently of visual landmarks. We show that individual cells exhibit two-dimensional tuning curves, with peak firing rates at a preferred vertical orientation. These results identify a thalamic pathway for gravity cues to influence perception, action and spatial cognition.
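The two-dimensional tuning curves described above, with a peak firing rate at a preferred vertical orientation, can be illustrated with a toy model. The Gaussian falloff and all parameter values below are illustrative assumptions, not fitted values from the recordings:

```python
import numpy as np

def tilt_tuning(pitch, roll, pref_pitch, pref_roll,
                peak=30.0, base=2.0, sigma=20.0):
    """Toy 2D tilt tuning curve (spikes/s): firing peaks at the preferred
    (pitch, roll) orientation and falls off with angular distance.
    Shape and parameters are illustrative, not from the data."""
    d2 = (pitch - pref_pitch) ** 2 + (roll - pref_roll) ** 2
    return base + (peak - base) * np.exp(-d2 / (2 * sigma ** 2))
```

At the preferred orientation the model returns the peak rate; far from it, firing decays toward the baseline, mimicking a cell tuned to a specific tilt relative to gravity.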
Project description:The three-dimensional vestibulo-ocular reflex (3D VOR) ideally generates compensatory ocular rotations not only with a magnitude equal and opposite to the head rotation but also about an axis that is collinear with the head rotation axis. Vestibulo-ocular responses only partially fulfill this ideal behavior. Because animal studies have shown that vestibular stimulation about particular axes may lead to suboptimal compensatory responses, we investigated in healthy subjects the peaks and troughs in 3D VOR stabilization in terms of gain and alignment of the 3D vestibulo-ocular response. Six healthy subjects, seated upright, underwent whole-body small-amplitude sinusoidal rotations and constant-acceleration transients delivered by a six-degree-of-freedom motion platform. Subjects were oscillated about the vertical axis and about axes in the horizontal plane varying between roll and pitch at increments of 22.5 degrees in azimuth. Transients were delivered in yaw, roll, and pitch and in the vertical canal planes. Eye movements were recorded with 3D search coils. Eye coil signals were converted to rotation vectors, from which we calculated gain and misalignment. During horizontal axis stimulation, systematic deviations were found. In the light, misalignment of the 3D VOR reached its maximum at about 45 degrees. These deviations in misalignment can be explained by vector summation of the eye rotation components, with a low gain for torsion and a high gain for vertical. In the dark and in response to transients, the gain of all components had lower values. Misalignment in darkness and for transients had different peaks and troughs than in the light: its minimum was during pitch axis stimulation and its maximum during roll axis stimulation. We show that the relatively large misalignment for roll in darkness is due to a horizontal eye movement component that is only present in darkness.
In combination with the relatively low torsion gain, this horizontal component has a relatively large effect on the alignment of the eye rotation axis with respect to the head rotation axis.
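The gain and misalignment measures used above can be computed directly from angular velocity vectors: gain as the ratio of eye to head angular speed, and misalignment as the angle between the eye rotation axis and the ideal, head-opposite axis. This is a simplified sketch following common 3D VOR conventions, not the authors' coil-signal pipeline:

```python
import numpy as np

def vor_gain_misalignment(head_vel, eye_vel):
    """Gain: ratio of eye to head angular speed. Misalignment: angle (deg)
    between the eye rotation axis and the ideal, head-opposite axis."""
    h = np.asarray(head_vel, float)
    e = np.asarray(eye_vel, float)
    gain = np.linalg.norm(e) / np.linalg.norm(h)
    # A perfectly compensatory eye rotation is about the -head axis.
    cosang = np.dot(e, -h) / (np.linalg.norm(e) * np.linalg.norm(h))
    mis = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return gain, mis
```

For example, a 100 deg/s yaw head rotation met by an 80 deg/s counter-rotation about the same axis gives gain 0.8 and zero misalignment; any torsional or vertical component added to the eye response rotates its axis away from ideal and shows up as misalignment.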
Project description:Head direction cells are critical for navigation because they convey information about which direction an animal is facing within an environment. To date, most studies on head direction encoding have been conducted on a horizontal two-dimensional (2D) plane, and little is known about how three-dimensional (3D) direction information is encoded in the brain despite humans and other animals living in a 3D world. Here, we investigated head direction encoding in the human brain while participants moved within a virtual 3D "spaceship" environment. Movement was not constrained to planes and instead participants could move along all three axes in volumetric space as if in zero gravity. Using functional magnetic resonance imaging (fMRI) multivoxel pattern similarity analysis, we found evidence that the thalamus, particularly the anterior portion, and the subiculum encoded the horizontal component of 3D head direction (azimuth). In contrast, the retrosplenial cortex was significantly more sensitive to the vertical direction (pitch) than to the azimuth. Our results also indicated that vertical direction information in the retrosplenial cortex was significantly correlated with behavioral performance during a direction judgment task. Our findings represent the first evidence showing that the "classic" head direction system that has been identified on a horizontal 2D plane also seems to encode vertical and horizontal heading in 3D space in the human brain.
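The core logic of the multivoxel pattern similarity analysis mentioned above can be sketched as follows: if voxel patterns are more correlated for trial pairs sharing a heading than for pairs with different headings, the region carries direction information. This is a minimal sketch of that comparison, not the published fMRI pipeline:

```python
import numpy as np

def direction_information(patterns, labels):
    """Mean pattern correlation for same-direction minus different-direction
    trial pairs; a positive value indicates direction information."""
    patterns = np.asarray(patterns, float)
    n = len(labels)
    corr = np.corrcoef(patterns)  # trial-by-trial pattern correlations
    same, diff = [], []
    for i in range(n):
        for j in range(i + 1, n):
            (same if labels[i] == labels[j] else diff).append(corr[i, j])
    return np.mean(same) - np.mean(diff)
```

Applied separately to azimuth and pitch labels, this kind of contrast can show a region (e.g., anterior thalamus) carrying more azimuth than pitch information, or the reverse (e.g., retrosplenial cortex).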
Project description:Both eye position and head orientation are influenced by the macular (otolith) organs, via the tilt maculo-ocular reflex (tiltMOR) and the vestibulo-collic reflexes, respectively. The mechanisms that control head position also influence the rest position of the eye because head orientation influences eye position through the tiltMOR. Despite the increasing popularity of mice for studies of vestibular and ocular motor functions, relatively little is known in this species about tiltMOR, spontaneous orientation of the head, and their interrelationship. We used 2D video oculography to determine in C57BL/6 mice the absolute horizontal and vertical positions of the eyes over body orientations spanning 360 degrees about the pitch and roll axes. We also determined head pitch during ambulation in the same animals. Eye elevation varied approximately sinusoidally as a function of pitch or roll angle. Over the central +/-30 degrees of pitch, sensitivity and gain in the light were 31.7 degrees/g and 0.53, respectively. The corresponding values for roll were 31.5 degrees/g and 0.52. Absolute positions adopted in light and darkness differed only slightly. During ambulation, mice carried the lambda-bregma plane at a downward pitch of 29 degrees, corresponding to a horizontal eye position of 64 degrees and a vertical eye position of 22 degrees. The vertical position is near the center of the range of eye movements produced by the pitch tiltMOR. The results indicate that the tiltMOR is robust in this species and favor standardizing pitch orientation across laboratories. The robust tiltMOR also has significant methodological implications for the practice of pupil-tracking video oculography in this species.
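The two units above are related: sensitivity is expressed per g of gravitational shear (which varies as the sine of tilt angle), while gain is dimensionless (degrees of eye per degree of head). Assuming eye position tracks sin(tilt), the conversion over a +/-30 degree range is a one-liner, and it reproduces the reported pairing of 31.7 degrees/g with a gain of about 0.53:

```python
import numpy as np

def gain_from_sensitivity(sensitivity_deg_per_g, tilt_deg=30.0):
    """Convert tiltMOR sensitivity (deg eye per g of shear) to dimensionless
    gain (deg eye / deg head) over +/- tilt_deg, assuming eye position
    tracks sin(tilt): gain = S * sin(tilt) / tilt_in_deg."""
    return sensitivity_deg_per_g * np.sin(np.radians(tilt_deg)) / tilt_deg
```

`gain_from_sensitivity(31.7)` gives about 0.528, matching the reported gain of 0.53 for pitch; the roll values (31.5 degrees/g, 0.52) are consistent in the same way.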
Project description:A key function of the brain is to provide a stable representation of an object's location in the world. In hearing, sound azimuth and elevation are encoded by neurons throughout the auditory system, and auditory cortex is necessary for sound localization. However, the coordinate frame in which neurons represent sound space remains undefined: classical spatial receptive fields in head-fixed subjects can be explained either by sensitivity to sound source location relative to the head (egocentric) or relative to the world (allocentric encoding). This coordinate frame ambiguity can be resolved by studying freely moving subjects; here we recorded spatial receptive fields in the auditory cortex of freely moving ferrets. We found that most spatially tuned neurons represented sound source location relative to the head across changes in head position and direction. We also recorded a small number of neurons in which sound location was represented in a world-centered coordinate frame. We used measurements of spatial tuning across changes in head position and direction to explore the influence of sound source distance and speed of head movement on auditory cortical activity and spatial tuning. Modulation depth of spatial tuning increased with distance for egocentric but not allocentric units, whereas, for both populations, modulation was stronger at faster movement speeds. Our findings suggest that early auditory cortex primarily represents sound source location relative to ourselves but that a minority of cells can represent sound location in the world independent of our own position.
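The egocentric/allocentric distinction above reduces to a coordinate transform: an egocentric unit's tuning is stable in head-relative azimuth, which can be computed from world coordinates given the head's position and heading. A minimal 2D sketch of that transform (sign and angle conventions are illustrative assumptions):

```python
import numpy as np

def egocentric_azimuth(sound_xy, head_xy, head_dir_deg):
    """Sound azimuth relative to the head (deg, 0 = straight ahead),
    from world coordinates and head pose in the horizontal plane."""
    dx, dy = np.subtract(sound_xy, head_xy)
    world_bearing = np.degrees(np.arctan2(dy, dx))  # allocentric bearing
    # Subtract heading and wrap into (-180, 180].
    return (world_bearing - head_dir_deg + 180.0) % 360.0 - 180.0
```

A head-centered (egocentric) unit keeps its tuning fixed in this output as the animal turns, whereas a world-centered (allocentric) unit keeps its tuning fixed in `world_bearing` instead.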
Project description:Head-direction cells preferentially discharge when the head points in a particular azimuthal direction, are hypothesized to collectively function as a single neural system for a unitary direction sense, and are believed to be essential for navigating extra-personal space by functioning like a compass. We tested these ideas by recording medial entorhinal cortex (MEC) head-direction cells while rats navigated on a familiar, continuously rotating disk that dissociates the environment into two spatial frames: one stationary and one rotating. Head-direction cells degraded directional tuning referenced to either of the externally referenced spatial frames, but firing rates, sub-second cell-pair action potential discharge relationships, and internally referenced directional tuning were preserved. MEC head-direction cell ensemble discharge collectively generates a subjective, internally referenced unitary representation of direction that, unlike a compass, is inconsistently registered to external landmarks during navigation. These findings indicate that MEC-based directional information is subjectively anchored, potentially providing for navigation without a stable externally anchored direction sense.
Project description:Properly constructed stereoscopic images are aligned vertically on the display screen, so on-screen binocular disparities are strictly horizontal. If the viewer's inter-ocular axis is also horizontal, he/she makes horizontal vergence eye movements to fuse the stereoscopic image. However, if the viewer's head is rolled to the side, the on-screen disparities now have horizontal and vertical components at the eyes. Thus, the viewer must make horizontal and vertical vergence movements to binocularly fuse the two images. Vertical vergence movements occur naturally, but they are usually quite small. Much larger movements are required when viewing stereoscopic images with the head rotated to the side. We asked whether the vertical vergence eye movements required to fuse stereoscopic images when the head is rolled cause visual discomfort. We also asked whether the ability to see stereoscopic depth is compromised with head roll. To answer these questions, we conducted behavioral experiments in which we simulated head roll by rotating the stereo display clockwise or counter-clockwise while the viewer's head remained upright relative to gravity. While viewing the stimulus, subjects performed a psychophysical task. Visual discomfort increased significantly with the amount of stimulus roll and with the magnitude of on-screen horizontal disparity. The ability to perceive stereoscopic depth also declined with increasing roll and on-screen disparity. The magnitude of both effects was proportional to the magnitude of the induced vertical disparity. We conclude that head roll is a significant cause of viewer discomfort and that it also adversely affects the perception of depth from stereoscopic displays.
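The induced vertical disparity above follows from simple trigonometry: rolling the inter-ocular axis by an angle rotates the on-screen horizontal disparity into horizontal and vertical components at the eyes, proportional to the cosine and sine of the roll. A minimal sketch of that decomposition:

```python
import numpy as np

def disparity_at_eyes(horizontal_disparity, head_roll_deg):
    """Decompose an on-screen horizontal disparity into the horizontal and
    vertical disparity components seen by eyes rolled by head_roll_deg."""
    th = np.radians(head_roll_deg)
    return (horizontal_disparity * np.cos(th),   # fused by horizontal vergence
            horizontal_disparity * np.sin(th))   # requires vertical vergence
```

The sine term is why both discomfort and the loss of stereoscopic depth scaled with roll angle and with on-screen horizontal disparity: the induced vertical disparity is their product (to first order).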
Project description:This study explored the mechanical factors that determine the accuracy of baseball pitching. In particular, we focused on the mechanical parameters at ball release, referred to as release parameters. The aim was to understand which parameter has the most deterministic influence on pitch location by measuring the release parameters during actual pitching and developing a simulation that predicts the pitch location from given release parameters. By comparing the fluctuation of the simulated pitch location when varying each release parameter, it was found that the elevation pitching angle and speed significantly influenced the vertical pitch location, and the azimuth pitching angle significantly influenced the horizontal pitch location. Moreover, a regression model was obtained to predict the pitch location, and it became clear that the significant predictors for the vertical pitch location were the elevation pitching angle, the speed, and the spin axis, and those for the horizontal pitch location were the azimuth pitching angle, the spin axis, and the horizontal release point. Therefore, it was suggested that the parameter most affecting pitch location was the pitching angle. On the other hand, multiple regression analyses revealed that the relations between release parameters varied between pitchers. The result is expected to contribute to an understanding of the mechanisms underlying accurate ball control skill in baseball pitching.
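A minimal ballistic sketch illustrates why the release angles dominate: small changes in elevation and azimuth angle tilt the entire flight path, while other parameters act more weakly. This is a drag- and spin-free model (a simplifying assumption; the study's simulation and the Magnus effect of spin are not reproduced here), with hypothetical default values for release height and plate distance:

```python
import numpy as np

def pitch_location(speed, elev_deg, azim_deg,
                   release_height=1.8, plate_dist=18.44, g=9.81):
    """Predict (horizontal, vertical) location at the plate (meters) from
    release speed (m/s) and elevation/azimuth release angles (deg),
    using a drag- and spin-free ballistic model."""
    elev, azim = np.radians(elev_deg), np.radians(azim_deg)
    vx = speed * np.cos(elev) * np.cos(azim)   # toward the plate
    vy = speed * np.cos(elev) * np.sin(azim)   # sideways (horizontal)
    vz = speed * np.sin(elev)                  # vertical
    t = plate_dist / vx                        # flight time to the plate
    return vy * t, release_height + vz * t - 0.5 * g * t ** 2
```

Even in this toy model, perturbing the elevation angle shifts the vertical location and perturbing the azimuth angle shifts the horizontal location, mirroring the sensitivity pattern reported above.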
Project description:What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space.