Project description:As one of the most important senses in human beings, touch can also help robots better perceive and adapt to complex environmental information, improving their autonomous decision-making and execution capabilities. Unlike other perception methods, tactile perception must handle multiple tactile signal channels simultaneously, such as pressure, bending, temperature, and humidity. However, directly transferring deep learning algorithms that work well on temporal signals to tactile signal tasks does not effectively utilize the physical spatial connectivity information of tactile sensors. In this paper, we propose a tactile perception framework based on graph attention networks, which incorporates explicit and latent relation graphs. This framework can effectively utilize the structural information between different tactile signal channels. We constructed a tactile glove and collected a dataset of pressure and bending tactile signals while grasping and holding objects, and our method achieved 89.58% accuracy in object tactile signal classification. Compared to existing time-series signal classification algorithms, our graph-based tactile perception algorithm can better utilize and learn sensor spatial information, making it more suitable for processing multi-channel tactile data. Our method can serve as a general strategy to improve a robot's tactile perception capabilities.
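The core idea of attending over a sensor-layout graph can be illustrated in a few lines. Below is a minimal numpy sketch (not the authors' implementation): the adjacency matrix `A` stands in for the glove's physical sensor connectivity, and all shapes, weights, and names are illustrative assumptions.

```python
import numpy as np

def gat_layer(X, A, W, a_src, a_dst):
    """One graph-attention pass over sensor channels (numpy sketch).

    X: (n, d) per-channel features; A: (n, n) sensor adjacency;
    W: (d, h) projection; a_src, a_dst: (h,) attention vectors.
    """
    H = X @ W                                             # projected features
    # Additive attention logits e_ij = LeakyReLU(a_src.h_i + a_dst.h_j)
    logits = (H @ a_src)[:, None] + (H @ a_dst)[None, :]
    logits = np.where(logits > 0, logits, 0.2 * logits)   # LeakyReLU
    logits = np.where(A > 0, logits, -1e9)                # mask non-neighbors
    alpha = np.exp(logits - logits.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)      # row-wise softmax
    return alpha @ H                                      # weighted aggregation

rng = np.random.default_rng(0)
n, d, h = 5, 8, 4                         # 5 hypothetical taxels, 8 features
A = np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
X = rng.normal(size=(n, d))
out = gat_layer(X, A, rng.normal(size=(d, h)),
                rng.normal(size=h), rng.normal(size=h))
print(out.shape)                          # (5, 4)
```

In practice a library such as PyTorch Geometric would supply this layer; the sketch only shows how the sensor adjacency restricts which channels attend to which.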
Project description:Fine surface texture is best discriminated by touch, in contrast to macro geometric features like shape. We used functional magnetic resonance imaging and a delayed match-to-sample task to investigate the neural substrate for working memory of tactile surface texture. Blindfolded right-handed males encoded the texture or location of up to four sandpaper stimuli using the dominant or non-dominant hand. They maintained the information for 10-12 s and then answered whether a probe stimulus matched the memory array. Analyses of variance with the factors Hand, Task, and Load were performed on the estimated percent signal change for the encoding and delay phase. During encoding, contralateral effects of Hand were found in sensorimotor regions, whereas Load effects were observed in bilateral postcentral sulcus (BA2), secondary somatosensory cortex (S2), pre-SMA, dorsolateral prefrontal cortex (dlPFC), and superior parietal lobule (SPL). During encoding and delay, Task effects (texture > location) were found in central sulcus, S2, pre-SMA, dlPFC, and SPL. The Task and Load effects found in hand- and modality-specific regions BA2 and S2 indicate involvement of these regions in the tactile encoding and maintenance of fine surface textures. Similar effects in hand- and modality-unspecific areas dlPFC, pre-SMA and SPL suggest that these regions contribute to the cognitive monitoring required to encode and maintain multiple items. Our findings stress both the particular importance of S2 for the encoding and maintenance of tactile surface texture, as well as the supramodal nature of parieto-frontal networks involved in cognitive control.
Project description:Motion is an essential component of everyday tactile experience: most manual interactions involve relative movement between the skin and objects. Much of the research on the neural basis of tactile motion perception has focused on how direction is encoded, but less is known about how speed is encoded. Perceived speed has been shown to depend on surface texture, but previous studies used only coarse textures, which span a restricted range of tangible spatial scales and provide a limited window into tactile coding. To fill this gap, we measured the ability of human observers to report the speed of natural textures, which span the range of tactile experience and engage all the known mechanisms of texture coding, as they were scanned across the skin. In parallel experiments, we recorded the responses of single units in the nerve and in the somatosensory cortex of primates to the same textures scanned at different speeds. We found that the perception of speed is heavily influenced by texture: some textures are systematically perceived as moving faster than others, and some textures provide a more informative signal about speed than others. Similarly, the responses of neurons in the nerve and in cortex are strongly dependent on texture. In the nerve, although all fibers exhibit speed-dependent responses, the responses of Pacinian corpuscle-associated (PC) fibers are most strongly modulated by speed and can best account for human judgments. In cortex, approximately half of the neurons exhibit speed-dependent responses, and this subpopulation receives strong input from PC fibers. However, speed judgments seem to reflect an integration of speed-dependent and speed-independent responses such that the latter partially compensate for the strong texture dependence of the former.
Project description:This dataset represents the 3D road surface texture of a 27 km-long route at a resolution of approximately 0.1 m in both the longitudinal and transversal directions. The primary purpose of the dataset is to test numerical programs that require road surface topographies as input. The dataset is composed of 2658 text files, each representing a section of the route. The data was collected with the Harris2, a vehicle operated by TRL (Transport Research Laboratory, UK) equipped with a LiDAR (Light Detection and Ranging) unit and a PPS (Road-Profile System). The files are presented on a regular grid achieved by merging the LiDAR and PPS data into 3D coordinates.
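Loading one such section file into a coordinate array is straightforward. The sketch below assumes whitespace-separated x, y, z columns on the regular ~0.1 m grid; the actual column layout of the files may differ, so this is an illustrative assumption, not the dataset's documented format.

```python
import io
import numpy as np

# Stand-in for the contents of one hypothetical section file:
# three whitespace-separated columns (x, y, elevation z), in meters.
sample = io.StringIO(
    "0.0 0.0 0.012\n"
    "0.1 0.0 0.010\n"
    "0.0 0.1 0.013\n"
    "0.1 0.1 0.011\n"
)

xyz = np.loadtxt(sample)        # -> (n_points, 3) array of 3D coordinates
print(xyz.shape)                # (4, 3)
```

From here, the regular grid lets the points be reshaped into a 2D elevation map for programs that expect gridded topographies.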
Project description:Multifunctional flexible tactile sensors could be useful to improve the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMS) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMS. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four different machine learning algorithms were compared for their classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). Time-frequency features of the LMS signals were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger and achieved 99.2 ± 0.8% accuracy in distinguishing between ten different multi-textured surfaces using four LMSs from four fingers simultaneously. This capability for hierarchical multi-finger tactile sensation integration could be useful to provide a higher level of intelligence for artificial hands.
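The general shape of such a pipeline (time-frequency features, then a classifier such as KNN) can be sketched compactly. The band choices, sampling rate, and toy signals below are illustrative assumptions, not the study's actual feature set.

```python
import numpy as np

def band_energies(signal, fs, bands=((0, 25), (25, 75), (75, 150))):
    """Crude time-frequency feature: spectral energy in a few bands."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-nearest-neighbor vote in feature space."""
    dists = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return np.bincount(votes).argmax()

fs = 500.0
t = np.arange(0, 0.5, 1 / fs)
rough = np.sin(2 * np.pi * 120 * t)      # high-frequency "rough" texture
smooth = np.sin(2 * np.pi * 10 * t)      # low-frequency "smooth" texture
X = np.stack([band_energies(s, fs) for s in (rough, smooth, rough, smooth)])
y = np.array([0, 1, 0, 1])
print(knn_predict(X, y, band_energies(rough, fs), k=3))   # -> 0 ("rough")
```

For hierarchical multi-finger integration, the per-finger feature vectors would simply be concatenated before classification.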
Project description:The sense of touch plays a fundamental role in enabling us to interact with our surrounding environment. Indeed, the presence of tactile feedback in prostheses greatly assists amputees in performing daily tasks. Along these lines, the present study proposes an integration of artificial tactile and proprioceptive receptors for texture discrimination under varying scanning speeds. We fabricated a soft biomimetic fingertip comprising an 8 × 8 tactile sensor array and a piezoelectric sensor to mimic the Merkel, Meissner, and Pacinian mechanoreceptors of glabrous skin. A hydro-elastomer sensor was fabricated as an artificial proprioception sensor (muscle spindle) to assess the instantaneous speed of the biomimetic fingertip. In this study, we investigated the concept of the complex receptive field of RA-I and SA-I afferents for naturalistic textures. Next, to evaluate the synergy between the mechanoreceptor and muscle-spindle afferents, ten naturalistic textures were explored by the soft biomimetic fingertip at six different speeds. The sensors' outputs were converted into neuromorphic spike trains to mimic the firing patterns of biological mechanoreceptors. These spike responses were then analyzed using machine learning classifiers and neural coding paradigms to explore multi-sensory integration in real experiments. This synergy between muscle-spindle and mechanoreceptor afferents in the proposed neuromorphic system yields a generalized texture discrimination scheme that is, notably, independent of scanning speed.
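One common way to convert an analog sensor trace into a neuromorphic spike train is delta (send-on-threshold) encoding. The sketch below is a generic illustration of that scheme, not the study's encoder; the threshold and toy pressure trace are made up.

```python
import numpy as np

def delta_encode(signal, threshold=0.1):
    """Emit +1 (ON) / -1 (OFF) spikes whenever the signal moves
    more than `threshold` away from the last encoded level."""
    spikes = np.zeros_like(signal, dtype=int)
    ref = signal[0]
    for i, v in enumerate(signal[1:], start=1):
        while v - ref > threshold:
            spikes[i] += 1          # ON spike: signal rose by a step
            ref += threshold
        while ref - v > threshold:
            spikes[i] -= 1          # OFF spike: signal fell by a step
            ref -= threshold
    return spikes

t = np.linspace(0, 1, 200)
trace = np.sin(2 * np.pi * 2 * t)   # toy pressure trace, two full cycles
spikes = delta_encode(trace, threshold=0.2)
print(np.count_nonzero(spikes))     # number of samples carrying spikes
```

ON and OFF spike channels here loosely parallel RA-I responses to increasing and decreasing skin deformation; a classifier then operates on the resulting spike patterns.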
Project description:Autonomous dexterous manipulation relies on the ability to recognize an object and detect its slippage. Dynamic tactile signals are important for both object recognition and slip detection. An object can be identified from the signals generated at contact points during tactile interaction, and vibrotactile sensors can increase the accuracy of texture recognition and preempt the slippage of a grasped object. In this work, we present a Deep Learning (DL) based method for simultaneous texture recognition and slip detection. The method detects slip and non-slip events, estimates the velocity, and discriminates textures, all within 17 ms. We evaluate the method on three objects grasped using an industrial gripper with accelerometers installed on its fingertips. A comparative analysis of convolutional neural networks (CNNs), feed-forward neural networks, and long short-term memory networks confirmed that deep CNNs have higher generalization accuracy. We also evaluated the performance of the highest-accuracy method for different signal bandwidths, which showed that a bandwidth of 125 Hz is sufficient to classify textures with 80% accuracy.
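Any such low-latency classifier needs a front end that slices the continuous accelerometer stream into fixed-length windows. The sketch below assumes a 2 kHz sampling rate and a 17 ms window; both numbers are illustrative assumptions, not the paper's stated configuration.

```python
import numpy as np

def frame_signal(signal, win, hop):
    """Split a 1-D stream into overlapping windows of length `win`
    using fancy indexing (no copy loop)."""
    n = 1 + (len(signal) - win) // hop
    idx = np.arange(win)[None, :] + hop * np.arange(n)[:, None]
    return signal[idx]

fs = 2000                        # assumed sampling rate, Hz
win = int(0.017 * fs)            # 17 ms window -> 34 samples
stream = np.random.default_rng(1).normal(size=fs)   # 1 s of toy data
frames = frame_signal(stream, win, hop=win // 2)    # 50% overlap
print(frames.shape)              # (n_windows, 34)
```

Each row of `frames` would then be fed to the network (e.g. a 1-D CNN), so a prediction can be refreshed every half window.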
Project description:As robots increasingly participate in our daily lives, the quest to mimic human abilities has driven advances in robotic multimodal sensing. However, current perceptual technologies still fall short of robotic needs for home tasks and environments, particularly facing great challenges in multisensory integration and fusion, rapid response capability, and highly sensitive perception. Here, we report a flexible tactile sensor utilizing thin-film thermistors to implement multimodal perception of pressure, temperature, matter thermal property, texture, and slippage. Notably, the tactile sensor is endowed with ultrasensitive (0.05 mm/s) and ultrafast (4 ms) slip sensing, which is indispensable for dexterous and reliable grasping control that avoids crushing fragile objects or dropping slippery ones. We further propose and develop a robotic tactile-visual fusion architecture that seamlessly encompasses multimodal sensation at the bottom level and robotic decision-making at the top level. A series of intelligent grasping strategies with rapid slip-feedback control and a tactile-visual fusion recognition strategy ensure dexterous robotic grasping and accurate recognition of daily objects, handling various challenging tasks, for instance grabbing a paper cup containing liquid. Furthermore, we showcase a robotic desktop-cleaning task in which the robot autonomously sorts multiple items and cleans the desktop, demonstrating its promising potential for smart housekeeping.
Project description:The aim of this work was to report on 7 patients presenting with a distinctive form of multimodal (tactile and visual) hallucinations, for which the term "string hallucinations" is proposed. Having observed a patient interacting with imaginary strips of skin in his hands at our movement disorders unit, we prospectively studied Parkinson's disease (PD) patients and caregivers over a 6-month period using a semistructured interview addressing this particular phenomenon. Demographic characteristics as well as cognitive and motor function were assessed. A total of 7 of 164 PD patients (4.3%) observed during the study period had string hallucinations. One patient was cognitively intact and the other 6 had some degree of cognitive impairment. Common to the phenomenology of the hallucinations was the unpleasant feeling and vision of threads emerging from the subjects' hands. Patients interacted with these "threads," trying to remove them from their hands. Our study identifies a previously undescribed type of hallucination in PD, characterized by a complex pattern of multimodal tactile and visual hallucinations.
Project description:Locating a tactile stimulus on the body seems effortless and straightforward. However, the perceived location of a tactile stimulation can differ from its physical location [1-3]. Tactile mislocalizations can depend on the timing of successive stimulations [2, 4, 5], tactile motion mechanisms [6], or processes that "remap" stimuli from skin locations to external space coordinates [7-11]. We report six experiments demonstrating that the perception of tactile localization on a static body part is strongly affected by the displacement between the locations of two successive task-irrelevant actions. Participants moved their index finger between two keys. Each keypress triggered synchronous tactile stimulation at a randomized location on the immobilized wrist or forehead. Participants reported the location of the second tactile stimulation relative to the first. The direction of either active finger movements or passive finger displacements biased participants' tactile orientation judgements (experiment 1). The effect generalized to tactile stimuli delivered to other body sites (experiment 2). Two successive keypresses, by different fingers at distinct locations, reproduced the effect (experiment 3). The effect remained even when the hand that moved was placed far from the tactile stimulation site (experiments 4 and 5). Temporal synchrony within 600 ms between the movement and tactile stimulations was necessary for the effect (experiment 6). Our results indicate that a dynamic displacement vector, defined as the location of one sensorimotor event relative to the one before, plays a strong role in structuring tactile spatial perception.