Project description:Recent years have witnessed significant advances in the quality of life of people with lower-limb amputations, thanks to technological developments in prosthetics. However, prostheses that provide information about foot-ground interaction, and in particular about terrain irregularities, are still missing from the market. The lack of tactile feedback from the foot sole may lead subjects to step onto uneven terrain, increasing the risk of falling. To address this issue, a biomimetic vibrotactile feedback system that conveys information about gait and terrain features sensed by a dedicated insole was assessed with intact subjects. After briefly experiencing both even and uneven terrains, the recruited subjects discriminated between them with an accuracy of 87.5%, relying solely on a replay of the vibrotactile feedback. With the objective of exploring how humans decode the feedback strategy, a KNN classifier was trained to recognize the uneven terrains. The outcome suggests that the subjects achieved this performance on a temporal scale of 45 ms. This work is a step forward in helping lower-limb amputees appreciate floor conditions while walking, adapt their gait, and use their artificial limb with greater confidence.
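The terrain-discrimination analysis above lends itself to a small illustration. The sketch below trains a k-nearest-neighbours classifier on short windows of synthetic vibrotactile signals; the 45-sample window echoes the 45 ms timescale reported (at an assumed 1 kHz sampling rate), but the data, features, and choice of k are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: KNN terrain classification from short vibrotactile
# windows. Data is synthetic; window length, features, and k are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 200 windows of 45 ms sampled at an assumed 1 kHz (45 samples each).
# Label 0 = even terrain, 1 = uneven terrain.
n_windows, window_len = 200, 45
labels = rng.integers(0, 2, n_windows)
signals = rng.normal(0.0, 1.0, (n_windows, window_len))
# Uneven-terrain windows get extra, rougher vibration content.
signals[labels == 1] += rng.normal(0.5, 0.5, (n_windows, window_len))[labels == 1]

# Simple per-window features: RMS amplitude and peak value.
features = np.column_stack([
    np.sqrt((signals ** 2).mean(axis=1)),  # RMS
    np.abs(signals).max(axis=1),           # peak
])

clf = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

On this synthetic data the classifier separates the two terrain classes well above chance, mirroring the kind of decoding analysis the study describes.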
Project description:The outcome of arthroscopic procedures is related to the surgeon's skill in arthroscopy. Currently, evaluation of such skills relies on direct observation by a surgeon trainer. This type of assessment is, by its nature, subjective and time-consuming. The aim of our study was to identify whether haptic information generated from arthroscopic tools could distinguish between skilled and less skilled surgeons. A standard arthroscopic probe was fitted with a force/torque sensor. The probe was used by five surgeons with different levels of experience in knee arthroscopy performing 11 different tasks in 10 standard knee arthroscopies. The force/torque data from the hand-tool interface were recorded and synchronized with a video recording of the procedure. The magnitude and patterns of the generated torque were analyzed and compared. A computerized system was used to analyze the force/torque signature based on general principles of quality of performance, using measures such as economy of movement, time efficiency, and consistency of performance. The results showed a considerable correlation between three haptic parameters and the surgeon's experience, which could be used in an automated objective assessment system for arthroscopic surgery. LEVEL OF EVIDENCE: Level II, diagnostic study. See the Guidelines for Authors for a complete description of levels of evidence.
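The performance measures named above (economy of movement, time efficiency, consistency) can be sketched as simple functions of the recorded force/torque signature. The definitions below are plausible assumptions for illustration, not the study's actual computations.

```python
# Hedged sketch of force/torque signature metrics. All metric definitions
# and the sampling rate are assumptions, not the study's implementation.
import numpy as np

def economy_of_movement(ft):
    """Path length of the force/torque trajectory (ft: (T, 6) array);
    a lower value suggests more economical tool handling."""
    return float(np.linalg.norm(np.diff(ft, axis=0), axis=1).sum())

def time_efficiency(ft, fs=100.0):
    """Task duration in seconds at an assumed sampling rate fs (Hz)."""
    return ft.shape[0] / fs

def consistency(trials):
    """Std of economy across repeated trials of the same task;
    lower suggests more consistent performance."""
    return float(np.std([economy_of_movement(t) for t in trials]))

# Example: a smooth trajectory vs. a jittery one of the same duration.
t = np.linspace(0, 1, 200)
smooth = np.column_stack([np.sin(t)] * 6)
jittery = smooth + np.random.default_rng(1).normal(0, 0.05, smooth.shape)
print(economy_of_movement(smooth) < economy_of_movement(jittery))  # True
```

The jittery trajectory accumulates more path length for the same task duration, which is the intuition behind using such signatures to separate experience levels.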
Project description:This study aims to explore a feasible form of haptic device for common users. We propose HAPmini, a novel graspable haptic device that enhances the user's touch interaction. To achieve this enhancement, HAPmini is designed with low mechanical complexity, few actuators, and a simple structure, while still providing force and tactile feedback to users. Despite having a single solenoid-magnet actuator and a simple structure, HAPmini can provide haptic feedback corresponding to a user's two-dimensional touch interaction. Based on the force and tactile feedback, a hardware magnetic snap function and virtual textures were developed. The hardware magnetic snap function helps users perform pointing tasks by applying an external force to their finger, enhancing touch interaction performance. The virtual texture simulates the surface texture of a specific material through vibration and delivers a haptic sensation to users. In this study, five virtual textures (reproductions of the textures of paper, denim, wood, sandpaper, and cardboard) were designed for HAPmini. Both HAPmini functions were tested in three experiments. First, a comparative experiment confirmed that the hardware magnetic snap function could improve pointing-task performance to the same extent as the software magnetic snap function commonly used in graphical tools. Second, ABX and matching tests were conducted to determine whether HAPmini could render the five virtual textures distinctly enough for participants to tell them apart. The correctness rates of the ABX and matching tests were 97.3% and 93.3%, respectively, confirming that participants could distinguish the virtual textures generated by HAPmini.
The experiments indicate that HAPmini both enhances the usability of touch interaction (hardware magnetic snap function) and provides additional texture information previously unavailable on a touchscreen (virtual texture).
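As an illustration of how a single vibrotactile actuator can render distinct virtual textures, the sketch below maps finger speed to a drive frequency via an assumed spatial period per material, a common texture-rendering approach. All parameter values and names are hypothetical, not HAPmini's actual design.

```python
# Hypothetical velocity-driven texture rendering: vibration frequency is
# the finger speed divided by the texture's spatial period, so faster
# strokes over a finer texture yield higher-frequency vibration.
# Periods and gains are illustrative, not HAPmini's parameters.
TEXTURES = {
    "paper":     {"period_mm": 0.1, "gain": 0.2},
    "denim":     {"period_mm": 0.8, "gain": 0.5},
    "wood":      {"period_mm": 0.4, "gain": 0.3},
    "sandpaper": {"period_mm": 0.2, "gain": 0.9},
    "cardboard": {"period_mm": 0.6, "gain": 0.4},
}

def drive_signal(texture, finger_speed_mm_s):
    """Return (frequency_hz, amplitude) for the actuator drive."""
    p = TEXTURES[texture]
    freq = finger_speed_mm_s / p["period_mm"]  # one cycle per grain
    return freq, p["gain"]
```

For example, stroking the "paper" texture at 50 mm/s would produce a 500 Hz drive at low amplitude, while "sandpaper" at the same speed gives a lower frequency but stronger vibration.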
Project description:Purpose: Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We aim to propose a master-slave robotic colonoscopy system that can be controlled from a remote site using a conventional colonoscope. Materials and methods: The master and slave robots were developed to operate a conventional flexible colonoscope. The robotic colonoscopic procedure was performed on a colonoscopy training model by one expert endoscopist and two inexperienced engineers. To provide haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. Results: The slave robot was developed to hold the colonoscope and its knob and to perform the insertion, rotation, and two tilting motions of the colonoscope. The master robot was designed to command the motions of the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with reflected force and torque at the haptic device. The haptic feedback system was effective and helped the operator feel the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. Conclusion: This work proposes a robotic approach to colonoscopy with a haptic feedback algorithm; the robotic device could perform colonoscopy effectively from a remote site, with reduced burden for the operator and comparable safety for patients.
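The one-tenth force/torque scaling described in the results can be sketched as a minimal reflection step on the master side. The saturation limits and function names below are assumptions for illustration, not the system's actual control code.

```python
# Minimal sketch of force reflection: slave-side measurements are scaled
# by 1/10 (the ratio reported in the study) before being rendered on the
# master haptic device. Clipping limits are assumed device limits.
SCALE = 0.1
MAX_FORCE_N = 5.0    # assumed haptic-device saturation
MAX_TORQUE_NM = 0.5

def clamp(x, limit):
    return max(-limit, min(limit, x))

def reflect(insertion_force_n, rotation_torque_nm):
    """Scale slave-side force/torque down for master-side display."""
    f = clamp(insertion_force_n * SCALE, MAX_FORCE_N)
    t = clamp(rotation_torque_nm * SCALE, MAX_TORQUE_NM)
    return f, t
```

Scaling keeps large constraint forces in the colon perceptible to the operator without saturating or overpowering the small master-side haptic device.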
Project description:Loss of tactile sensations is a major roadblock preventing upper limb-absent people from multitasking or using the full dexterity of their prosthetic hands. With current myoelectric prosthetic hands, limb-absent people can only control one grasp function at a time even though modern artificial hands are mechanically capable of individual control of all five digits. In this paper, we investigated whether people could precisely control the grip forces applied to two different objects grasped simultaneously with a dexterous artificial hand. Toward that end, we developed a novel multichannel wearable soft robotic armband to convey artificial sensations of touch to the robotic hand users. Multiple channels of haptic feedback enabled subjects to successfully grasp and transport two objects simultaneously with the dexterous artificial hand without breaking or dropping them, even when their vision of both objects was obstructed. Simultaneous transport of the objects provided a significant time savings to perform the deliveries in comparison to a one-at-a-time approach. This paper demonstrated that subjects were able to integrate multiple channels of haptic feedback into their motor control strategies to perform a complex simultaneous object grasp control task with an artificial limb, which could serve as a paradigm shift in the way prosthetic hands are operated.
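One way to picture the multichannel feedback described above: each grasped object's grip force drives the amplitude of its own armband channel, so the wearer can monitor both grips at once. The linear mapping and force range below are illustrative assumptions, not the armband's actual encoding.

```python
# Hypothetical sketch of per-object grip-force feedback over separate
# haptic channels. Force range and the linear mapping are assumptions.
MAX_FORCE_N = 10.0   # assumed per-digit sensor range

def to_amplitude(force_n):
    """Linear map from measured grip force to a 0..1 actuator amplitude."""
    return min(max(force_n, 0.0) / MAX_FORCE_N, 1.0)

def armband_frame(grip_forces_n):
    """One update frame: one amplitude per haptic channel (per object)."""
    return [to_amplitude(f) for f in grip_forces_n]

print(armband_frame([2.5, 7.5]))  # -> [0.25, 0.75]
```

With one channel per object, a slipping grasp on either object shows up as a drop on its own channel, which is what lets users regulate two grips simultaneously without vision.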
Project description:To test and evaluate the second installment of DENTIFY, a virtual reality haptic simulator for Operative Dentistry (OD), on preclinical dental students, focusing on user performance and self-assessment. Twenty unpaid volunteer preclinical dental students with different background experience were enrolled in this study. After completing an informed consent form and a demographic questionnaire, and being introduced to the prototype (in the first testing session), three testing sessions followed (S1, S2, S3). Each session involved the following steps: (I) free experimentation; (II) task execution; S3 also included (III) completion of questionnaires associated with the experiment (a total of 8 Self-Assessment Questions (SAQ)); and (IV) a guided interview. As expected, drill time decreased steadily for all tasks with increasing prototype use, as verified by RM ANOVA. Regarding performance metrics (comparisons by Student's t-test and ANOVA) recorded at S3, higher overall performance was observed for participants with the following characteristics: female, non-gamer, no previous VR experience, and over 2 semesters of previous experience working on phantom models. The correlation between participants' performance (drill time) on the four tasks and their self-assessment, verified by Spearman's rho analysis, led to the conclusion that higher performance was observed in students who responded that DENTIFY improved their self-perception of the manual force applied. Regarding the questionnaires, Spearman's rho analysis showed a positive correlation between the improvement students felt DENTIFY brought to conventional teaching and their increased interest in learning OD, their desire for more simulator hours, and the perceived improvement in manual dexterity. All participating students adhered well to the DENTIFY experimentation. DENTIFY allows for student self-assessment and contributes to improving student performance.
Simulators with VR and haptic pens for teaching in OD should be designed as part of a consistent and gradual teaching strategy, allowing a multiplicity of simulated scenarios, bimanual manipulation, and real-time feedback for immediate student self-assessment. Additionally, they should generate per-student performance reports to support students' self-perception and self-criticism of their evolution over longer periods of learning time.
Project description:Hearing aid and cochlear implant (CI) users often struggle to locate and segregate sounds. The dominant sound-localisation cues are time and intensity differences across the ears. A recent study showed that CI users locate sounds substantially better when these cues are provided through haptic stimulation on each wrist. However, the sensitivity of the wrists to these cues and the robustness of this sensitivity to aging is unknown. The current study showed that time difference sensitivity is much poorer across the wrists than across the ears and declines with age. In contrast, high sensitivity to across-wrist intensity differences was found that was robust to aging. This high sensitivity was observed across a range of stimulation intensities for both amplitude modulated and unmodulated sinusoids and matched across-ear intensity difference sensitivity for normal-hearing individuals. Furthermore, the usable dynamic range for haptic stimulation on the wrists was found to be around four times larger than for CIs. These findings suggest that high-precision haptic sound-localisation can be achieved, which could aid many hearing-impaired listeners. Furthermore, the finding that high-fidelity across-wrist intensity information can be transferred could be exploited in human-machine interfaces to enhance virtual reality and improve remote control of military, medical, or research robots.
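As a sketch of how across-wrist intensity differences could encode sound location, the snippet below maps a level difference in dB (the dominant localisation cue discussed above) to per-wrist vibration amplitudes. The reference level and mapping range are assumptions; this is not the study's implementation.

```python
# Illustrative mapping from an across-ear level difference (dB) to
# across-wrist vibration amplitudes. Reference level and the +/-20 dB
# mapping range are assumptions for illustration.
BASE_DB = 0.0        # assumed common reference level for both wrists
MAX_ILD_DB = 20.0    # assumed usable mapping range

def db_to_linear(db):
    return 10 ** (db / 20)

def wrist_levels(level_diff_db):
    """Map a level difference (positive = left louder) to linear
    vibration amplitudes (left, right) relative to the reference."""
    d = max(-MAX_ILD_DB, min(MAX_ILD_DB, level_diff_db))
    return db_to_linear(BASE_DB + d / 2), db_to_linear(BASE_DB - d / 2)
```

Splitting the difference symmetrically keeps the overall stimulation level roughly constant while preserving the across-wrist contrast that the study found listeners are highly sensitive to.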
Project description:Current surgical robotic systems are teleoperated and lack force feedback. Considerable practice is required to learn to use visual input, such as tissue deformation upon contact, as a substitute for the tactile sense. Thus, unnecessarily high forces are observed in novices prior to specific robotic training, and visual force feedback studies have demonstrated a reduction in applied forces. Simulation exercises with realistic suturing tasks can provide training outside the operating room. This paper presents contributions to realistic interactive suture simulation for training the suturing and knot-tying tasks commonly used in robotically assisted surgery. To improve the realism of the simulation, we developed a global-coordinate wire model with a new constraint formulation for elongation. We demonstrate that continuous modeling of the contacts avoids instabilities during knot tightening. Visual cues, computed from mechanical forces or constraints, are additionally provided to support learning how to dose the forces. The results are integrated into a powerful system-agnostic simulator, and comparison with equivalent tasks performed on the da Vinci Xi system confirms its realism.
Project description:Gait analysis is a technique used to understand movement patterns and, in some cases, to inform the development of rehabilitation protocols. Traditional rehabilitation approaches have relied on expert-guided feedback in clinical settings. Such efforts require the presence of an expert to inform the retraining (and to evaluate any improvement) and require the patient to travel to the clinic. Nowadays, opportunities exist to employ digitized feedback modalities to help a user understand improved gait technique. This is important, as clear and concise feedback can enhance the quality of rehabilitation and recovery. A critical requirement is to consider the quality of feedback from the user's perspective, i.e., how they process, understand, and react to it. In this context, this paper reports the results of a Quality of Experience (QoE) evaluation of two feedback modalities, Augmented Reality (AR) and haptic, employed as part of an overall gait analysis system. The aim of the feedback is to reduce varus/valgus misalignments, which can cause serious orthopedic problems. The QoE analysis considers objective (improvement in knee alignment) and subjective (questionnaire responses) user metrics in 26 participants, as part of a within-subject design. Participants answered 12 questions on QoE aspects such as utility, usability, interaction, and immersion of the feedback modalities via post-test reporting. In addition, objective metrics of participant performance (angles and alignment) were considered as indicators of the utility of each feedback modality. The findings show statistically significantly higher QoE ratings for AR feedback. The number of knee misalignments was also reduced after users experienced AR feedback (a 35% improvement with AR feedback relative to baseline, compared to haptic).
Gender analysis showed significant differences in performance for the number of misalignments and the time to correct valgus misalignment (for males, when they experienced AR feedback). The female group self-reported higher utility and QoE ratings for AR than the male group did.
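The objective misalignment metric above can be pictured as a simple threshold test on the frontal-plane knee angle. The neutral band and sign convention below are assumptions for illustration, not the study's actual criteria.

```python
# Illustrative varus/valgus check from a frontal-plane knee angle,
# where 0 deg is neutral. The +/-3 deg neutral band and the convention
# that positive = varus are assumptions.
NEUTRAL_BAND_DEG = 3.0

def classify_alignment(frontal_angle_deg):
    """Classify a frontal-plane knee angle as varus, valgus, or neutral."""
    if frontal_angle_deg > NEUTRAL_BAND_DEG:
        return "varus"
    if frontal_angle_deg < -NEUTRAL_BAND_DEG:
        return "valgus"
    return "neutral"
```

Counting frames outside the neutral band over a walking trial gives a misalignment count of the kind compared across the AR and haptic feedback conditions.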
Project description:Existing haptic actuators are often rigid and limited in their ability to replicate real-world tactile sensations. We present a wearable haptic artificial muscle skin (HAMS) based on fully soft, millimeter-scale, multilayer dielectric elastomer actuators (DEAs) capable of significant out-of-plane deformation, a capability that typically requires rigid or liquid biasing. The DEAs use a thickness-varying multilayer structure to achieve large out-of-plane displacement and force, maintaining comfort and wearability. Experimental results demonstrate that HAMS can produce complex tactile feedback with high perception accuracy. Moreover, we show that HAMS can be integrated into extended reality (XR) systems, enhancing immersion and offering potential applications in entertainment, education, and assistive technologies.