Name: Ali Ebrahimi
School: Johns Hopkins University
Project: Design and simulation of intelligent control algorithms for bimanual robot-assisted retinal surgery training system
Research Advisor: Dr. Iulian Iordachita
Retinal microsurgery remains one of the most demanding surgical procedures, involving the manipulation of ultra-fine retinal veins. In such intricate procedures, which are performed bimanually with two surgical instruments, surgeon hand tremor can cause severe injury to the eye. Advances in robotic assistance for eye surgery, such as the Steady-Hand Eye Robots developed at Johns Hopkins University, have proved beneficial in reducing hand tremor by providing steady and robust manipulation of the surgical tool. However, adequate sensing capabilities and intelligent control methods must be integrated with the robots to ensure safe performance in the confined space of the eye. To enable safe bimanual robot-assisted eye surgery, we will first design and simulate hybrid force/position control algorithms that account for various safety constraints on manipulating two robots inside the eye. We will then develop the hardware and software infrastructure for a bimanual two-robot system and build smart multi-function surgical instruments to boost the robots' sensing capabilities. Finally, we will implement the designed control strategies on the developed system and train clinicians to acquire intuitive skills for bimanual robot-assisted eye surgery. We anticipate that using multi-function force-sensing tools in conjunction with two cooperative robots could enable safe, precise, and semi-autonomous retinal surgeries.
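To make the control concept concrete, below is a minimal sketch of a classical hybrid force/position control law (in the style of Raibert and Craig), the family of algorithms the abstract names. The selection matrix, gains, and setpoints are illustrative assumptions for a toy 3-axis example, not the project's actual controller or parameters.

```python
import numpy as np

def hybrid_force_position_control(x, x_des, f, f_des, S, Kp, Kf):
    """One step of a textbook hybrid force/position control law.
    S is a diagonal selection vector: 1 -> position-controlled axis,
    0 -> force-controlled axis. All gains and setpoints here are
    illustrative, not values from the actual surgical robot."""
    S = np.diag(S)
    I = np.eye(S.shape[0])
    # Position error drives the selected (free-motion) axes...
    u_pos = S @ (Kp @ (x_des - x))
    # ...while force error drives the complementary (contact) axes.
    u_frc = (I - S) @ (Kf @ (f_des - f))
    return u_pos + u_frc

# Toy example: regulate position in x, y and contact force along z.
x = np.array([0.0, 0.0, 0.0]); x_des = np.array([1.0, 1.0, 0.0])
f = np.array([0.0, 0.0, 0.5]); f_des = np.array([0.0, 0.0, 0.2])
u = hybrid_force_position_control(x, x_des, f, f_des,
                                  S=[1, 1, 0],
                                  Kp=2.0 * np.eye(3),
                                  Kf=5.0 * np.eye(3))
# Commands corrective motion in x, y and reduces the excess z-force.
```

The selection matrix partitions the task space so that each axis is governed by exactly one objective, which is why the abstract's safety constraints (e.g., limiting tool-to-tissue force) map naturally onto the force-controlled directions.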
Name: Gaojian Huang
School: Purdue University
Project: Using advanced driving simulation and vibrotactile cues to train older drivers to interact with next-generation autonomous vehicles
Research Advisor: Dr. Brandon Pitts
Adults aged 65 years and older are the fastest-growing age group worldwide. Next-generation autonomous vehicles are expected to support the mobility and independence of this aging population. However, these vehicles will only benefit older drivers if they understand, and are comfortable with, how the systems operate. For the next decade, only partially or conditionally autonomous vehicles will exist, meaning that drivers will occasionally need to take back manual control when these vehicles reach their design limits or malfunction. This complex takeover process draws on perceptual, cognitive, and physical resources, suggesting that training may be needed for (older) drivers to complete a takeover transition successfully. Therefore, the overall goal of this project is to use advanced driving simulation to train older adults to interact with semi-autonomous vehicles, including interpreting information from in-vehicle interfaces and performing takeover tasks. We expect the project to help older drivers become knowledgeable about the capabilities and limitations of future autonomous vehicles, which can, in turn, improve transportation and public safety.
Name: Julia Juliano
School: University of Southern California
Project: Neural mechanisms of head-mounted display virtual reality motor learning and transfer to the real world
Research Advisor: Dr. Sook Lei Liew
The use of head-mounted display virtual reality (HMD-VR) in motor rehabilitation has grown rapidly in recent years. Motor rehabilitation interventions using HMD-VR are only effective when the motor skills learned in HMD-VR transfer to the real world. However, research conflicts on whether skills learned in an immersive HMD-VR environment actually transfer, and a clear explanation, or potential mechanism, for why HMD-VR motor transfer occurs in some cases but not others is lacking. Without such information, HMD-VR cannot be harnessed effectively to promote motor rehabilitation for clinical populations, such as individuals after stroke.
The purpose of this dissertation project is to identify neural mechanisms involved in the transfer of HMD-VR motor learning to the real world and to examine whether manipulating these neural correlates could facilitate HMD-VR motor transfer. The results of this project are expected to have an important positive impact because they will provide specific neural targets to improve transfer of motor learning in HMD-VR to the real world and provide basic science to guide the designs of future emerging technology applications.
Name: Christopher Yang
School: Johns Hopkins University
Project: Investigating the computations underlying complex motor skill learning
Research Advisor: Dr. Adrian Haith
I am investigating how the brain learns to perform complex motor skills, such as playing the piano or driving a car. Complex tasks may require years or even decades of intensive practice to master. In the laboratory, however, the tasks we use to study motor learning can often be learned in a matter of minutes. Recent studies suggest these two types of tasks are learned in qualitatively different ways, raising the question of whether real-world skill learning can be understood by studying common laboratory tasks. To bridge this gap, our lab has devised a task in which people must learn to control a cursor through a complex, bimanual hand-to-cursor mapping. This task captures several qualitative features of real-world skill learning, and using control-theoretic approaches, we are characterizing how the brain's motor control capabilities change over the course of learning. This project will provide critical insights into how real-world skills like flying a plane or performing surgery are learned, insights that can be used to design principled and effective training protocols for a wide variety of motor skills.
If you would like to find out more about our Link Foundation Modeling, Simulation and Training Fellows and projects that have been funded in the field of Modeling, Simulation and Training by the Link Foundation, please visit the Link Modeling, Simulation and Training webpage at http://www.linksim.org/.