Link Foundation Fellowships Newsletter

Inside this Issue

Features

Meet this Year's Fellowship Recipients

Link Fellowship Awardees for 2014

Advanced Simulation and Training

Name: Richard Joyce
Department: PhD Student, Department of Mechanical and Aerospace Engineering
Center for Human/Robotics/Vehicle Integration and Performance
School: University of California, Davis
Project: Rapidly Reconfigurable Research Cockpit
Research Advisor: Dr. Stephen K. Robinson

The Rapidly Reconfigurable Research Cockpit (R3C) is a concept that greatly improves crew training for both spaceflight and aeronautical operations by combining the traditionally separate roles of the physical mockup and the functional simulator. We aim to merge the two into a single training environment by using a visual virtual environment as a systems-simulation layer overlaid on a low-fidelity geometric mockup. The user will wear a virtual-reality head-mounted display (HMD) while sitting in the pilot's seat. In front of them will be a 3D-printed cockpit panel that provides only tactile feedback. The virtual world shown in the HMD will respond to the pilot's actions in the mockup through hand-tracking devices, and the two worlds will be linked together with head tracking and optical registration techniques.
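
To make the registration step concrete, here is a minimal Python/NumPy sketch of the underlying idea; the transform names and calibration values are illustrative assumptions, not the project's actual code. The printed panel's pose is measured once in the tracker's world frame, and every frame it is re-expressed in the HMD's view frame using the tracked head pose, so the rendered cockpit stays locked onto the physical mockup:

    import numpy as np

    def pose_matrix(rotation, translation):
        """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    # Hypothetical calibration result: pose of the 3D-printed panel in the
    # tracker's world frame (measured once, e.g. with fiducial markers).
    world_from_panel = pose_matrix(np.eye(3), np.array([0.0, -0.2, 0.5]))

    def panel_in_view(world_from_head):
        """Re-express the panel's pose in the HMD's view frame for rendering."""
        head_from_world = np.linalg.inv(world_from_head)
        return head_from_world @ world_from_panel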

This novel combination of mockup and simulator promises a much lower-cost training system, especially in the mission/vehicle concept stages, where mission-driven changes in cockpit layout and functionality can be accomplished far more cheaply and quickly than with traditional methods. Although the initial study is based on aerospace vehicles, the concept can be applied to any complex system that demands extensive operator training in the areas the Link Foundation supports, such as robotic surgery.

Recent advances in several technologies have made this a technically sound concept at a reasonable cost. A lightweight HMD with a wide field of view and low-latency head tracking is essential to this project, and is most notably achieved with the Oculus Rift. The cockpit panels will be 3D printed, providing the tactile feedback of the mockup; maturing 3D-printing technology allows rapid reconfiguration of the panel. To close the loop on the simulation, a hand-tracking device will determine where the user is interacting on the panel. Several optical approaches in this field are improving rapidly, such as the Leap Motion, 3Gear Systems, and Kinect v2. The Myo armband, which reads muscle activity, offers a promising non-optical approach.
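
As a rough illustration of how the hand-tracking data might close that loop, the sketch below (again Python/NumPy, with an invented button layout and press threshold) maps a tracked fingertip, expressed in the panel's coordinate frame, to the nearest button within a small capture radius:

    import numpy as np

    # Hypothetical button centers in the panel's coordinate frame (meters).
    BUTTONS = {
        "GEAR": np.array([0.05, 0.10, 0.0]),
        "FLAPS": np.array([0.12, 0.10, 0.0]),
        "AUTOPILOT": np.array([0.05, 0.04, 0.0]),
    }
    PRESS_RADIUS = 0.012  # accept a press within ~12 mm of a button center

    def detect_press(fingertip):
        """Return the name of the button being pressed, or None."""
        name, center = min(BUTTONS.items(),
                           key=lambda kv: np.linalg.norm(fingertip - kv[1]))
        if np.linalg.norm(fingertip - center) < PRESS_RADIUS:
            return name
        return None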

The use of virtual and augmented reality in aerospace has been extensive; however, our concept of merging an accurate tactile environment with the virtual view is so far untested. A research question that has already surfaced from our proof-of-concept system is how much visual feedback is required to accurately target the correct button. Many studies have investigated the relative roles of proprioceptive and visual feedback in human control of targeted movement, but our unique scenario of targeting a physical, tactile object with limited visual feedback has not been extensively studied. The goal of the fellowship year is to improve our prototype and begin producing research with the R3C concept.

 


Name: Jia Luo
Department: PhD Student, Department of Mechanical and Industrial Engineering
School: University of Illinois at Chicago
Project: Haptics-based Cataract Surgery Simulator
Research Advisor: Dr. Michael Scott

Surgical training using computer-based simulators is being adopted by many medical specialties, mainly because such simulators offer several advantages over traditional training methods involving animals or cadavers. Simulators provide a more controlled environment for training. Also, practice on simulators allows trainees to perform the same surgical procedures multiple times without incurring extra resource costs. Finally, computer-based simulators give instructors an objective assessment of trainees' skills, letting them measure performance and track progress across multiple training sessions.

My research focuses on the design and implementation of a haptics-based cataract surgery simulator. During the first three years of this project, several surgical steps have been simulated as part of my Ph.D. thesis: corneal incision, capsulorrhexis, and lens grooving. To simulate these procedures, a physics-based 3D eye model was created. This dynamic virtual eye behaves realistically as it interacts with the virtual surgical instruments that trainees manipulate through haptic devices.
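
For readers unfamiliar with haptic rendering, the heart of such an interaction loop can be sketched as a penalty (spring-damper) force computed whenever the virtual instrument tip penetrates the tissue surface. The Python sketch below is only illustrative; the stiffness and damping constants are assumptions, not values from the simulator:

    import numpy as np

    STIFFNESS = 400.0  # N/m, illustrative tissue stiffness
    DAMPING = 2.0      # N*s/m, illustrative damping

    def contact_force(tip_pos, tip_vel, surface_point, surface_normal):
        """Spring-damper (penalty) force sent to the haptic device on contact.

        surface_normal points outward from the tissue; all vectors are
        length-3 NumPy arrays. Runs every haptic frame (typically ~1 kHz).
        """
        penetration = np.dot(surface_point - tip_pos, surface_normal)
        if penetration <= 0.0:                # tip is outside the tissue
            return np.zeros(3)
        normal_vel = np.dot(tip_vel, surface_normal)
        magnitude = STIFFNESS * penetration - DAMPING * normal_vel
        return max(magnitude, 0.0) * surface_normal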

Early prototypes of these simulated surgical procedures are being tested by faculty and residents at medical institutions on two continents: the Wilmer Eye Institute at Johns Hopkins University School of Medicine (Baltimore, MD, US), Northwestern University Feinberg School of Medicine (Chicago, IL, US), and King Khaled Eye Specialist Hospital (Riyadh, Saudi Arabia). Based on preliminary feedback, I am currently enhancing those modules, as well as designing and implementing a phacoemulsification module, in which the lens is broken into multiple pieces that are then removed from the ocular capsule. The main technical challenges of this step are providing real-time lens deformation as the virtual instruments touch it, self-collision detection among all pieces of the lens being emulsified, and their elimination as they are suctioned by the phaco handpiece. The shape and surface of the lens must change realistically according to the forces applied by both the haptic devices and the physics engine. In addition, a method of performance assessment needs to be implemented, not only to provide guidance during training but also to validate the simulator's effectiveness in improving residents' surgical skills.
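
To give a flavor of the deformation-and-suction problem, here is a toy mass-spring step under heavy simplifying assumptions (unit particle masses, explicit Euler integration, invented constants; self-collision between fragments is omitted). Particles of a lens fragment deform under internal spring forces, are pulled toward the phaco tip, and are flagged for removal once captured:

    import numpy as np

    DT = 1e-3             # 1 kHz physics step (illustrative)
    K, REST = 50.0, 0.01  # spring stiffness and rest length (illustrative)
    SUCTION_GAIN = 0.5    # strength of the pull toward the phaco tip
    CAPTURE_DIST = 0.002  # particles this close to the tip are emulsified

    def step_fragment(positions, velocities, springs, phaco_tip):
        """Advance one lens fragment by one explicit-Euler step.

        positions, velocities: (N, 3) arrays; springs: list of (i, j) pairs.
        Returns a boolean mask of captured particles; the caller removes
        them and remaps the spring indices.
        """
        forces = np.zeros_like(positions)
        for i, j in springs:                      # internal elasticity
            d = positions[j] - positions[i]
            length = np.linalg.norm(d)
            f = K * (length - REST) * d / (length + 1e-9)
            forces[i] += f
            forces[j] -= f
        forces += SUCTION_GAIN * (phaco_tip - positions)  # suction pull
        velocities += DT * forces                 # unit masses for simplicity
        positions += DT * velocities
        return np.linalg.norm(phaco_tip - positions, axis=1) < CAPTURE_DIST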

Among the several cataract simulators currently available, this simulator is the only one that incorporates tactile feedback. In a survey conducted at Harvard Medical School, the majority of participating ophthalmologists agreed that integrating haptic feedback into ophthalmic surgery simulation could provide a more realistic operative experience.

In the next phase of this project, a comparison between this simulator and other commercially available simulators (such as EYESI) will be conducted to evaluate the value of haptics in cataract surgery simulation, as well as its potential to improve surgical outcomes.

This haptics-based cataract simulator is expected to provide an efficient training tool for medical institutions interested in improving their residents' tactile and psychomotor skills, reducing the need for the expensive pig eyes and plastic models traditionally used for training. Additionally, it will allow instructors to perform more comprehensive analyses of residents' performance over time and to evaluate different training strategies to better achieve their goals.