
Amy Blank, Link Fellow 2010, is a Ph.D. candidate in the Department of Mechanical Engineering at the Johns Hopkins University, co-advised by Profs. Louis Whitcomb and Allison Okamura in the Laboratory for Computational Sensing and Robotics. Her current research interests are in the design and control of prosthetic limbs. She is exploring the role of variable limb impedance and the potential benefits of enabling the prosthesis wearer to modulate it directly.

Daniel Mirota received his B.S. degree in Computer Science with honors from Stevens Institute of Technology in 2006. That year he received the ASEE National Co-Op Student of the Year Award for his work on cardiac health evaluation software at Siemens Corporate Research in Princeton, N.J. Since 2006 he has been pursuing a Ph.D. at Johns Hopkins University. In 2009 he won the Medtronic Computer Assisted Surgery Research Award to support his research on endoscopic-video-to-CT registration for augmented reality in the operating room. His interests include computer vision, medical image analysis, and computer-aided surgical systems. His current research focuses on computer vision and augmented reality applications for surgical navigation, aiding both surgery and surgical training.

Luv Kohli is currently a Ph.D. student in the Department of Computer Science at the University of North Carolina at Chapel Hill. Prior to coming to UNC, Mr. Kohli earned a B.S. (1999) and an M.S. (2002) in Computer Science from The George Washington University. Mr. Kohli is a member of the Effective Virtual Environments research project at UNC, headed by Professors Frederick P. Brooks, Jr. and Mary Whitton. He has held summer internships at Advanced Telecommunications Research Institute International, Electronic Arts, and the USC Institute for Creative Technologies.
As a Link Fellow in Advanced Simulation and Training, Mr. Kohli has pursued the research project Remapping Passive Haptics for Deployable Training Systems. A summary of his research follows:
Realistic and technology-rich simulator training is inaccessible to deployed armed forces. Deployable virtual training systems can help maintain combat readiness in the field. Haptic feedback enables users to learn about their environments through touch, but is typically difficult to deploy. Passive haptic feedback (physical mock-ups of virtual objects) is very compelling, but it is also inflexible. Changes made to virtual objects can require time-consuming changes to their physical counterparts. This research will develop and evaluate a new semi-automated, perception-based technique for using low-cost, quickly set-up passive haptics that can be repurposed for different virtual training scenarios (e.g., different aircraft with differently shaped and instrumented cockpits). This technique is meant not as a replacement for full flight simulators, but as a mechanism for continued training while deployed, particularly for mission familiarity and emergency procedures.