September 2003


Link Fellowship Awardees for 2003

Inside this Issue

FEATURES

The First Annual Link Fellowship Newsletter, Lee Lynd

Link Foundation 50th Anniversary Celebration, David Gouldin

MEET THIS YEAR'S FELLOWSHIP RECIPIENTS

Energy (2003-2005)

Simulation and Training (2003-2004)

Ocean Engineering (2003-2004)

NEWS FROM FELLOWS

SIMULATION AND TRAINING

Fellowships awarded annually since 1991.
Program Manager: Frank Cardullo, Watson School of Engineering and Applied Science, SUNY Binghamton (cardullo@binghamton.edu)
Program Administrator: Marybeth Thompson, UCF/IST (www.ist.ucf.edu/linkfoundation.htm)

Gilbert Barrett
School of Electrical Engineering and Computer Science, University of Central Florida
Advisor: Professor Avelino Gonzalez
Email Address: gilbarrett@hotmail.com

Title: Exploring Emergent Behaviors of Collaborating Agents within a Simulated Environment

Collaborative behavior is a critical issue in modeling the teamwork of autonomous agents in Computer Generated Forces (CGF). Numerous paradigms currently exist for modeling teamwork or collaboration in a multi-agent environment; none of them, however, provides a general, feasible means of modeling human tactical behavior. The Context-Based Reasoning (CxBR) paradigm could potentially provide a viable method for modeling collaborative behaviors among autonomous agents. My research during the past year included adding a Formation class to the CxBR framework, which allows for chain-of-command representations, bilateral communication throughout the ranks, and the unification of autonomous agents through their representation as a formation.
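
A minimal sketch of how such a formation might be organized is shown below; every class and method name here is an illustrative assumption, not the actual CxBR framework API.

    # Illustrative sketch only: a formation with a chain of command,
    # bilateral (up/down) messaging, and a shared context pushed to
    # every member. Names are invented, not the CxBR framework's own.

    class Agent:
        def __init__(self, name):
            self.name = name
            self.active_context = None   # the agent's current context
            self.superior = None         # chain-of-command links
            self.subordinates = []
            self.inbox = []              # received messages

        def report_up(self, message):
            """Bilateral communication: send a report up the ranks."""
            if self.superior is not None:
                self.superior.inbox.append((self.name, message))

        def order_down(self, message):
            """Bilateral communication: issue an order down the ranks."""
            for sub in self.subordinates:
                sub.inbox.append((self.name, message))

    class Formation:
        """Unifies a team of agents behind a single leader."""
        def __init__(self, leader):
            self.leader = leader

        def assign(self, superior, subordinate):
            superior.subordinates.append(subordinate)
            subordinate.superior = superior

        def set_context(self, context):
            """Push one shared context to every member of the formation."""
            stack = [self.leader]
            while stack:
                agent = stack.pop()
                agent.active_context = context
                stack.extend(agent.subordinates)

Treating the team as a single object with one leader is what lets the simulation address the formation as a unit, while individual members still exchange messages through the ranks.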

Context-based Intelligent Knowledge Acquisition (CITKA) is an application for automated knowledge acquisition, used to build tactical models in CxBR. Work to date has produced an application that elicits knowledge from a subject matter expert (SME) and uses this knowledge to specify a context-based model. My intention is to expand the current application to provide a means of specifying collaborative and team models.
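
As a rough illustration of the elicitation workflow (the real CITKA dialogue and model schema are richer; the question list and field names below are invented):

    # Toy SME elicitation loop: fixed questions whose answers populate
    # a context definition. Purely illustrative of the workflow.

    QUESTIONS = [
        ("name", "What is this context called?"),
        ("activation", "When does this context become active?"),
        ("actions", "Which actions apply while it is active? (comma-separated)"),
        ("transitions", "Which contexts may follow it? (comma-separated)"),
    ]

    def elicit_context():
        """Ask the SME each question and build one context record."""
        answers = {key: input(prompt + " ") for key, prompt in QUESTIONS}
        return {
            "name": answers["name"],
            "activation_rule": answers["activation"],
            "actions": [a.strip() for a in answers["actions"].split(",")],
            "transitions": [t.strip() for t in answers["transitions"].split(",")],
        }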

Recently, interest in the emergent behaviors of simple agents, such as those used in SWARM or ANTS, has grown immensely. Yet little has been done regarding the emergent behaviors of more complex agents, such as simulated human tactical agents. As part of my research, I would like to observe and record notable emergent behaviors of collaborating agents. Of particular interest is an attempt to formalize the level of individual members' motivation, planning, and management necessary to accomplish a team goal, and to determine whether this is predictable from human characteristics and expected real-world performance.

The proposed research will greatly enhance the modeling of human collaborative behaviors, such as those used in tactical domains involving teamwork. Major areas to be addressed during the course of this work include automated knowledge acquisition, emergent behaviors, and deviant behaviors.



Jason J. Corso
School of Computing, The Johns Hopkins University
Advisor: Professor Greg Hager
Email Address: jcorso@cs.jhu.edu

Title: Vision-based Techniques for Dynamic Collaborative Mixed Realities

The future of advanced simulation and training will rely heavily on augmented and mixed reality (AR/MR) environments. In immersive AR/MR environments, a pair of cameras acts as the user's eyes. The video stream is processed by a computer, which composites a synthetic stream of imagery into it, thereby "augmenting" reality. The resulting video stream is then rendered into the user's head-mounted display.

The project entitled "Vision-Based Techniques for Dynamic Collaborative Mixed Realities" aims at developing a general set of vision-based techniques that will enable dynamic, collaborative augmented and mixed realities for use in a multitude of simulation and training application areas. The techniques should not require a priori knowledge of the number of users or their behavior. The two problems this work strives to solve are (1) the accurate registration and real-time depth composition of real and virtual image streams, and (2) the development of vision-based interface components that allow users to join and exit dynamically without expensive and error-prone tracking. Addressing these problems will allow multiple, dynamic users to collaborate in immersive simulations that are accurately registered and composited.
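
The depth-composition problem in (1) reduces, per pixel, to showing whichever surface is closer to the cameras, so that real objects correctly occlude virtual ones. A minimal sketch, assuming the real scene's depth has already been estimated (for example, from the stereo camera pair):

    import numpy as np

    def depth_composite(real_rgb, real_depth, virt_rgb, virt_depth):
        """Per-pixel depth test between the real and virtual streams.
        real_rgb/virt_rgb: HxWx3 color images; real_depth/virt_depth:
        HxW depth maps in the same units (e.g., meters)."""
        virtual_in_front = virt_depth < real_depth
        out = real_rgb.copy()
        out[virtual_in_front] = virt_rgb[virtual_in_front]
        return out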



Felix Hamza-Lup
School of Electrical Engineering and Computer Science, University of Central Florida
Advisors: Prof. Jannick Rolland & Prof. Charles Hughes
Email Address:

Title: Augmented Reality System for Medical Training and Simulation

Augmented Reality (AR) describes the class of systems that use computers to overlay virtual information on the real world.

The project aims at the development of two real-time applications: Remote Medical Diagnostics through Simulation, and Medical Simulation Procedures Training. Remote Medical Diagnostics through Simulation is an AR application that will allow medical personnel to diagnose a patient at a remote location by visualizing the simulated three-dimensional anatomical data associated with that patient. The application's functionality will rest on a set of algorithms for scene synchronization and 3D visualization on a local area network. Medical Simulation Procedures Training is another AR application, which consists of dynamically superimposing computer-generated three-dimensional models of the human internal anatomy onto a Human Patient Simulator (HPS) using enhanced algorithms for registration and tracking. In the near future, this type of AR application will allow paramedics to practice their skills by providing them with enhanced visual feedback. The experience gathered from these applications will be embedded in an AR framework. This framework, built on a distributed system infrastructure, will allow the implementation of computer-supported training and simulation applications based on augmented reality and will provide a test-bed for future research.
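
A toy sketch of the scene-synchronization idea on a local area network; the port, message format, and use of UDP broadcast are assumptions made for illustration, not the project's actual protocol:

    import json, socket, time

    PORT = 9999  # invented for this sketch

    def broadcast_update(object_id, position, orientation):
        """Announce one object's new pose to every peer on the LAN."""
        msg = json.dumps({
            "id": object_id,
            "t": time.time(),    # timestamp lets receivers discard stale updates
            "pos": position,     # [x, y, z]
            "rot": orientation,  # quaternion [w, x, y, z]
        }).encode()
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("<broadcast>", PORT))

    def apply_update(scene, raw_bytes):
        """Accept an update only if it is newer than the stored state."""
        msg = json.loads(raw_bytes)
        current = scene.get(msg["id"])
        if current is None or msg["t"] > current["t"]:
            scene[msg["id"]] = msg

Timestamping each update is one simple way to keep every node's copy of the shared scene converging on the latest state even when packets arrive out of order.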



Shuangbao Wang
Computer Science Department, George Mason University
Advisor: Jim X. Chen (Former Link Fellow 94-95)
Email Address: swang3@gmu.edu

Title: Emerging 3D Medical Equipment and Simulation System for Real-time Bone Surgery

3D visualization of the human body has wide application in medical diagnosis, treatment, simulation, and aviation security. Many techniques can reconstruct 3D information from images generated by CT and MRI; however, they are difficult to use during a surgical operation. Moreover, an MRI scanner cannot be used if the scanned region contains any metal.

The objective of this research is to use emerging medical technologies and hyper-stereoscopic techniques, together with PPL, FFT, and FWT, to acquire high-quality image data and recover depth information in real time, or in situations where CT and MRI are not suitable.
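
As one concrete reading of the stereoscopic part, depth can be recovered from a calibrated left/right image pair by block matching; the sketch below uses OpenCV's standard StereoBM matcher, with placeholder calibration values rather than anything from this project:

    import cv2
    import numpy as np

    FOCAL_PX = 700.0    # focal length in pixels (placeholder calibration)
    BASELINE_M = 0.12   # camera separation in meters (placeholder)

    def depth_from_stereo(left_gray, right_gray):
        """left_gray/right_gray: rectified 8-bit grayscale images."""
        matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        # StereoBM returns disparity in 1/16-pixel fixed point.
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        disparity[disparity <= 0] = np.nan        # mark invalid matches
        return FOCAL_PX * BASELINE_M / disparity  # depth in meters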

Research will also focus on direct dual-image acquisition and stereo image generation. An online simulation system will be developed that enables doctors to collaborate over the Internet to perform virtual surgery and to train medical staff and students. Full-field digital sensors and new wavelet-based image-processing algorithms are used to reconstruct 3D objects, reducing the inaccuracy caused by scanning images asynchronously. Plans for a prototype of a new portable medical apparatus and an online simulation system for diagnosis and real-time surgery will be presented.

