A Ubiquitous Context-Aware Environment for Surgical Training

Date

2007-08-06

Citation of Original Publication

Patti Ordonez, Palanivel Andiappan Kodeswaran, Vladimir Korolev, Wenjia Li, Onkar Walavalkar, Anupam Joshi, Tim Finin, Yelena Yesha, and Ivan George, A Ubiquitous Context-Aware Environment for Surgical Training, Proceedings of the 4th Annual International Conference on Mobile and Ubiquitous Systems, 2007, DOI: 10.1109/MOBIQ.2007.4451029

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
© 2007 IEEE

Abstract

The age of technology has changed the way surgeons are trained. Traditional training methodologies include lecturing, shadowing, apprenticing, and developing skills in live clinical situations. Computerized tools that simulate surgical procedures and experiences provide "virtual" training that complements these traditional methods and can dramatically improve upon them. However, such systems do not adapt to the training context. We describe a ubiquitous computing system that tracks low-level events in the surgical training room (e.g., student locations, lessons completed, learning tasks assigned, and performance metrics) and derives the training context from them. This context can then be used to create an adaptive training system.
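The abstract describes aggregating low-level events (student locations, lessons completed, tasks assigned, performance metrics) into a derived training context, but does not detail the mechanism here. The following is only a minimal, hypothetical Python sketch of such an event-to-context aggregation; all class names, fields, and event kinds are invented for illustration and are not the authors' implementation.

```python
# Hypothetical sketch: folding low-level training-room events into a
# per-student "training context" that an adaptive system could consume.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Event:
    """One low-level event reported by a sensor or the training software."""
    student_id: str
    kind: str      # e.g. "location", "lesson_completed", "task_assigned", "score"
    value: object  # room name, lesson id, task id, or numeric metric


@dataclass
class TrainingContext:
    """Derived, higher-level view of a single student's training state."""
    student_id: str
    location: str = "unknown"
    lessons_completed: List[str] = field(default_factory=list)
    open_tasks: List[str] = field(default_factory=list)
    avg_score: float = 0.0


def derive_context(events: List[Event]) -> Dict[str, TrainingContext]:
    """Aggregate a stream of low-level events into per-student contexts."""
    contexts: Dict[str, TrainingContext] = {}
    scores: Dict[str, List[float]] = {}
    for ev in events:
        ctx = contexts.setdefault(ev.student_id, TrainingContext(ev.student_id))
        if ev.kind == "location":
            ctx.location = str(ev.value)
        elif ev.kind == "lesson_completed":
            ctx.lessons_completed.append(str(ev.value))
        elif ev.kind == "task_assigned":
            ctx.open_tasks.append(str(ev.value))
        elif ev.kind == "score":
            scores.setdefault(ev.student_id, []).append(float(ev.value))
            ctx.avg_score = sum(scores[ev.student_id]) / len(scores[ev.student_id])
    return contexts


if __name__ == "__main__":
    stream = [
        Event("s1", "location", "training room A"),
        Event("s1", "task_assigned", "suturing-module-2"),
        Event("s1", "score", 82.5),
        Event("s1", "lesson_completed", "suturing-module-1"),
    ]
    for ctx in derive_context(stream).values():
        print(ctx)
```

An adaptive trainer could then, for example, reassign or reorder learning tasks based on a student's `avg_score` and `open_tasks`; the paper itself should be consulted for how the authors actually model and use context.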