Visualization methods for patient monitoring in anesthetic procedures using augmented reality
Motivation
In anesthetic procedures, it is common for the induction to be conducted by an assistant doctor under the supervision of a senior anesthesiologist. In these procedures, especially in time-critical settings such as the trauma room, the supervisor may move in such a way that they cannot see the patient monitoring system or other medical devices, or those devices may be obstructed by other staff in the room. Even when the devices are in sight, they might not immediately display critical information, and it may be necessary to stand next to them to assess their status. With an increasing number of staff in the room, not only is freedom of movement restricted, but the risk of miscommunication also increases due to noise. Wölfl et al. [1] interviewed medical staff in the trauma room and found “It is too loud” and “there are too many people in the trauma room” to be significant disruptive factors. These factors may lead to deviations from standard procedure routines in the trauma room and thus to potential mistakes.
Goals
In this project, a system will be developed that displays vital patient information and device status on a Microsoft HoloLens 2 augmented reality headset in real-time.
Feiner et al. [2] propose three kinds of windows for augmented reality applications.
- Surround-fixed: Windows are placed surrounding the user and have no relation to the physical world.
- Display-fixed: Windows are positioned at a fixed location within the display, independent of the user’s head orientation.
- World-fixed: The physical world is registered by the headset and windows are fixed to objects or locations in the 3D world.
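The difference between the two window types used in this project comes down to the per-frame placement rule. The following is a minimal sketch of that rule, with positions reduced to (x, y, z) world coordinates and head rotation ignored for brevity; a real AR framework such as the Mixed Reality Toolkit works with full poses, and the function names here are illustrative, not part of any API.

```python
def display_fixed_position(head_position, offset):
    """Display-fixed: the window follows the head at a fixed offset,
    so it stays at the same place in the user's field of view."""
    return tuple(h + o for h, o in zip(head_position, offset))

def world_fixed_position(anchor_position):
    """World-fixed: the window keeps its world anchor (e.g. attached to
    a medical device), regardless of where the head moves."""
    return anchor_position
```

A display-fixed window is thus recomputed every frame from the current head pose, while a world-fixed window is computed once when anchored and only re-rendered from the new viewpoint.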
The monitoring system shall support display-fixed and world-fixed windows for monitoring. Following the implementation, a study shall be conducted to research whether the augmented reality monitoring system helps the supervisor in noticing critical patient vitals and changes on the medical devices, and which of the two visualization methods is better suited for this task.
State of research
There has been prior research on assisting anesthetists with augmented reality monitoring systems.
Previous research by Schlosser et al. [3] shows that augmented-reality-based patient monitoring improved supervisors’ situation awareness and helped them notice alarms more easily. They developed a patient monitoring system that allowed anesthetists to monitor vitals from several operating rooms at the same time. Anesthetists in the study also stated that the monitoring gave them a better overview of the patients’ vitals when the conventional monitors were not in sight.
In a study on augmented reality user interfaces, Lu et al. [4] proposed several visualization methods for head-worn augmented reality. Their interfaces included surround-fixed and display-fixed approaches but omitted world-fixed windows. They found that display-fixed user interfaces were better suited for monitoring tasks than surround-fixed ones.
Liu et al. [5] showed benefits of using AR headsets in anesthetic procedures, especially when paired with the conventional monitoring system, as anesthetists still frequently looked at the regular monitors. They also identified several hardware limitations that need to be addressed before AR can be used in practice, such as wearing comfort and battery life.
While their experiment used display-fixed windows, they propose researching different ways to display patient data, including world-fixed windows; this research has not yet been conducted.
Research by Muensterer et al. [6] using a Google Glass headset showed that physicians particularly valued hands-free video and photo documentation and the ability to quickly look up information on the headset, but that practical use was still limited by the dated hardware.
Method
Implementation
The headset used in this project is the Microsoft HoloLens 2. The project will be implemented with the Mixed Reality Toolkit version 2.4 using Unity 2019.4.18f1. To access the patient’s vital parameters, the open-source capture software “VSCapture” [7] is used. Data is recorded from the Philips IntelliVue patient monitor by a computer running the capture software and then sent to the HoloLens using the Photon Realtime networking plugin. The HoloLens then displays heart rate, blood pressure, blood oxygen saturation, and “train of four” muscle relaxation level. In addition, the current status of the medicine pumps, the gas valve, and the ventilator is displayed.
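The data path above (monitor → capture computer → headset) can be sketched as a simple serialize/transmit/parse pipeline. This is an illustrative sketch only: the actual system uses VSCapture and Photon Realtime, whereas this example uses plain JSON over UDP, and the field names are invented for illustration, not the IntelliVue or VSCapture schema.

```python
import json
import socket

def encode_vitals(heart_rate, nibp_sys, nibp_dia, spo2, tof_percent):
    """Serialize one monitoring sample as a JSON datagram payload.
    Field names are hypothetical, not a real device protocol."""
    return json.dumps({
        "hr": heart_rate,              # beats per minute
        "nibp": [nibp_sys, nibp_dia],  # non-invasive blood pressure, mmHg
        "spo2": spo2,                  # oxygen saturation, percent
        "tof": tof_percent,            # train-of-four ratio, percent
    }).encode("utf-8")

def decode_vitals(payload):
    """Parse a received payload back into a dict on the headset side."""
    return json.loads(payload.decode("utf-8"))

def send_sample(sock, addr, payload):
    """Push one sample toward the headset; plain UDP stands in here
    for the Photon Realtime transport used in the actual project."""
    sock.sendto(payload, addr)
```

On the headset side, each decoded sample would then update the corresponding display-fixed or world-fixed window.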
Experiment
This study will investigate whether a supervisor in an anesthetic induction notices mistakes made by the assistant doctor and whether the use of an augmented reality monitoring system can improve the speed and accuracy of that detection. In addition, it will be investigated whether a world-fixed or a display-fixed visualization is better suited for patient monitoring.
Two actors (assistant doctor and anesthetic nurse) will deliberately make three mistakes during the procedure. It will then be measured whether each mistake was noticed and, if so, how long it took the supervisor to notice it. For this, we use a between-groups design to compare the error detection of subjects with AR monitoring to those without. The mistakes are not fatal for the patient and were selected by Dr. Ole Happel (anesthetic doctor at University Clinic Würzburg) to ensure their realism. The study will be held in the simulation center of the University Clinic, and the test subjects will all be anesthetic doctors who have previously acted as supervisors.
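The two measures described above (whether a mistake was noticed, and how long detection took) reduce to a detection rate and a mean time-to-detection per condition. The following sketch shows that reduction for the between-groups comparison; all records and numbers are made up for illustration and do not represent study data.

```python
from statistics import mean

def summarize(group):
    """Detection rate and mean time-to-detection (seconds) for one
    condition; misses carry time_s = None and are excluded from the mean."""
    detected = [r for r in group if r["detected"]]
    rate = len(detected) / len(group)
    mean_time = mean(r["time_s"] for r in detected) if detected else None
    return rate, mean_time

# One record per staged mistake per participant (fabricated examples).
ar_group = [
    {"detected": True, "time_s": 12.0},
    {"detected": True, "time_s": 8.5},
    {"detected": False, "time_s": None},
]
control_group = [
    {"detected": True, "time_s": 30.0},
    {"detected": False, "time_s": None},
    {"detected": False, "time_s": None},
]
```

A real analysis would additionally apply an appropriate between-groups significance test to the detection times; this sketch only shows the descriptive summary.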
References
[1] Wölfl, C., Kotter, J., Trupkovic, T., Grützner, P. A., & Münzberg, M. (2014). „Trauma room time out“ (TRTO): Neues Sicherheitstool zur Verbesserung der Patientensicherheit und Mitarbeiterzufriedenheit im Schockraum. Der Unfallchirurg, 117(1), 83–85. https://doi.org/10.1007/s00113-013-2552-5
[2] Feiner, S., MacIntyre, B., Haupt, M., & Solomon, E. (1993). Windows on the world: 2D windows for 3D augmented reality. Proceedings of the 6th Annual ACM Symposium on User Interface Software and Technology - UIST ’93, 145–155. https://doi.org/10.1145/168642.168657
[3] Schlosser, P. D., Grundgeiger, T., Sanderson, P. M., & Happel, O. (2019). An exploratory clinical evaluation of a head-worn display based multiple-patient monitoring application: Impact on supervising anesthesiologists’ situation awareness. Journal of Clinical Monitoring and Computing, 33(6), 1119–1127. https://doi.org/10.1007/s10877-019-00265-4
[4] Lu, F., Davari, S., Lisle, L., Li, Y., & Bowman, D. A. (2020). Glanceable AR: Evaluating information access methods for head-worn augmented reality. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 930–939. https://doi.org/10.1109/VR46266.2020.00113
[5] Liu, D., Jenkins, S. A., & Sanderson, P. M. (2009). Patient monitoring with head-mounted displays: Current Opinion in Anaesthesiology, 22(6), 796–803. https://doi.org/10.1097/ACO.0b013e32833269c1
[6] Muensterer, O. J., Lacher, M., Zoeller, C., Bronstein, M., & Kübler, J. (2014). Google Glass in pediatric surgery: An exploratory study. International Journal of Surgery, 12(4), 281–289. https://doi.org/10.1016/j.ijsu.2014.02.003
[7] Karippacheril, J., & Ho, T. (2013). Data acquisition from S/5 GE Datex anesthesia monitor using VSCapture: An open source .NET/Mono tool. Journal of Anaesthesiology Clinical Pharmacology, 29, 423–424. https://doi.org/10.4103/0970-9185.117096
Contact Persons at the University Würzburg
Dr. Florian Niebling (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
florian.niebling@uni-wuerzburg.de
Dr. Sebastian Oberdörfer (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.oberdoerfer@uni-wuerzburg.de