When you enter Operating Theater 3 of the new central operating department at Pius Hospital Oldenburg, Germany, the first thing you notice is the large, three-part mural on the back wall: a dreamlike landscape of white dunes, a tranquil sea and blue sky. “We believe that an attractive environment significantly improves the workplace atmosphere,” says Dirk Weyhe, Professor of Visceral Surgery at the University and director of the University Clinic for Visceral Surgery at Pius Hospital Oldenburg. And that, he stresses, is indispensable for the success of the operations that are performed here on a daily basis.
However, the atmosphere in the operating theater is just one aspect of a comprehensive plan to make internal organ surgery safer for patients. Weyhe and his team are pursuing this goal by integrating various new technologies, from intelligent lighting and voice assistance systems to augmented and virtual reality (AR and VR). Several computers and arm-thick data cables are embedded in the walls of the new operating theater, which also features no fewer than seven strategically positioned monitors, several cameras and a powerful Wi-Fi connection. “Complex human-machine interactions will be integral to the operating theaters of the future,” the surgeon explains. Rather than replacing humans, Weyhe believes, technology should complement and optimize their capabilities. He sees potential for new technologies in both the planning and implementation of surgical procedures, as well as numerous applications in medical training and continuing education, including anatomy courses using VR headsets and realistic organ models for practicing surgical procedures.
Weyhe is working with his colleagues Dr. Verena Uslar, Dr. Daniela Salzmann, and Dr. Timur Cetin to turn these plans into reality. The research team at the University Clinic for Visceral Surgery is a partner in various projects for developing and testing new technologies. In addition, the researchers are examining the impact of the innovations on workload and stress levels in operating room staff, a question that has received little research attention so far. “The overarching research topic in our group is patient safety,” Weyhe emphasizes. For all the top-level expertise, conscientiousness and constant advances in medicine, operations don’t always go according to plan, with potentially negative consequences for the patient. Reducing the frequency of such adverse events is the department’s declared mission.
Poor illumination leads to errors
Another factor that has received little attention is the lighting in operating theaters. “It’s clear that poor illumination of the surgical site can lead to errors, but there are hardly any studies on this,” says Weyhe. A typical problem is that doctors and nurses move around during an operation, meaning that the lighting conditions are constantly changing. Weyhe and his team are involved in the SmartOT (Smart Lighting in Operating Theaters) project led by the University of Bremen, which is working to find specialized solutions.
With funding from the German Federal Ministry of Education and Research (BMBF), the project partners are jointly developing a lighting system that eliminates shadows autonomously. Conventional surgical lamps are replaced with light arrays on the ceiling that can be controlled via gestures and voice commands. For fine-tuning, individual sections switch on and off automatically. “This allows the system to be operated in a completely sterile manner,” explains Timur Cetin, who leads the Oldenburg subproject. A prototype of the system, which is controlled by sensors, depth cameras and artificial intelligence (AI), will be used in the Pius Hospital’s new operating theater for surgical training, which is due to open its doors by the end of the year. “We have been tasked with creating a system that works in real-life conditions,” Cetin says. The research team at the University of Oldenburg analyzed the requirements for a smart lighting system at the start of the project and is currently evaluating the usability of the prototypes.
Integrating virtual reality into surgical training
The innovations under development in the VIVATOP (Versatile Immersive Virtual and Augmented Tangible OP) project could bring far more radical changes to the operating room. The researchers in this project, which is also funded by the BMBF, are investigating how virtual reality, augmented reality and 3D printing can be integrated into surgical training, surgery planning and day-to-day surgical procedures at hospitals. The overall project is led by Professor Rainer Malaka from the Digital Media Lab at the University of Bremen; in Oldenburg, the work is coordinated by Daniela Salzmann, a clinical scientist at the University of Oldenburg. Weyhe’s research group as well as other research institutions and partners from industry are also involved.
Weyhe uses a 3D-printed liver to demonstrate where the technology is heading. The model is made of transparent hard plastic. A tumor, several metastases and the boundaries between the various segments of the liver and its blood vessels are highlighted in different colors. “What makes this special is that the print is patient-specific,” Weyhe says. The model was made using data from a patient’s computed tomography scan. “This gives you a much better three-dimensional sense of the location of a tumor than you get from a two-dimensional CT image,” he explains. Until now, doctors have had to construct a three-dimensional image of the organ in their minds on the basis of the sectional images provided by computed tomography, a task that requires a great deal of experience, for example when it comes to detecting abnormalities in blood vessel pathways. In the 3D models produced by the Fraunhofer Institute for Digital Medicine MEVIS in Bremen, such abnormalities are apparent immediately. “It is vital to be aware of such vascular variations in preoperative planning,” Weyhe emphasizes.
Another technology that doctors aim to use for planning surgical procedures is the VR headset, which fully immerses the user in an artificial, three-dimensional world. One of the goals of the VIVATOP project is to enable experts in different locations to come together in a virtual space to view and discuss patient data. The headsets used for this purpose completely cover the eyes and display an operating room in which wearers can move around and operate various virtual surgical instruments using two handheld controllers. VR technology can be used, for example, to simulate the surgical removal of parts of a liver, known as liver resection. The 3D organ model in the virtual world can be rotated and manipulated in real time and also used for detailed planning, such as going through the individual steps of a surgical procedure. In addition, the tool can be used to determine the diameter and volume of tumors or removed liver tissue.
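Once a tumor has been segmented in the CT data, the volume measurement mentioned above reduces to simple arithmetic: count the voxels inside the segmented region and multiply by the volume of a single voxel. A minimal sketch of that calculation (the voxel count and CT spacing below are illustrative values, not figures from the project):

```python
def tumor_volume_ml(voxel_count: int, spacing_mm: tuple) -> float:
    """Volume of a segmented region in millilitres.

    voxel_count: number of voxels labelled as tumor in the segmentation.
    spacing_mm:  CT voxel spacing (x, y, z) in millimetres.
    1000 mm^3 correspond to 1 ml.
    """
    dx, dy, dz = spacing_mm
    return voxel_count * dx * dy * dz / 1000.0

# e.g. 40,000 tumor voxels at a typical 0.8 x 0.8 x 1.0 mm CT spacing
print(round(tumor_volume_ml(40_000, (0.8, 0.8, 1.0)), 1))  # 25.6 ml
```

In practice the voxel count would come from the segmentation mask produced during preoperative planning; the same arithmetic applies to resected liver tissue.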
Rotating organ holograms by hand
Mixed reality or AR glasses can also be used to view 3D images. The difference is that with these devices the normal environment remains visible. With the HoloLens headsets used in the VIVATOP project, the wearer can rotate, move or enlarge the organ holograms projected into their field of view simply by moving their hands. “We can project these images onto the real organ during an operation to get a better idea of the location of a tumor,” says Weyhe, who has already tested the method repeatedly. The software developed for this is already being used successfully in liver surgery, he adds.
The project team is currently working on adding sensors and other technology to lifelike, patient-specific organ models created using 3D printing so that they can be transferred to the virtual world. Users hold and touch the soft models with their hands while looking at the organ through the VR headset at the same time. “I would never have believed it, but the haptic impression greatly enhances mental immersion in the virtual world,” says Weyhe.
The VIVATOP team plans to integrate the various technologies into a web-based training program which, Weyhe believes, could be hugely beneficial in training future surgeons. The team also plans to use realistic organ models as training objects for residents to practice surgical procedures such as electrocautery or high-frequency surgery later on.
Improving understanding of anatomical relationships
VR technologies could also be useful for improving medical students’ understanding of anatomical relationships. This was demonstrated in two studies carried out by the team using a virtual “anatomy atlas” developed at the University of Bremen. “The atlas consists of a virtual operating room and a human body,” Verena Uslar explains. Users can dissect the model, cut out organs and expose muscles—all in virtual reality.
The researchers selected students from two tenth-grade classes with no prior medical knowledge as test subjects. One group was tasked with learning about anatomical relationships in the conventional way, using a textbook, while the other used the VR atlas. “In the first study it was already apparent that the VR group learned faster, made fewer mistakes and had a lot more fun,” Weyhe says. In the second study, the team tested how much the students could recall of what they had learned four months earlier—again with significantly better results for the VR group.
In view of all these innovations, Weyhe and his team believe it is also important to consider the impact of human-machine interactions on the surgical team. In the case of AI-controlled operating room lighting, the researchers expect the technology to reduce workload and stress. To test whether this is indeed the case, they are using a questionnaire developed by the US space agency NASA, known as the Task Load Index (TLX), which measures factors such as mental demand, temporal demand and frustration levels.
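The TLX score itself is straightforward to compute: six subscales (mental demand, physical demand, temporal demand, performance, effort and frustration) are each rated on a 0–100 scale, and in the weighted variant each rating is scaled by a weight (0–5) obtained from 15 pairwise comparisons between the subscales. A minimal sketch, with purely illustrative ratings and weights rather than data from the Oldenburg studies:

```python
# The six NASA-TLX workload subscales
DIMENSIONS = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings: dict) -> float:
    """Raw TLX: unweighted mean of the six 0-100 subscale ratings."""
    return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)

def weighted_tlx(ratings: dict, weights: dict) -> float:
    """Weighted TLX: each rating scaled by its pairwise-comparison weight (0-5).

    The 15 pairwise comparisons distribute exactly 15 weight points."""
    assert sum(weights.values()) == 15
    return sum(ratings[d] * weights[d] for d in DIMENSIONS) / 15

# Illustrative ratings and weights for one hypothetical operation
ratings = {"mental": 70, "physical": 20, "temporal": 60,
           "performance": 30, "effort": 55, "frustration": 40}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 1, "effort": 2, "frustration": 3}

print(round(raw_tlx(ratings), 1))                # 45.8
print(round(weighted_tlx(ratings, weights), 1))  # 56.7
```

Comparing such scores before and after introducing a technology like the AI-controlled lighting is one way to quantify whether it actually reduces the team’s workload.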
Measuring stress with a mobile EEG
The team is also employing neurophysiological methods, for which it is working closely with the University’s Department of Neuropsychology headed by Professor Stefan Debener and Dr. Martin Bleichner. The group is among the first to use a mobile EEG (electroencephalography) device to measure brain activity in everyday situations. “Together we are establishing a system that can be used to measure the additional stress caused by VR and AR technologies,” says Weyhe. Studies using the EEG device are currently in the planning phase. Another project will focus on noise pollution in the operating room, “a huge topic”, as Weyhe emphasizes. He considers it a happy coincidence that the Oldenburg neuropsychologists are already studying exposure to noise pollution in daily life, which means that thanks to the portable EEG device they will be perfectly equipped to examine its impact in the operating room, too.
The researchers first realized that new technologies can increase stress in operating room personnel during a study of minimally invasive procedures in which camera images from inside the body are transferred to a screen and converted into 3D images viewed through 3D glasses. They found that the procedure causes eyestrain: the 3D image appears to lie behind the monitor, forcing the eyes to constantly adjust to different distances, which is very tiring. “The problem can easily be solved by placing the monitor at least two meters away from the nursing staff,” Weyhe explains.