Author: Jaimie Patterson
PhD student Benjamin Killeen demonstrates Loop-X in the Johns Hopkins University’s Mock OR.

Researchers at Johns Hopkins’ Whiting School of Engineering and School of Medicine are exploring new frontiers in surgical robotics and human-robot interaction using the Brainlab Loop-X Mobile Imaging Robot. JHU is the first academic institution to acquire this cutting-edge technology for research purposes.

Loop-X is a fully robotic mobile X-ray system with six degrees of freedom, capable of aligning itself to take precise, high-resolution X-ray images of a patient’s anatomy.

“This makes it ideal for our research on smart operating rooms,” says Benjamin Killeen, a PhD student in the Whiting School’s Department of Computer Science and a member of the Laboratory for Computational Sensing and Robotics’ Advanced Robotics and Computationally AugmenteD Environments (ARCADE) Lab.

“Having Loop-X allows us to conduct world-class research on the future of image-guided surgery in a fraction of the time,” he says. “So many things that used to be time-consuming are taken care of automatically, such as obtaining high-quality 3D scans and registering images to the real world.”

Under the direction of principal investigator Mathias Unberath, who became the John C. Malone Assistant Professor of Computer Science on July 1, researchers in the ARCADE Lab are using Loop-X to conduct cutting-edge research and inform the design of intelligent operating rooms.

“We’re excited to team up with the ARCADE Lab,” says Philipp Steininger, the executive director of research at Medphoton, a Brainlab company and the developer of Loop-X Mobile Imaging Robot. “It’s breathtaking to see how quickly the talented researchers can integrate our imaging robot with other novel technologies, thereby paving the way for the future of efficient human-robot interaction in the surgical theater.”

Loop-X Mobile Imaging Robot in Hackerman Hall.

In addition to creating software that simulates the data needed to develop surgical AI algorithms, ARCADE researchers are using machine learning and Loop-X’s robotic imaging capabilities to develop an autonomous, integrated system that can automatically take and interpret X-ray scans, with the ultimate goal of simplifying complicated procedures like pelvic fracture fixation.
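The acquire-and-interpret pipeline described above can be pictured as a loop in which the robot images from candidate poses and a model labels each view. The sketch below is purely illustrative; every class and function name is hypothetical, and the "interpretation" is a stand-in for a trained machine learning model, not Brainlab's or the ARCADE Lab's actual software.

```python
"""Illustrative sketch (hypothetical names throughout): an autonomous
acquire-and-interpret loop for a robotic X-ray imaging system."""

from dataclasses import dataclass


@dataclass(frozen=True)
class Pose:
    """A simplified imaging pose: gantry angles in degrees."""
    tilt: float
    rotation: float


def acquire_image(pose: Pose) -> list[list[float]]:
    """Stand-in for a robotic acquisition call; returns a dummy 2x2 'image'."""
    return [[pose.tilt, pose.rotation], [0.0, 0.0]]


def interpret(image: list[list[float]]) -> str:
    """Stand-in for an ML model that labels the acquired view
    (here crudely, by gantry tilt encoded in the dummy image)."""
    tilt = image[0][0]
    return "inlet view" if tilt < 0 else "outlet view"


def autonomous_scan(poses: list[Pose]) -> dict[str, Pose]:
    """Acquire at each candidate pose and keep the first pose per labeled view."""
    found: dict[str, Pose] = {}
    for pose in poses:
        label = interpret(acquire_image(pose))
        found.setdefault(label, pose)
    return found


views = autonomous_scan([Pose(-40.0, 0.0), Pose(40.0, 0.0)])
print(sorted(views))  # ['inlet view', 'outlet view']
```

In a real system the interpretation step would drive the choice of the next pose, closing the loop between imaging and planning.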

They have also explored how augmented reality and voice commands can supplement Loop-X’s controls to make its use even more intuitive, thereby reducing the training time needed before surgeons can integrate the system into their operating rooms.

Loop-X is able to assist the researchers in creating intelligent systems that can perceive their surgical environment, understand high-level surgical tasks, and plan and execute image-guided surgery actions more effectively than previous systems, according to Killeen.

“This system is the core embodiment of an AI platform capable of acquiring and interpreting X-ray images,” he says.

“We’re very glad to see that our open API enables cooperating research labs to explore more futuristic scenarios and prototypes,” adds Steininger. “We’re already exploring how to integrate these advancements into our products.”

Members of the ARCADE Lab will use Loop-X—currently set up in the Mock OR in Hackerman Hall—to continue their research on expanding the capabilities of smart operating rooms.

“Working with Loop-X’s capabilities as a baseline, we’ve had to push ourselves further to imagine the future of AI-assisted technology in the operating room,” says Killeen. “If this already exists, what can we build that’s coming next?”