
3D Deformable Tracking of a Textured Thread (JHU, 2012)



This video shows 3D thread tracking experiments using a model based on non-uniform rational B-splines (NURBS) and a thread textured with a 1D color template.
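For readers unfamiliar with the curve model, the Python sketch below shows how a 3D NURBS curve is evaluated from control points, weights, and a knot vector. The helper functions `basis` and `nurbs_point` and all numerical values are illustrative placeholders; they are not taken from the tracking system shown in the video.

```python
# Minimal sketch: evaluating a 3D non-uniform rational B-spline (NURBS) curve,
# the kind of parametric model a thread can be represented with.
import numpy as np

def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left, right = 0.0, 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, degree=3):
    """Evaluate C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
    num, den = np.zeros(3), 0.0
    for i in range(len(ctrl)):
        b = basis(i, degree, u, knots) * weights[i]
        num += b * ctrl[i]
        den += b
    return num / den

# Example: a cubic curve with 5 control points and a clamped knot vector.
ctrl = np.array([[0, 0, 0], [1, 2, 0], [2, -1, 1], [3, 1, 2], [4, 0, 0]], float)
weights = np.ones(len(ctrl))
knots = [0, 0, 0, 0, 0.5, 1, 1, 1, 1]
print(nurbs_point(0.25, ctrl, weights, knots))
```

During tracking, the control points and weights would be the quantities adjusted so that the projected curve matches the thread's appearance in the images.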


Spatio-temporal Registration of Multiple Trajectories (JHU, 2011)



This video shows the synchronous replay of three independent repetitions of a knot-tying task performed with the da Vinci robot's tools. Despite differences in orientation and execution, the tool trajectories have been automatically aligned in both space and time for easier analysis and processing.
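As a rough illustration of what "aligned in space and time" means, the sketch below registers two 3D tool trajectories using a rigid Kabsch fit for the spatial part and dynamic time warping (DTW) for the temporal part. The functions `kabsch` and `dtw_path` and the synthetic trajectories are generic placeholders, not the method used in the video.

```python
# Minimal sketch of spatio-temporal trajectory alignment: DTW to find frame
# correspondences, then a rigid (Kabsch) fit on the corresponding points.
# In practice the spatial and temporal steps would typically be iterated.
import numpy as np

def kabsch(A, B):
    """Rigid transform (R, t) minimizing ||R A_i + t - B_i|| for paired 3D points."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    return R, cb - R @ ca

def dtw_path(X, Y):
    """Dynamic time warping path between two trajectories (frames x 3)."""
    n, m = len(X), len(Y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(X[i - 1] - Y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack to recover the frame correspondences.
    path, (i, j) = [], (n, m)
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        i, j = (i - 1, j - 1) if step == 0 else (i - 1, j) if step == 1 else (i, j - 1)
    return path[::-1]

# Usage: warp trajectory B onto reference A in time, then register it in space.
A = np.cumsum(np.random.randn(100, 3) * 0.01, axis=0)   # reference repetition
theta = 0.2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
B = A[::2] @ Rz.T + np.array([0.05, 0.0, 0.02])          # shorter, rotated, shifted copy
idx_a, idx_b = zip(*dtw_path(A, B))
R, t = kabsch(B[list(idx_b)], A[list(idx_a)])
B_aligned = B @ R.T + t
```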


Automatic Scissors: 3rd Arm Assistance for Tele-Surgery (JHU, 2011)



This video shows real-time experiments with an “automatic scissors” command, illustrating the concept of automatic third arm assistance during robotic tele-surgery. The thread is tracked in 3D using discrete optimization.
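The phrase "discrete optimization" can be illustrated with a small chain model: each control point of the thread chooses one displacement from a discrete candidate set, trading an image-matching (unary) cost against a smoothness (pairwise) cost, and the chain structure allows an exact dynamic-programming solution. The costs and candidate displacements below are stand-ins; the actual energy terms used in the video are not reproduced here.

```python
# Minimal sketch of discrete optimization over a chain-like object such as a
# thread: pick one label (candidate displacement) per control point so that the
# sum of unary and pairwise costs is minimal, solved exactly by dynamic programming.
import numpy as np

def track_chain(unary, pairwise):
    """unary: (N points x L labels) costs; pairwise: (L x L) smoothness costs.
    Returns the minimum-cost label per control point."""
    N, L = unary.shape
    cost = unary[0].copy()
    back = np.zeros((N, L), dtype=int)
    for i in range(1, N):
        total = cost[:, None] + pairwise            # cost of every label-to-label transition
        back[i] = np.argmin(total, axis=0)
        cost = total[back[i], np.arange(L)] + unary[i]
    labels = np.zeros(N, dtype=int)
    labels[-1] = np.argmin(cost)
    for i in range(N - 1, 0, -1):                   # backtrack the optimal labeling
        labels[i - 1] = back[i, labels[i]]
    return labels

# Toy usage: 10 control points, 27 candidate 3D displacements each.
rng = np.random.default_rng(0)
displacements = np.array([[dx, dy, dz] for dx in (-1, 0, 1)
                                       for dy in (-1, 0, 1)
                                       for dz in (-1, 0, 1)], float)
unary = rng.random((10, len(displacements)))        # placeholder image-matching cost
pairwise = np.linalg.norm(displacements[:, None] - displacements[None], axis=2)
print(track_chain(unary, pairwise))
```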


Gesture-based Manipulation of a Surgical Tool using a Kinect (JHU, 2011)



This video illustrates a da Vinci-Kinect hack in which a Kinect was used to remotely control a surgical tool of a da Vinci robot and perform fine manipulations such as needle insertions. The lack of depth perception caused by the 2D screen used in this setup makes the manipulation particularly delicate.


Sharing Control: Human Machine Collaborative Surgery (JHU, 2010)



This video illustrates Human-Machine Collaboration (shared control) between an operator and a robot during a tele-operated task. When the label “Manual” is displayed, the tools are operated manually. Using contextual recognition, several sub-tasks are automated; these segments are indicated by the label “Auto”, during which the operator's hands do not move. The trajectories of the automated motions are also displayed, with a spatial offset, in the operator's field of view. The system is demonstrated on a suturing task using a non-commercial da Vinci robot.
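A minimal sketch of the shared-control loop, under the assumption that a recognizer provides the current sub-task label: automatable sub-tasks replay a learned trajectory (“Auto”), and everything else forwards the operator's motion (“Manual”). The `SharedController` class, the pose representation, and the example trajectory are hypothetical and far simpler than the actual system.

```python
# Minimal sketch of shared control: forward the operator's motion, or replay a
# learned trajectory when the recognized sub-task is one that has been automated.
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Pose = Tuple[float, float, float]  # simplified: tool-tip position only

@dataclass
class SharedController:
    learned_trajectories: Dict[str, List[Pose]]   # sub-task name -> recorded tool path
    step: int = 0
    mode: str = "Manual"

    def update(self, operator_pose: Pose, subtask: Optional[str]) -> Pose:
        """Return the pose sent to the tool for this control cycle."""
        if subtask in self.learned_trajectories:
            traj = self.learned_trajectories[subtask]
            if self.mode != "Auto":
                self.mode, self.step = "Auto", 0   # take over at the start of the sub-task
            pose = traj[min(self.step, len(traj) - 1)]
            self.step += 1
            return pose
        self.mode = "Manual"                        # hand control back to the operator
        return operator_pose

# Usage: a hypothetical "needle_pull" sub-task is automated; everything else stays manual.
ctrl = SharedController({"needle_pull": [(0.0, 0.0, z / 100.0) for z in range(50)]})
print(ctrl.update((0.1, 0.2, 0.3), None))            # Manual: operator pose forwarded
print(ctrl.update((0.1, 0.2, 0.3), "needle_pull"))   # Auto: learned trajectory replayed
```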


Control of a Surgical Instrument with an Omni Device (JHU, 2010)




Workflow Analysis using 4D Data (TUM, 2009)




This video illustrates coarse activity recognition in a mock operating room. The activity is captured with a multi-camera system to avoid interfering with the normal workflow. Recognition is performed using hierarchical hidden Markov models and 3D motion flow features computed from a real-time reconstruction of the scene.
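The recognition idea can be sketched with a plain (non-hierarchical) HMM filtered online with the forward algorithm; in the actual system the observation likelihoods come from 3D motion-flow features and the model is hierarchical. The activity names, transition matrix, and random observation likelihoods below are placeholders.

```python
# Minimal sketch of HMM-based activity recognition: a left-to-right model over
# coarse activities, filtered online with the forward algorithm.
import numpy as np

def forward_step(alpha, transition, obs_likelihood):
    """One step of forward filtering: predict with the transition matrix,
    weight by the observation likelihood, renormalize."""
    alpha = (alpha @ transition) * obs_likelihood
    return alpha / alpha.sum()

activities = ["enter_room", "prepare_patient", "work_at_table", "leave_room"]
K = len(activities)

# Left-to-right transition matrix: mostly stay, small probability of moving on.
transition = np.eye(K) * 0.95 + np.eye(K, k=1) * 0.05
transition[-1, -1] = 1.0

alpha = np.zeros(K)
alpha[0] = 1.0                                   # start in the first activity
rng = np.random.default_rng(0)
for t in range(200):
    obs_likelihood = rng.random(K)               # placeholder for p(features_t | state)
    alpha = forward_step(alpha, transition, obs_likelihood)
print("most likely current activity:", activities[int(np.argmax(alpha))])
```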


CAMP Gladiators (TUM, 2009)




Surgical Workflow Analysis in Laparoscopy (TUM, 2008)




This video illustrates the automatic recognition of 14 surgical phases during two cases of laparoscopic cholecystectomy. Recognition is performed using a statistical model of the surgical workflow together with information about instrument usage. The videos of the two cases, which have different durations, have been automatically synchronized for visual comparison. Applications of the recognition, such as triggering events in the operating room (e.g., calling the next patient), are also illustrated.
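A toy version of phase recognition from instrument-usage signals: each phase is given a Bernoulli model of which instruments tend to be used, the phases are chained in a left-to-right workflow model, and the most likely phase sequence is decoded with Viterbi. The phase names, tool list, and probabilities are illustrative placeholders, not the model used in the video.

```python
# Minimal sketch of surgical phase recognition from binary tool-usage vectors.
import numpy as np

phases = ["preparation", "calot_dissection", "clipping_cutting", "gallbladder_dissection"]
tools = ["grasper", "hook", "clipper", "scissors"]

# p(tool used | phase): rows = phases, columns = tools (placeholder values).
usage_prob = np.array([[0.9, 0.1, 0.0, 0.0],
                       [0.8, 0.9, 0.1, 0.0],
                       [0.7, 0.1, 0.9, 0.8],
                       [0.8, 0.9, 0.1, 0.1]]).clip(0.01, 0.99)

K = len(phases)
trans = np.eye(K) * 0.95 + np.eye(K, k=1) * 0.05   # left-to-right workflow model
trans[-1, -1] = 1.0
log_trans = np.log(trans + 1e-12)

def viterbi(obs):
    """obs: (T x len(tools)) binary matrix of per-frame tool usage."""
    # Bernoulli log-likelihood of each frame under each phase.
    log_emit = obs @ np.log(usage_prob).T + (1 - obs) @ np.log(1 - usage_prob).T
    T = len(obs)
    score = np.full(K, -np.inf)
    score[0] = log_emit[0, 0]                       # assume the procedure starts in phase 0
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        total = score[:, None] + log_trans
        back[t] = np.argmax(total, axis=0)
        score = total[back[t], np.arange(K)] + log_emit[t]
    path = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):                   # backtrack the best phase sequence
        path.append(back[t, path[-1]])
    return [phases[k] for k in path[::-1]]

# Example: grasper only, then grasper + hook, then clipper + scissors.
obs = np.array([[1, 0, 0, 0]] * 5 + [[1, 1, 0, 0]] * 5 + [[0, 0, 1, 1]] * 5)
print(viterbi(obs))
```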