Vision-based motion tracking of surgical instruments in 3D space

AI Lund lunch seminar 6 October 2021

Title: Vision-based motion tracking of surgical instruments in 3D space

Speaker: Maj Stenmark, Computer Science and Region Skåne

When: 6 October at 12.00-13.15

Where: Online

Abstract

Motion tracking during live surgeries may be used to assess surgeons’ intra-operative performance, provide feedback, and predict outcome. Current assessment protocols rely on human observations, controlled laboratory settings, or tracking technologies not suitable for live operating theatres. In this study, a novel method for motion tracking of live open-heart surgery was developed and evaluated.

To track and record the motion of the instruments, a 3D-printed 'tracking die' with miniature markers was fitted to DeBakey forceps. The surgical field was recorded with a video camera mounted above the operating table, and software was developed to track the die in the recordings. The system was tested on five open-heart procedures. Surgeons were asked to report subjective, system-related concerns during live surgery and to assess the weight of the die in a blind test. The accuracy of the system was evaluated against ground-truth data generated by a robot.
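At a high level, such a pipeline detects the fiducial markers in each video frame and solves for the marker's 6-DoF pose relative to a calibrated camera. The abstract does not name the marker type or software used, so the following Python sketch is only illustrative: it assumes an OpenCV ArUco-style square marker, and the camera intrinsics, marker size, and file name are placeholder values.

import cv2
import numpy as np

# Camera intrinsics from a prior calibration (placeholder values).
camera_matrix = np.array([[1400.0,    0.0, 960.0],
                          [   0.0, 1400.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)  # assuming negligible lens distortion

MARKER_SIZE = 0.004  # marker side length in metres (hypothetical)
# 3D corners of a square marker in its own frame, in the order expected
# by SOLVEPNP_IPPE_SQUARE: top-left, top-right, bottom-right, bottom-left.
half = MARKER_SIZE / 2
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture("procedure_recording.mp4")  # hypothetical file name
trajectory = []  # per-frame 3D position of the marker in camera coordinates

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        trajectory.append(None)  # marker occluded in this frame
        continue
    # Solve for the 6-DoF pose of the first detected marker.
    found, rvec, tvec = cv2.solvePnP(
        object_points, corners[0].reshape(4, 2),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    trajectory.append(tvec.ravel() if found else None)

cap.release()

Logging the per-frame translation vector yields an instrument trajectory in camera coordinates, which could then be compared against robot-generated ground truth as described above.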

The vision-based motion tracking system was applicable to live surgeries with negligible inconvenience to the surgeons. Motion data were extracted with acceptable accuracy and speed at low computational cost.