Abstract: Capsule endoscopes have gained popularity over the last decade as minimally invasive devices for diagnosing gastrointestinal abnormalities such as colorectal cancer. While this technology offers a less invasive and more convenient alternative to traditional scopes, these capsules are passive and therefore limited to observation. With the addition of a reliable mobility system and a real-time navigation system, capsule endoscopes could transform from observational devices into active surgical tools, offering biopsy and therapeutic capabilities, and even autonomous navigation, in a single minimally invasive device. In this work, a vision system is developed to enable autonomous lumen center tracking and haustral fold identification and tracking during colonoscopy. The system is evaluated on its ability to accurately identify and track multiple haustral folds across many frames in both simulated and in vivo video, and the lumen center tracking is tested onboard a robotic endoscope platform (REP) within an active simulator to demonstrate autonomous navigation. In addition, real-time localization is demonstrated using the open-source ORB-SLAM2. The vision system successfully identified 95.6% of haustral folds in simulator frames and 70.6% in in vivo frames, while false positives occurred in less than 1% of frames. The center tracking algorithm produced in vivo center estimates with a mean error of 6.6% relative to physician estimates and allowed the REP to traverse 2 m of the active simulator in 6 minutes without intervention.
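
The abstract does not detail the lumen center tracking algorithm, so the sketch below only illustrates the general idea of estimating a lumen center from a single colonoscopy frame. It assumes the common heuristic that the lumen appears as the darkest region of the image; the function name, parameters, and OpenCV-based approach are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of a lumen-center estimate from a single colonoscopy frame.
# NOTE: this is an assumed dark-region heuristic for illustration only,
# not the algorithm used in the paper.
import cv2
import numpy as np


def estimate_lumen_center(frame_bgr, blur_ksize=15, dark_percentile=5):
    """Return an (x, y) estimate of the lumen center in pixel coordinates."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Smooth to suppress specular highlights and texture from haustral folds.
    gray = cv2.GaussianBlur(gray, (blur_ksize, blur_ksize), 0)
    # Keep only the darkest pixels (low intensity percentile), which typically
    # correspond to the deep lumen region.
    thresh = np.percentile(gray, dark_percentile)
    mask = (gray <= thresh).astype(np.uint8)
    # Centroid of the dark-pixel mask approximates the lumen center.
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None  # no dark region found (e.g., camera facing the colon wall)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


# Example usage on a video stream (file name is hypothetical):
# cap = cv2.VideoCapture("colon_sim.mp4")
# ok, frame = cap.read()
# if ok:
#     print(estimate_lumen_center(frame))
```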

Prendergast, J.M., Formosa, G.A., Heckman, C.R., Rentschler, M.E., "Autonomous Localization, Navigation and Haustral Fold Detection for Robotic Endoscopy," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October 2018.
