Localisation of Colorectal Polyps by Convolutional Neural Network Features Learnt from White Light and Narrow Band Endoscopic Images of Multiple Databases

Published: 01 Jan 2018 · Last Modified: 13 May 2025 · EMBC 2018 · CC BY-SA 4.0
Abstract: Algorithms for localising colorectal polyps have been studied extensively; however, they have often been trained and tested on the same database. In this study, we present a new application of a unified, real-time object detector based on the You-Only-Look-Once (YOLO) convolutional neural network (CNN) for localising polyps with bounding boxes in endoscopic images. The model was first pre-trained on non-medical images and then fine-tuned on colonoscopic images from three different databases, including an image set we collected from 106 patients using narrow-band (NB) imaging endoscopy. YOLO was tested on 196 white light (WL) images from an independent public database. When trained on augmented images from multiple WL databases, YOLO achieved a precision of 79.3% and a sensitivity of 68.3% on the localisation task, with a processing time of 0.06 s per frame. In conclusion, YOLO has great potential to assist endoscopists in localising colorectal polyps during endoscopy. CNN features of WL and NB endoscopic images are different and should be considered separately. A large-scale database covering different scenarios, imaging modalities and scales is lacking, but is crucial for bringing this research into clinical practice.
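The precision and sensitivity figures above come from matching predicted bounding boxes to ground-truth polyp annotations. As a minimal illustration (not the authors' evaluation code), the standard approach matches each predicted box to an unmatched ground-truth box by intersection-over-union (IoU), then counts true positives, false positives and false negatives; the IoU threshold of 0.5 used here is a common convention and an assumption, not a detail stated in the abstract:

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def localisation_metrics(preds, gts, thr=0.5):
    """Greedy IoU matching: precision = TP/(TP+FP), sensitivity = TP/(TP+FN)."""
    matched, tp = set(), 0
    for p in preds:
        for i, g in enumerate(gts):
            if i not in matched and iou(p, g) >= thr:
                matched.add(i)
                tp += 1
                break
    fp, fn = len(preds) - tp, len(gts) - tp
    precision = tp / (tp + fp) if preds else 0.0
    sensitivity = tp / (tp + fn) if gts else 0.0
    return precision, sensitivity

# One correct detection and one spurious box against a single annotated polyp:
p, s = localisation_metrics([(0, 0, 10, 10), (20, 20, 30, 30)],
                            [(1, 1, 10, 10)])
# precision = 0.5, sensitivity = 1.0
```

A per-frame variant of this computation, aggregated over the 196 test images, would yield the reported 79.3% precision and 68.3% sensitivity.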
