Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Data reduction technique for capsule endoscopy

Abstract

Advancements in the fields of IoT and sensors generate huge amounts of data. These data serve as input to knowledge discovery and machine learning, producing unprecedented results in trend analysis, classification, prediction, fraud and fault detection, drug discovery, artificial intelligence, and more. One such cutting-edge technology is capsule endoscopy (CE). CE is a noninvasive, non-sedative, patient-friendly, and particularly child-friendly alternative to conventional endoscopy for the diagnosis of gastrointestinal tract diseases. However, CE generates approximately 60,000 images from each video. Further, when computer vision and pattern recognition techniques are applied to CE images for disease detection, the resulting feature vector contains 181,548 values for a single image. A machine learning task for computer-aided disease detection would involve no fewer than thousands of images, making it highly data intensive. Processing such a huge amount of data is expensive in terms of computation, memory, and time. Hence, a data reduction technique must be employed in such a way that minimal information is lost. It is important to note that features must be discriminative; redundant or correlated data is of little use. In this study, a data reduction technique is designed with the aim of maximizing information gain: it retains features exhibiting high variance and low correlation. The reduced feature vector is fed to a computer-based diagnosis system in order to detect ulcers in the gastrointestinal tract. The proposed data reduction technique reduces the feature set by 98.34%. © 2020, Springer Nature Singapore Pte Ltd.
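As a point of arithmetic, a 98.34% reduction of a 181,548-dimensional feature vector would leave roughly 3,014 features per image. Below is a minimal sketch of the kind of variance- and correlation-based feature selection the abstract describes; the function name, thresholds, and matrix shapes are illustrative assumptions, not the authors' published method.

import numpy as np

def reduce_features(X, var_quantile=0.5, corr_thresh=0.9):
    # Hypothetical two-stage filter (thresholds are assumed).
    # X: (n_samples, n_features) feature matrix extracted from CE images.
    # Stage 1: keep only high-variance features, since low-variance
    # features carry little discriminative information.
    variances = X.var(axis=0)
    keep = np.where(variances > np.quantile(variances, var_quantile))[0]
    # Stage 2: greedily drop one member of each highly correlated pair
    # so that the retained features stay mutually decorrelated.
    # (For the full 181,548-dim CE vector this correlation matrix would
    # be far too large to hold at once and would need blockwise handling.)
    corr = np.corrcoef(X[:, keep], rowvar=False)
    selected = []
    for i in range(len(keep)):
        if all(abs(corr[i, j]) < corr_thresh for j in selected):
            selected.append(i)
    return keep[selected]

# Example usage on a random stand-in for a CE feature matrix:
# 200 images with 1,000 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1000))
idx = reduce_features(X)
print(X[:, idx].shape)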
