Article type: Research Article
Authors: Ajitha Gladis, K. P.[a] | Roja Ramani, D.[b] | Mohana Suganthi, N.[c] | Linu Babu, P.[d],*
Affiliations: [a] Department of Information Technology, CSI Institute of Technology, Thovalai, India | [b] Department of Computer Science and Engineering, New Horizon College of Engineering, Bengaluru, India | [c] Department of Computer Science and Engineering, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Chennai, India | [d] Department of Electronics and Communication Engineering, IES College of Engineering, Thrissur, India
Correspondence: [*] Corresponding author: Linu Babu P, Department of Electronics and Communication Engineering, IES College of Engineering, Thrissur, India. E-mail: [email protected].
Abstract: BACKGROUND: Gastrointestinal tract (GIT) diseases affect the entire digestive system, from the mouth to the anus. Wireless Capsule Endoscopy (WCE) is an effective diagnostic instrument for GIT diseases. Nevertheless, accurately identifying varied lesion features, such as irregular sizes, shapes, colors, and textures, remains challenging. OBJECTIVE: Several computer vision algorithms have been introduced to tackle these challenges, but many rely on handcrafted features, resulting in inaccuracies in many instances. METHODS: In this work, a novel Deep SS-Hexa model is proposed that combines two different deep learning structures to extract two different kinds of features from WCE images for detecting various GIT ailments. The gathered images are denoised with a weighted median filter to remove noisy distortions, and the images are augmented to enlarge the training data. The structural and statistical (SS) feature extraction process is divided into two phases for the analysis of distinct regions of the gastrointestinal tract. In the first phase, statistical features of the image are retrieved using MobileNet with the SiLU activation function. In the second phase, the segmented intestine images are transformed into structural features to learn local information. These SS features are fused in parallel, and the most relevant features are selected with the walrus optimization algorithm. Finally, a deep belief network (DBN) classifies the GIT images into six (hexa) classes, namely normal, ulcer, pylorus, cecum, esophagitis, and polyps, on the basis of the selected features. RESULTS: The proposed Deep SS-Hexa model attains an overall average accuracy of 99.16% in GIT disease detection on the KVASIR and KID datasets, achieving this high accuracy with minimal computational cost.
CONCLUSIONS: On the KVASIR dataset, the proposed Deep SS-Hexa model improves overall accuracy by 0.04% and 0.80% over GastroVision and the genetic-algorithm-based method, respectively; on the KID dataset, it improves accuracy by 0.60% and 1.21% over Modified U-Net and WCENet, respectively.
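The SiLU activation mentioned in the METHODS section is the sigmoid-weighted linear unit, silu(x) = x · σ(x), which damps negative inputs smoothly instead of zeroing them as ReLU does. A minimal NumPy sketch for illustration only (the paper's actual MobileNet implementation is not given here):

```python
import numpy as np

def silu(x):
    """SiLU (sigmoid-weighted linear unit): x * sigmoid(x)."""
    return x * (1.0 / (1.0 + np.exp(-x)))

# Negative inputs are attenuated rather than clipped to zero,
# which helps gradient flow in deep feature extractors.
print(silu(np.array([-2.0, 0.0, 2.0])))
```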
Keywords: Gastrointestinal tract, wireless capsule endoscopy, mobile network, structural and statistical features, deep belief network
DOI: 10.3233/THC-240603
Journal: Technology and Health Care, vol. 32, no. 6, pp. 4453-4473, 2024