Article type: Research Article
Authors: Rajendran, Aishwarya [a, *] | Ganesan, Sumathi [a] | Rathis Babu, T.K.S. [b]
Affiliations: [a] Department of Computer Science and Engineering, Faculty of Engineering and Technology, Annamalai University, Annamalai Nagar, Tamil Nadu, India | [b] Department of Computer Science and Engineering, Sridevi Womens Engineering College, Hyderabad, India
Correspondence: [*] Corresponding author. Aishwarya Rajendran, Research Scholar, Department of Computer Science and Engineering, Faculty of Engineering and Technology, Annamalai University, Annamalai Nagar, Tamil Nadu, India. E-mail: [email protected].
Abstract: Brain tumors grow in irregular shapes deep within brain tissue and can lead to cancer. Identifying and categorizing human brain tumors is a slow yet essential task for medical experts. Automated diagnosis is therefore widely used to strengthen diagnostic capability and achieve superior accuracy in brain tumor detection. Although research continues to improve detection performance, segmenting brain tumors remains highly challenging because of variability in tumor type, contrast, image modality, and other factors. To address these challenges, a novel classification method is introduced using segmentation and machine learning approaches. Initially, the required images are collected from benchmark data sources. The input images undergo a pre-processing stage performed via "Contrast Limited Adaptive Histogram Equalization (CLAHE) and filtering methods". The pre-processed images are then given as input to two classifier models, "Residual Network (ResNet) and Gated Recurrent Unit (GRU)", which label each image as normal or abnormal. In the second part, each abnormal image serves as input to the segmentation step, where the relevant texture and spatial features are extracted. The resulting features are then optimized, with the optimal subset acquired through the Adaptive Coyote Optimization Algorithm (ACOA). The selected features are fed into machine learning models, "Support Vector Machine (SVM), Artificial Neural Network (ANN), and Random Forest (RF)", to render the segmented image.
Finally, a hybrid classifier named Hybrid ResGRU is developed by integrating ResNet and GRU, with its hyperparameters tuned optimally using the developed ACOA; it classifies each abnormal image as belonging to the benign stage or the malignant stage. The experimental results are evaluated, and performance is analyzed with various metrics. Hence, the proposed classification model ensures effective segmentation and classification performance.
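The CLAHE pre-processing step mentioned in the abstract can be sketched as follows. This is a minimal pure-Python illustration of the idea (per-tile histogram equalization with a clip limit); the clip limit and intensity range are illustrative assumptions, and the bilinear blending between neighbouring tiles used by full CLAHE is omitted for brevity. The paper itself does not specify these parameters here.

```python
def clahe_tile(pixels, levels=256, clip_limit=0.02):
    """Equalize one tile's histogram, clipping bins above the limit.

    `pixels` is a flat list of integer intensities in [0, levels).
    The clip limit (fraction of the tile's pixel count) is an assumption.
    """
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Contrast limiting: clip each bin and redistribute the excess uniformly,
    # which prevents noise in flat regions from being over-amplified.
    limit = max(1, int(clip_limit * n))
    excess = 0
    for i in range(levels):
        if hist[i] > limit:
            excess += hist[i] - limit
            hist[i] = limit
    bonus = excess // levels
    hist = [h + bonus for h in hist]
    # Standard histogram equalization on the clipped histogram: map each
    # intensity through the scaled cumulative distribution function.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    scale = (levels - 1) / cdf[-1]
    return [round(cdf[p] * scale) for p in pixels]

# A low-contrast tile (values clustered in 100-120) is stretched
# across the full intensity range after equalization.
tile = [100] * 10 + [110] * 10 + [120] * 10
equalized = clahe_tile(tile)
```

In the full method a library implementation (e.g. OpenCV's `cv2.createCLAHE`) would typically be applied over an 8x8 tile grid with interpolation between tiles; the sketch above only shows the per-tile core.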
Keywords: Brain tumor segmentation and classification, adaptive coyote optimization algorithm, residual network, gated recurrent unit, ensemble machine learning-based tumor segmentation, deep learning-based classification
DOI: 10.3233/JIFS-233546
Journal: Journal of Intelligent & Fuzzy Systems, vol. Pre-press, no. Pre-press, pp. 1-15, 2023