Article type: Research Article
Authors: Zhao, Tianhua [a, b] | Qi, Shouliang [a, b, *] | Yue, Yong [c] | Zhang, Baihua [a] | Li, Jingxu [d] | Wen, Yanhua [e] | Yao, Yudong [f] | Qian, Wei [a] | Guan, Yubao [e, *]
Affiliations: [a] College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China | [b] Key Laboratory of Intelligent Computing in Medical Image, Ministry of Education, Northeastern University, Shenyang, China | [c] Department of Radiology, Shengjing Hospital of China Medical University, Shenyang, China | [d] Department of Radiology, The First Affiliated Hospital of Guangzhou Medical University, Guangzhou, China | [e] Department of Radiology, The Fifth Affiliated Hospital of Guangzhou Medical University, Guangzhou, China | [f] Department of Electrical and Computer Engineering, Stevens Institute of Technology, Hoboken, NJ, USA
Correspondence: [*] Corresponding authors: Shouliang Qi, College of Medicine and Biological Information Engineering, Northeastern University, Shenyang, China. Tel.: +86 24 8368 0230, Fax: +86 24 8368 1955; E-mail: [email protected] and Yubao Guan, Department of Radiology, The Fifth Affiliated Hospital of Guangzhou Medical University, Guangzhou, China. E-mail: [email protected].
Abstract: BACKGROUND: Pulmonary granulomatous nodules (GN) with spiculation or lobulation have a morphological appearance similar to that of solid lung adenocarcinoma (SADC) on computed tomography (CT). However, these two kinds of solid pulmonary nodules (SPN) differ in malignancy and are sometimes misdiagnosed. OBJECTIVE: This study aims to automatically predict the malignancy of SPNs with a deep learning model. METHODS: A chimeric label with self-supervised learning (CLSSL) is proposed to pre-train a ResNet-based network (CLSSL-ResNet) for distinguishing isolated atypical GN from SADC in CT images. The malignancy, rotation, and morphology labels are integrated into a chimeric label and used to pre-train a ResNet50. The pre-trained ResNet50 is then transferred and fine-tuned to predict the malignancy of SPN. Two image datasets of 428 subjects (Dataset1, 307; Dataset2, 121) from different hospitals are collected. Dataset1 is divided into training, validation, and test data at a ratio of 7:1:2 to develop the model. Dataset2 is used as an external validation dataset. RESULTS: CLSSL-ResNet achieves an area under the ROC curve (AUC) of 0.944 and an accuracy (ACC) of 91.3%, much higher than the accuracy of the consensus of two experienced chest radiologists (77.3%). CLSSL-ResNet also outperforms other self-supervised learning models and many counterparts with other backbone networks. On Dataset2, the AUC and ACC of CLSSL-ResNet are 0.923 and 89.3%, respectively. Additionally, the ablation experiment indicates the higher efficiency of the chimeric label. CONCLUSION: CLSSL with morphology labels can increase the feature-representation ability of deep networks. As a non-invasive method, CLSSL-ResNet can distinguish GN from SADC via CT images and may support clinical diagnosis after further validation.
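The pre-train-then-fine-tune idea in METHODS can be illustrated with a small, hedged sketch. The following Python (PyTorch/torchvision) code is not the authors' implementation: the label-space sizes, the joint-class encoding of the chimeric label, the input shape, and all function names are assumptions introduced only to show how malignancy, rotation, and morphology labels could be folded into one pre-training target for a ResNet50 before fine-tuning on malignancy alone.

```python
# Hedged sketch (not the paper's code): chimeric-label pre-training of a ResNet50,
# followed by transfer and fine-tuning for binary GN-vs-SADC classification.
import torch
import torch.nn as nn
from torchvision.models import resnet50

# Hypothetical label-space sizes; the abstract does not specify the exact encoding.
N_MALIGNANCY = 2    # GN vs. SADC
N_ROTATION = 4      # e.g., 0/90/180/270 degree rotations
N_MORPHOLOGY = 2    # e.g., spiculation/lobulation present vs. absent
N_CHIMERIC = N_MALIGNANCY * N_ROTATION * N_MORPHOLOGY  # one joint class per combination


def chimeric_label(malignancy: int, rotation: int, morphology: int) -> int:
    """Fold the three labels into a single class index (one possible encoding)."""
    return (malignancy * N_ROTATION + rotation) * N_MORPHOLOGY + morphology


def build_pretraining_model() -> nn.Module:
    """ResNet50 whose classification head predicts the joint (chimeric) class."""
    model = resnet50(weights=None)
    model.fc = nn.Linear(model.fc.in_features, N_CHIMERIC)
    return model


def to_finetuning_model(pretrained: nn.Module) -> nn.Module:
    """Transfer the pre-trained backbone; replace the head for malignancy prediction."""
    pretrained.fc = nn.Linear(pretrained.fc.in_features, N_MALIGNANCY)
    return pretrained


# Usage sketch: pre-train with cross-entropy on chimeric labels, then fine-tune
# the transferred network on malignancy labels only.
net = build_pretraining_model()
criterion = nn.CrossEntropyLoss()
x = torch.randn(4, 3, 224, 224)                      # dummy CT patches (3-channel ResNet input)
y = torch.tensor([chimeric_label(1, 2, 0), 0, 5, 9])  # dummy chimeric targets
loss = criterion(net(x), y)                          # pre-training loss (one step)
finetune_net = to_finetuning_model(net)              # then train with malignancy labels
```

Encoding the three labels as a single joint class is only one plausible reading of "chimeric label"; a multi-head network with one loss per label type would be an equally valid sketch of the same idea.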
Keywords: Lung adenocarcinoma, granulomatous nodules, chimeric label, self-supervised learning, deep learning, classification
DOI: 10.3233/XST-230063
Journal: Journal of X-Ray Science and Technology, vol. 31, no. 5, pp. 981-999, 2023