Article type: Research Article
Authors: Ferre, Romuald [a,*] | Elst, Janne [a] | Senthilnathan, Seanthan [a] | Lagree, Andrew [b] | Tabbarah, Sami [b] | Lu, Fang-I [c] | Sadeghi-Naini, Ali [d] | Tran, William T. [b,e,f] | Curpen, Belinda [a,g]
Affiliations: [a] Department of Medical Imaging, Sunnybrook Health Sciences Centre, Toronto, ON, Canada | [b] Department of Radiation Oncology, Sunnybrook Health Sciences Centre, Toronto, ON, Canada | [c] Department of Laboratory Medicine and Molecular Diagnostics, Sunnybrook Health Sciences Centre, Toronto, ON, Canada | [d] Department of Electrical Engineering and Computer Science, York University, Toronto, ON, Canada | [e] Department of Radiation Oncology, University of Toronto, Toronto, ON, Canada | [f] Temerty Centre for AI Research and Education, University of Toronto, Toronto, ON, Canada | [g] Department of Medical Imaging, University of Toronto, Toronto, ON, Canada
Correspondence: [*] Corresponding author: Romuald Ferre, Department of Medical Imaging, Sunnybrook Health Sciences Centre, 2075 Bayview Avenue, Toronto, ON, M4N 3M5, Canada. Tel.: +1 (416) 480 6130. E-mail: [email protected]
Abstract:
OBJECTIVES: Early diagnosis of triple-negative (TN) and human epidermal growth factor receptor 2 positive (HER2+) breast cancer is important because of their increased risk of micrometastatic spread, which necessitates early treatment, and for guiding targeted therapies. This study aimed to evaluate the diagnostic performance of machine learning (ML) classification of newly diagnosed breast masses into TN versus non-TN (NTN) and HER2+ versus HER2-negative (HER2−) breast cancer, using radiomic features extracted from grayscale ultrasound (US) B-mode images.
MATERIALS AND METHODS: A retrospective chart review identified 88 female patients who underwent diagnostic breast US imaging, had invasive malignancy confirmed on pathology, and had receptor status determined on immunohistochemistry. Patients were classified as TN, NTN, HER2+, or HER2− for ground-truth labelling. For image analysis, breast masses were manually segmented by a breast radiologist. Radiomic features were extracted per image and used for predictive modelling. Supervised ML classifiers included logistic regression, k-nearest neighbour, and Naïve Bayes. Classification performance measures were calculated on an independent (unseen) test set; the area under the receiver operating characteristic curve (AUC), sensitivity (%), and specificity (%) were reported for each classifier.
RESULTS: The logistic regression classifier demonstrated the highest AUC: 0.824 (sensitivity: 81.8%, specificity: 74.2%) for the TN sub-group and 0.778 (sensitivity: 71.4%, specificity: 71.6%) for the HER2 sub-group.
CONCLUSION: ML classifiers demonstrated high diagnostic accuracy in classifying TN versus NTN and HER2+ versus HER2− breast cancers using US images. Identifying more aggressive breast cancer subtypes early in the diagnostic process could help achieve better prognoses by prioritizing clinical referral and prompting adequate early treatment.
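The modelling pipeline the abstract describes (radiomic feature vectors, supervised classifiers, then AUC/sensitivity/specificity on a held-out test set) can be sketched as below. This is a minimal illustration only: the feature matrix and labels are synthetic stand-ins, and the authors' actual radiomic features, hyperparameters, and train/test split are not stated in the abstract.

```python
# Sketch of the abstract's pipeline: features -> three supervised
# classifiers -> AUC, sensitivity, specificity on an unseen test set.
# Data are synthetic; the paper's real features are not reproduced here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
n, p = 88, 20  # 88 patients as in the study; 20 features is an assumption
X = rng.normal(size=(n, p))
# Synthetic binary labels (e.g. TN vs NTN), weakly tied to two features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
scaler = StandardScaler().fit(X_tr)          # fit scaling on training data only
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

results = {}
for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("naive Bayes", GaussianNB())]:
    clf.fit(X_tr, y_tr)
    prob = clf.predict_proba(X_te)[:, 1]
    pred = (prob >= 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    results[name] = {"AUC": roc_auc_score(y_te, prob),
                     "sensitivity": tp / (tp + fn),
                     "specificity": tn / (tn + fp)}

for name, m in results.items():
    print(f"{name}: AUC={m['AUC']:.3f}, "
          f"sens={m['sensitivity']:.1%}, spec={m['specificity']:.1%}")
```

Note that the scaler is fit on the training split only, so no information from the unseen test set leaks into the model, mirroring the abstract's requirement that performance be measured on an independent test set.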
Keywords: Machine learning, ultrasound, triple negative breast cancer, HER2+ breast cancer
DOI: 10.3233/BD-220018
Journal: Breast Disease, vol. 42, no. 1, pp. 59-66, 2023