Article type: Research Article
Authors: Yang, Fan [a,b,1] | Weng, Xin [a,b,1] | Wu, Yuhui [c] | Miao, Yuehong [a,b] | Lei, Pinggui [c,*] | Hu, Zuquan [a,b,d,*]
Affiliations: [a] School of Biology & Engineering (School of Health Medicine Modern Industry), Guizhou Medical University, Guiyang, Guizhou Province, China | [b] Immune Cells and Antibody Engineering Research Center of Guizhou Province, Key Laboratory of Biology and Medical Engineering, Guizhou Medical University, Guiyang, Guizhou Province, China | [c] Department of Radiology, The Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China | [d] Key Laboratory of Infectious Immune and Antibody Engineering of Guizhou Province, Engineering Research Center of Cellular Immunotherapy of Guizhou Province, Guizhou Medical University, Guiyang, Guizhou Province, China
Correspondence: [*] Corresponding authors: Pinggui Lei, Department of Radiology, The Affiliated Hospital of Guizhou Medical University, Guiyang, Guizhou Province, China. E-mail: [email protected] and Zuquan Hu, School of Biology & Engineering (School of Health Medicine Modern Industry), Guizhou Medical University, Guiyang, Guizhou Province, China. E-mail: [email protected].
Note: [1] These authors contributed equally to this work.
Abstract: BACKGROUND: Ulna and radius segmentation of dual-energy X-ray absorptiometry (DXA) images is essential for measuring bone mineral density (BMD). OBJECTIVE: To develop and test a novel deep learning network architecture for robust and efficient ulna and radius segmentation on DXA images. METHODS: This study used two datasets comprising 360 cases in total. The first dataset, with 300 cases, was randomly divided into five groups for five-fold cross-validation; the second dataset, with 60 cases, was used for independent testing. A deep learning architecture based on residual U-Net with a dual residual dilated convolution module and a feature fusion block (DFR-U-Net) was developed to enhance segmentation accuracy of ulna and radius regions on DXA images. The Dice similarity coefficient (DSC), Jaccard index, and Hausdorff distance (HD) were used to evaluate segmentation performance. A one-tailed paired t-test was used to assess the statistical significance of differences between our method and other deep learning-based methods (P < 0.05 indicates statistical significance). RESULTS: Our method achieved promising segmentation performance, with DSC of 98.56±0.40% and 98.86±0.25%, Jaccard of 97.14±0.75% and 97.73±0.48%, and HD of 6.41±11.67 pixels and 8.23±7.82 pixels for the ulna and radius, respectively. The statistical analysis showed that our method yielded significantly higher performance than the other deep learning-based methods. CONCLUSIONS: The proposed DFR-U-Net achieved higher segmentation performance for the ulna and radius on DXA images than previous work and other deep learning approaches. This methodology has the potential to be applied to ulna and radius segmentation to help doctors measure BMD more accurately in the future.
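The evaluation metrics named in the abstract have standard definitions, which a minimal sketch can make concrete. The code below is not the paper's implementation; it is an illustrative NumPy/SciPy version of DSC, Jaccard, and symmetric Hausdorff distance on binary masks, plus a one-tailed paired t-test over hypothetical per-case DSC values (the toy masks and score arrays are invented for demonstration).

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from scipy.stats import ttest_rel

def dice(pred, gt):
    """Dice similarity coefficient (DSC) between two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard(pred, gt):
    """Jaccard index (intersection over union) between two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    return inter / np.logical_or(pred, gt).sum()

def hausdorff(pred, gt):
    """Symmetric Hausdorff distance in pixels between two binary masks."""
    p, g = np.argwhere(pred), np.argwhere(gt)
    return max(directed_hausdorff(p, g)[0], directed_hausdorff(g, p)[0])

# Toy 4x4 masks standing in for a predicted and ground-truth bone region.
pred = np.zeros((4, 4), dtype=bool); pred[1:3, 1:3] = True
gt = np.zeros((4, 4), dtype=bool);   gt[1:3, 1:4] = True
print(dice(pred, gt))       # 0.8
print(jaccard(pred, gt))    # ~0.667
print(hausdorff(pred, gt))  # 1.0

# One-tailed paired t-test on hypothetical per-case DSC values;
# H1: our method's DSC exceeds the baseline's (alternative='greater').
ours = np.array([0.985, 0.990, 0.982, 0.988])
base = np.array([0.970, 0.975, 0.968, 0.973])
t_stat, p_value = ttest_rel(ours, base, alternative='greater')
```

Because the test is paired, each case's score under one method is matched with the same case's score under the other, which is appropriate when all methods are evaluated on the same cross-validation folds.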
Keywords: DXA images, deep learning, ulna and radius segmentation, feature fusion block, dual residual dilated convolution module
DOI: 10.3233/XST-230010
Journal: Journal of X-Ray Science and Technology, vol. 31, no. 3, pp. 641-653, 2023
Publisher: IOS Press