J Shanghai Jiaotong Univ Sci ›› 2024, Vol. 29 ›› Issue (6): 945-957. DOI: 10.1007/s12204-024-2743-y
TshFNA-Examiner: A Nuclei Segmentation and Cancer Assessment Framework for Thyroid Cytology Image
KE Jing1*(柯晶), ZHU Junchao2 (朱俊超), YANG Xin1(杨鑫), ZHANG Haolin3 (张浩林), SUN Yuxiang1(孙宇翔), WANG Jiayi1(王嘉怡), LU Yizhou4(鲁亦舟), SHEN Yiqing5(沈逸卿), LIU Sheng6*(刘晟), JIANG Fusong7(蒋伏松), HUANG Qin8*(黄琴)
Accepted:
2023-10-23
Online:
2024-11-28
Published:
2024-11-28
Abstract: Thyroid fine-needle aspiration (FNA) can be used to assess cancer risk, obtain prognostic information, and guide follow-up care or surgery. The digitization of biopsy specimens and advances in deep learning have driven the development of computational pathology. However, a systematic diagnostic system for complex cytopathology images that matches the baseline performance of physicians is still lacking. In this study, we designed a deep learning framework, named TshFNA-Examiner, for the quantitative assessment of cancer risk in thyroid FNA images. In TshFNA-Examiner, cell-dense regions strongly correlated with diagnostically relevant information are detected by a nuclei segmentation neural network; cell-level image patches are categorized according to The Bethesda System for Reporting Thyroid Cytopathology (TBSRTC) by a classification neural network, which is further enhanced with unlabeled data through a semi-supervised network. A cohort of 333 thyroid FNA cases collected from 2019 to 2022, covering categories I to VI, was studied, and pixel-level and image-level patch annotations were completed. TshFNA-Examiner was evaluated with comprehensive metrics across multiple tasks to demonstrate its superiority over state-of-the-art deep learning methods. The average performance of cell region segmentation reached a Dice coefficient of 0.931 and a Jaccard index of 0.871. The cancer risk classifier achieved a macro F1-score of 0.959, a macro AUC of 0.998, and an accuracy of 0.959 under the TBSRTC standard. By leveraging abundant unlabeled data with semi-supervised learning, these metrics improved to a macro F1-score of 0.970, a macro AUC of 0.999, and an accuracy of 0.970. In clinical practice, TshFNA-Examiner can help cytologists conveniently visualize the outputs of the deep learning networks to facilitate final decision-making.
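Note on the reported metrics: the Dice coefficient and Jaccard index quoted above are standard overlap measures between a predicted segmentation mask and a ground-truth mask. The Python sketch below is illustrative only, not the authors' released code, and uses hypothetical toy masks to show how these two metrics are conventionally computed.

```python
# Illustrative sketch (not the authors' implementation): Dice coefficient and
# Jaccard index between a predicted binary nuclei mask and a ground-truth mask.
import numpy as np


def dice_coefficient(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2|P ∩ G| / (|P| + |G|) for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return (2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps)


def jaccard_index(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7) -> float:
    """Jaccard (IoU) = |P ∩ G| / |P ∪ G| for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return (intersection + eps) / (union + eps)


if __name__ == "__main__":
    # Hypothetical 4x4 masks, purely for demonstration.
    gt = np.array([[0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
    pred = np.array([[0, 1, 1, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 0],
                     [0, 0, 0, 0]])
    print(f"Dice:    {dice_coefficient(pred, gt):.3f}")  # ~0.857
    print(f"Jaccard: {jaccard_index(pred, gt):.3f}")     # ~0.750
```

The classification metrics reported above (macro F1-score, macro AUC, accuracy) can be computed analogously with standard routines such as scikit-learn's f1_score(..., average='macro') and roc_auc_score(..., average='macro', multi_class='ovr').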
CLC number:
KE Jing1(柯晶), ZHU Junchao2 (朱俊超), YANG Xin1(杨鑫), ZHANG Haolin3 (张浩林), SUN Yuxiang1(孙宇翔), WANG Jiayi1(王嘉怡), LU Yizhou4(鲁亦舟), SHEN Yiqing5(沈逸卿), LIU Sheng6(刘晟), JIANG Fusong7(蒋伏松), HUANG Qin8(黄琴). TshFNA-Examiner: A Nuclei Segmentation and Cancer Assessment Framework for Thyroid Cytology Image[J]. J Shanghai Jiaotong Univ Sci, 2024, 29(6): 945-957.