Citation: Ye S J, Wang Y X. Graph neural network-based WSI cancer survival prediction method[J]. Opto-Electron Eng, 2024, 51(4): 240011. doi: 10.12086/oee.2024.240011

Graph neural network-based WSI cancer survival prediction method

    Fund Project: Supported by the Natural Science Foundation of Shanghai (22ZR1443700)
  • Abstract: Whole-slide images (WSIs) are the main basis for cancer diagnosis and prognosis, but they are extremely large, contain complex spatial relationships, and vary widely in style. Because detailed annotations are usually unavailable, traditional computational pathology methods struggle with WSI tasks. To address these challenges, this paper proposes BC-GraphSurv, a WSI survival prediction model based on graph neural networks. Specifically, transfer-learning pre-training is used to extract features that carry spatial relationship information and to construct the pathological relationship topology of the WSI. The extracted features are then processed by two prediction branches: an improved graph attention network (GAT) and a graph convolutional network (GCN). The GAT branch incorporates edge attributes and a global perception module, while the GCN branch supplements local details; together they adapt to differences in WSI style and exploit the topological structure to handle spatial relationships and distinguish subtle pathological environments. Experimental results on the TCGA-BRCA dataset demonstrate the effectiveness of BC-GraphSurv: it achieves a C-index of 0.795, an improvement of 0.0409 over current state-of-the-art survival prediction models, underscoring its robustness in addressing the challenges of WSI-based cancer diagnosis and prognosis.
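The metric quoted above is the concordance index (C-index): the fraction of comparable patient pairs for which the model assigns the higher risk to the patient who dies earlier. The snippet below is a minimal sketch of this computation, not the paper's evaluation code; the function and variable names are illustrative, higher risk is assumed to mean worse prognosis, censored patients are only compared as the later member of a pair, and ties in risk count as one half.

```python
# Minimal C-index sketch for survival prediction evaluation (illustrative only).
import numpy as np

def concordance_index(times, risks, events):
    """times: observed survival/censoring times
    risks: predicted risk scores (higher = worse prognosis)
    events: 1 if death observed, 0 if censored"""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if events[i] == 0:      # a pair is comparable only if the earlier
            continue            # time corresponds to an observed event
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5   # ties in predicted risk count as half
    return concordant / comparable if comparable else 0.0

# Example: a perfectly ordered ranking yields a C-index of 1.0
print(concordance_index(np.array([2.0, 5.0, 9.0]),
                        np.array([0.9, 0.5, 0.1]),
                        np.array([1, 1, 0])))
```

A C-index of 0.5 corresponds to random ranking, while 1.0 corresponds to a perfect ordering of patients by risk.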
  • Overview: In this study, we present BC-GraphSurv, a graph neural network model for breast cancer survival prediction from whole-slide images (WSIs). WSIs are very large, contain complex spatial relationships, and vary in style; BC-GraphSurv addresses these challenges through a pipeline of four steps: transfer-learning pre-training with HF-Net, compression and fusion of similar features, construction of graph-structured features, and learning with WA-GAT and MP-GCN.

    The model begins with a transfer-learning pre-training strategy in which HF-Net is used to build the pathological relationship topology of a WSI, enabling effective extraction of features and spatial relationship information. HF-Net is trained on a breast cancer tumor classification dataset, adapting a general backbone network to the complexity of tumor structures and tissue texture; this reduces noise from non-cancerous regions and sharpens the distinction between cancerous and non-cancerous areas. The feature extraction network combines convolutional neural networks (CNNs) with self-attention, and a feature transfer module carries the pre-trained pathology knowledge over to the survival task. Coupled with the integration of spatial correlation and semantic similarity, this module enables compressed graph modeling and the extraction of contextual features that are crucial for survival prediction.

    To handle the specific challenges of WSI tasks, BC-GraphSurv improves the graph attention network (GAT) into the Whole Association Graph Attention Network (WA-GAT). This prediction branch applies cross-attention between node and edge features and adds a global perception module, while a dense graph convolutional network (GCN) branch supplies fine-grained local details. Integrating WA-GAT with the GCN branch improves adaptability to diverse WSI styles and spatial differences, allowing the model to process spatial information effectively.

    Ablation experiments assess the contribution of each module and improvement, and comparative experiments against a range of models, together with visual analyses, confirm the effectiveness of BC-GraphSurv. On the TCGA-BRCA dataset the model achieves a concordance index of 0.795, surpassing current state-of-the-art models. These results show that BC-GraphSurv offers a comprehensive and robust solution to the challenges of WSI-based survival prediction.
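As a rough illustration of the two-branch design described above (a GAT branch that consumes edge attributes alongside a GCN branch for local detail), the sketch below uses PyTorch Geometric; it is not the authors' implementation. The class and function names (TwoBranchSurv, cox_partial_loss), the feature and edge dimensions, the mean pooling, and the Cox partial-likelihood training loss are assumptions for illustration, and the paper's cross-attention between node and edge features and its global perception module are omitted.

```python
# Illustrative two-branch graph survival model (not the paper's WA-GAT/MP-GCN code).
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv, GCNConv, global_mean_pool

class TwoBranchSurv(nn.Module):
    def __init__(self, in_dim=512, hid=128, edge_dim=8, heads=4):
        super().__init__()
        # Attention branch: incorporates edge attributes via edge_dim
        self.gat = GATConv(in_dim, hid, heads=heads, edge_dim=edge_dim)
        # Convolution branch: supplements local detail
        self.gcn = GCNConv(in_dim, hid)
        # Fuse both branches into a single risk score per WSI graph
        self.head = nn.Sequential(
            nn.Linear(hid * heads + hid, hid), nn.ReLU(),
            nn.Linear(hid, 1))

    def forward(self, x, edge_index, edge_attr, batch):
        h_gat = torch.relu(self.gat(x, edge_index, edge_attr=edge_attr))
        h_gcn = torch.relu(self.gcn(x, edge_index))
        h = torch.cat([global_mean_pool(h_gat, batch),
                       global_mean_pool(h_gcn, batch)], dim=-1)
        return self.head(h).squeeze(-1)   # predicted risk, one value per slide

def cox_partial_loss(risk, time, event):
    """Negative Cox partial log-likelihood; event = 1.0 for an observed death."""
    order = torch.argsort(time, descending=True)     # risk set grows as time decreases
    risk, event = risk[order], event[order]
    log_risk_set = torch.logcumsumexp(risk, dim=0)   # log-sum-exp over each risk set
    return -((risk - log_risk_set) * event).sum() / event.sum().clamp(min=1.0)
```

In use, x would hold the per-patch embeddings produced by the pre-trained feature extractor, edge_index and edge_attr would encode the WSI topology and its edge attributes, and batch would map each patch node to its slide; training would then minimize cox_partial_loss over mini-batches of slides.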

