The following study was conducted by scientists from The Jackson Laboratory for Genomic Medicine, Farmington, CT, USA; Computational Sciences PhD Program, University of Massachusetts-Boston, Boston, MA, USA; Department of Pathology, Yale University School of Medicine, New Haven, CT, USA; Department of Information Systems, Boston University, Boston, MA, USA; Department of Mathematics, University of Massachusetts-Boston, Boston, MA, USA; and UCONN Health, Department of Genetics and Genome Sciences, Farmington, CT, USA. The study is published in Nature Communications, as detailed below.
Nature Communications, volume 11, article number 6367 (2020)
Deep Learning-Based Cross-Classifications Reveal Conserved Spatial Behaviors within Tumor Histological Images
Abstract
Histopathological images are a rich but incompletely explored data type for studying cancer. Manual inspection is time consuming, making it challenging to use for image data mining. Here we show that convolutional neural networks (CNNs) can be systematically applied across cancer types, enabling comparisons to reveal shared spatial behaviors. We develop CNN architectures to analyze 27,815 hematoxylin and eosin scanned images from The Cancer Genome Atlas for tumor/normal, cancer subtype, and mutation classification. Our CNNs are able to classify TCGA pathologist-annotated tumor/normal status of whole slide images (WSIs) in 19 cancer types with consistently high AUCs (0.995 ± 0.008), as well as subtypes with lower but significant accuracy (AUC 0.87 ± 0.1). Remarkably, tumor/normal CNNs trained on one tissue are effective in others (AUC 0.88 ± 0.11), with classifier relationships also recapitulating known adenocarcinoma, carcinoma, and developmental biology. Moreover, classifier comparisons reveal intra-slide spatial similarities, with an average tile-level correlation of 0.45 ± 0.16 between classifier pairs. Breast cancers, bladder cancers, and uterine cancers have spatial patterns that are particularly easy to detect, suggesting these cancers can be canonical types for image analysis. Patterns for TP53 mutations can also be detected, with WSI self- and cross-tissue AUCs ranging from 0.65-0.80. Finally, we comparatively evaluate CNNs on 170 breast and colon cancer images with pathologist-annotated nuclei, finding that both cellular and intercellular regions contribute to CNN accuracy. These results demonstrate the power of CNNs not only for histopathological classification, but also for cross-comparisons to reveal conserved spatial behaviors across tumors.
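The two quantities the abstract reports repeatedly are slide-level AUC (for tumor/normal and cross-tissue classification) and tile-level correlation between classifier pairs (for intra-slide spatial similarity). The sketch below is a minimal illustration of how such quantities can be computed, not the authors' pipeline: the tile probabilities are synthetic, the slide score is assumed to be the mean tile probability, and the AUC is computed via the Mann-Whitney U statistic.

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly."""
    labels = np.asarray(labels, bool)
    scores = np.asarray(scores, float)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()   # correctly ordered pairs
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count as half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)

# Hypothetical tile probabilities from two tumor/normal CNNs trained on
# different tissues, applied to the same 100-tile whole slide image.
tiles_a = rng.beta(5, 2, size=100)
tiles_b = 0.7 * tiles_a + 0.3 * rng.beta(2, 2, size=100)  # correlated with A

# Intra-slide spatial similarity: Pearson correlation of the tile maps.
tile_corr = np.corrcoef(tiles_a, tiles_b)[0, 1]

# Slide-level evaluation on a toy cohort: 10 tumor and 10 normal slides,
# each scored by its mean tile probability (drawn directly here).
labels = np.array([1] * 10 + [0] * 10, dtype=bool)
slide_scores = np.where(labels, rng.beta(5, 2, 20), rng.beta(2, 5, 20))

print(f"slide-level AUC: {auc(labels, slide_scores):.3f}")
print(f"tile-level correlation: {tile_corr:.3f}")
```

With real data, `slide_scores` would come from averaging a CNN's tile predictions over each slide, and `tile_corr` would be computed per slide and averaged across the cohort to obtain a figure comparable to the reported 0.45 ± 0.16.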
Source:
Nature Communications
URL: https://www.nature.com/articles/s41467-020-20030-5
Citation:
Noorbakhsh, J., Farahmand, S., Foroughi pour, A. et al. Deep learning-based cross-classifications reveal conserved spatial behaviors within tumor histological images. Nat Commun 11, 6367 (2020). https://doi.org/10.1038/s41467-020-20030-5