

Pan-Cancer Integrative Histology-Genomic Analysis via Multimodal Deep Learning

Cancer Cell

Richard J. Chen, Ming Y. Lu, Drew F.K. Williamson, Tiffany Y. Chen, Jana Lipkova, Zahra Noor, Muhammad Shaban, Maha Shady, Mane Williams, Bumjin Joo, and Faisal Mahmood*

ArXiv | GitHub

TL;DR: We present an interpretable, weakly-supervised, multimodal deep learning algorithm that integrates whole slide images (WSIs) and molecular profile features for cancer prognosis. We validate our method on 14 cancer types in the TCGA and extract both local and global patterns of morphological and molecular feature importance in each cancer type. Using the multimodal interpretability aspect of our model, we developed PORPOISE, an interactive, freely available platform that directly yields the prognostic markers identified by our model for thousands of patients in our study. To validate that these model explanations are prognostic, we analyzed high-attention morphological regions in WSIs and found that the presence of tumor-infiltrating lymphocytes corresponds with favorable cancer prognosis in 9 out of 14 cancer types.

Understanding Attention - This demo illustrates how high-attention regions correspond to the regions the model uses when making its classification determinations. For visualization, attention scores are normalized across the entire slide.
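
As a rough illustration of the slide-level normalization step, here is a minimal sketch assuming per-patch attention scores are available as a NumPy array; the function name `normalize_attention` is hypothetical and not the repository's actual implementation:

```python
import numpy as np

def normalize_attention(scores: np.ndarray) -> np.ndarray:
    """Min-max normalize per-patch attention scores over an entire slide.

    `scores` holds one raw attention value per WSI patch. The returned
    values are rescaled to [0, 1] so that heatmap colors are comparable
    within a single slide (hypothetical helper, not the repo's exact code).
    """
    lo, hi = scores.min(), scores.max()
    if hi - lo < 1e-8:                 # degenerate case: constant attention
        return np.zeros_like(scores)
    return (scores - lo) / (hi - lo)

# Example: normalize raw attention scores for a slide with 5 patches
raw = np.array([0.12, 0.87, 0.45, 0.90, 0.33])
print(normalize_attention(raw))        # values in [0, 1]; the max patch maps to 1.0
```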