Weakly supervised learning and interpretability for endometrial whole slide image diagnosis
Abstract
Fully supervised learning for whole slide image based diagnostic tasks in histopathology is problematic due to the requirement for costly and time-consuming manual annotation by experts. Weakly supervised learning, which utilises only slide-level labels during training, is becoming more widespread as it relieves this burden, but it has not yet been applied to endometrial whole slide images in iSyntax format. In this work we apply a weakly supervised learning algorithm to a real-world dataset of this type for the first time, achieving over 85% validation accuracy and over 87% test accuracy. We then employ interpretability methods, including attention heatmapping, feature visualisation, and a novel end-to-end saliency-mapping approach, to identify distinct morphologies learned by the model and to build an understanding of its behaviour. These interpretability methods, alongside consultation with expert pathologists, allow us to compare machine-learned knowledge with consensus in the field. This work contributes to the state of the art by demonstrating a robust practical application of weakly supervised learning on a real-world digital pathology dataset, and it shows the importance of fine-grained interpretability in supporting the understanding and evaluation of model performance in this high-stakes use case.
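The abstract does not specify the model architecture. A common choice for weakly supervised whole slide image classification with only slide-level labels, and one that naturally yields attention heatmaps, is attention-based multiple-instance learning (MIL) pooling over patch features. The sketch below illustrates that general technique only; the class name, feature dimensions, and toy usage are assumptions for illustration and are not the authors' implementation.

```python
# Illustrative sketch only: attention-based MIL pooling over patch features
# extracted from one whole slide image, supervised by a single slide-level label.
# Architecture, dimensions, and names are assumed, not taken from the paper.
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, feat_dim=1024, hidden_dim=256, n_classes=2):
        super().__init__()
        # Gated attention assigns one scalar weight to each patch embedding.
        self.attn_v = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.Tanh())
        self.attn_u = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.Sigmoid())
        self.attn_w = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, patch_feats):            # (n_patches, feat_dim)
        a = self.attn_w(self.attn_v(patch_feats) * self.attn_u(patch_feats))
        a = torch.softmax(a, dim=0)            # (n_patches, 1) attention weights
        slide_feat = (a * patch_feats).sum(0)  # weighted slide-level embedding
        logits = self.classifier(slide_feat)
        return logits, a                       # weights can drive attention heatmaps

# Toy usage with random "patch features" standing in for CNN embeddings.
model = AttentionMIL()
feats = torch.randn(500, 1024)                 # 500 patches from one slide
logits, attn = model(feats)
loss = nn.functional.cross_entropy(logits.unsqueeze(0), torch.tensor([1]))
loss.backward()                                # only the slide label supervises training
```

Because the attention weights are computed per patch, mapping them back to patch coordinates gives the kind of slide-level attention heatmap the abstract refers to.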
Citation
Mohammadi, M, Cooper, J, Arandelovic, O, Fell, CM, Morrison, D, Syed, S, Konanahalli, P, Bell, S, Bryson, G, Harrison, DJ & Harris-Birtill, DCC 2022, 'Weakly supervised learning and interpretability for endometrial whole slide image diagnosis', Experimental Biology and Medicine, vol. 247, no. 22, pp. 2025-2037. https://doi.org/10.1177/15353702221126560
Publication
Experimental Biology and Medicine
Status
Peer reviewed
ISSN
1535-3702
Type
Journal article
Rights
Copyright © 2022 by the Society for Experimental Biology and Medicine. This paper is published under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
Description
Funding: This work is supported by the Industrial Centre for AI Research in digital Diagnostics (iCAIRD), which is funded by Innovate UK on behalf of UK Research and Innovation (UKRI) [project number: 104690], and in part by the Chief Scientist Office, Scotland.