Explainable Deep Learning for Medical Imaging Classification
Date: 2024

Abstract
Machine learning is increasingly being applied to medical imaging tasks. However, the "black box" nature of techniques such as deep learning has inhibited the interpretability and trustworthiness of these methods, and therefore their clinical utility. In recent years, explainability methods have been developed to allow better interrogation of these approaches.
This thesis presents novel applications of explainable deep learning to several medical imaging tasks, investigating its potential for patient safety and research. It applies explainable deep learning to the detection of aneurysm clips in CT brain scans for MRI safety screening, and to the detection of confounding pathology in radiology report texts for dataset curation. It also makes novel contributions to Parkinson's research, using explainable deep learning to identify progressive brain changes in MRI brain scans, and to identify differences in the brains of non-manifesting carriers of Parkinson's genetic risk variants. In each case, convolutional neural networks were developed to classify the data, and SHapley Additive exPlanations (SHAP) were used to explain their predictions. A novel pipeline was developed to apply SHAP to volumetric medical imaging data.
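SHAP attributions are grounded in Shapley values from cooperative game theory: each feature's contribution to a prediction is its average marginal effect over all subsets of the other features. The sketch below computes exact Shapley values for a toy model using only the standard library; it is an illustration of the underlying principle, not the thesis pipeline (the shap library uses efficient approximations of this sum for deep networks, where features number in the thousands). The toy linear model and its inputs are hypothetical.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for the prediction f(x) relative to a baseline input.

    phi_i = sum over subsets S of the other features of
            |S|! (n-|S|-1)! / n! * [f(x with S+{i} revealed) - f(x with S revealed)],
    where "revealed" features take their values from x and the rest from baseline.
    """
    n = len(x)

    def masked(revealed):
        # Keep features in `revealed` from x; fill the rest from the baseline.
        return [x[j] if j in revealed else baseline[j] for j in range(n)]

    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                s = set(subset)
                phi += weight * (f(masked(s | {i})) - f(masked(s)))
        phis.append(phi)
    return phis

# Hypothetical linear "model": for linear f, phi_i = w_i * (x_i - baseline_i).
f = lambda v: 2 * v[0] + 3 * v[1] + v[2]
phi = shapley_values(f, x=[1.0, 1.0, 1.0], baseline=[0.0, 0.0, 0.0])
# → [2.0, 3.0, 1.0]; the values sum to f(x) - f(baseline) (efficiency property)
```

The efficiency property illustrated at the end is what makes SHAP maps interpretable: attributions over all voxels sum exactly to the difference between the model's output and its baseline output.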
The application of explainable deep learning to these varied data types and tasks demonstrates the flexibility of combining convolutional neural networks with SHAP. These applications also highlight the importance of pairing explainability with clinical expertise, both to check the viability of the models and to ensure that they meet a clinical need. The resulting tools are useful for safety and research, and potentially for the improvement of clinical care.