
Understanding automatic COVID-19 classification using chest X-ray images

Bruno P.;Marte C.;Calimeri F.
2020

Abstract

The COVID-19 disease, caused by the SARS-CoV-2 virus, first appeared in Wuhan, China, and is considered a serious disease due to its high transmissibility and contagiousness. The similarity of COVID-19 to other lung infections, along with its high spreading rate, makes diagnosis difficult. Solutions based on machine learning techniques have achieved significant results in identifying the correct disease and providing early diagnosis, and can hence offer valuable clinical decision support; however, such approaches suffer from the lack of proper means for interpreting the choices made by the models, especially in the case of deep learning ones. With the aim of improving interpretability and explainability in the process of making qualified decisions, we designed a system that partially opens this black box by investigating the rationale behind the decisions. We tested our approach on artificial neural networks trained for multi-class classification of chest X-ray images; our tool analyzed the internal processes performed by the networks during the classification tasks in order to identify the elements of the training process that most influence the networks' decisions. We report the results of an experimental analysis aimed at assessing the viability of the proposed approach.
Chest X-ray images
Convolutional Neural Networks
GradCAM
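The keywords reference Grad-CAM, a standard technique for visualizing which image regions drive a convolutional network's prediction. As an illustrative sketch only (not the authors' tool), the core Grad-CAM combination step can be shown in plain NumPy: given the feature maps of a convolutional layer and the gradients of the class score with respect to those maps, each channel is weighted by the spatial average of its gradient, and the ReLU of the weighted sum yields the class-activation heatmap. The array shapes and the toy inputs below are assumptions for demonstration.

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Grad-CAM weighting step.

    feature_maps, gradients: arrays of shape (K, H, W), where K is the
    number of channels of the chosen convolutional layer.
    """
    # alpha_k: global-average-pooled gradient, one weight per channel
    alphas = gradients.mean(axis=(1, 2))
    # weighted sum over channels: sum_k alpha_k * A_k
    cam = np.tensordot(alphas, feature_maps, axes=1)
    # ReLU keeps only features with a positive influence on the class
    cam = np.maximum(cam, 0.0)
    if cam.max() > 0:
        cam /= cam.max()  # normalize to [0, 1] for overlay on the X-ray
    return cam

# Toy example: 2 channels of 4x4 feature maps with hand-picked gradients
A = np.stack([np.ones((4, 4)), np.zeros((4, 4))])
G = np.stack([np.full((4, 4), 0.5), np.full((4, 4), -1.0)])
heatmap = grad_cam(A, G)  # the second channel is suppressed by ReLU
```

In practice the feature maps and gradients would be captured from a trained CNN (e.g. via framework hooks) rather than constructed by hand; this sketch only isolates the weighting arithmetic.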

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/20.500.11770/333138
Citations
  • PMC: N/A
  • Scopus: 0
  • Web of Science: N/A