
Finding Local Explanations Through Masking Models

Angiulli F.; Fassetti F.; Nisticò S.
2021-01-01

Abstract

Among XAI (eXplainable Artificial Intelligence) techniques, local explanations are attracting increasing interest due to users' need to trust specific black-box decisions. In this work we explore a novel local explanation approach, applicable to any kind of classifier, based on generating masking models. The idea underlying the method is to learn a transformation of the input that leads to a novel instance able to confuse the black-box while minimizing its dissimilarity from the instance to explain. The transformed instance then highlights the parts of the input that need to be (de-)emphasized and acts as an explanation for the local decision. We clarify the differences from existing local explanation methods and evaluate our approach on different image classification scenarios, pointing out the advantages and peculiarities of the proposal.
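The core idea of the abstract can be illustrated with a minimal sketch: learn a per-feature mask m so that the masked input x*m flips the black-box's decision while a penalty keeps the mask close to the identity (i.e., the masked instance close to the original). Everything below (the toy linear classifier, the loss weights, the feature values) is a hypothetical illustration, not the authors' implementation.

```python
import numpy as np

# Toy differentiable "black-box": a fixed linear softmax classifier
# over 2 features and 2 classes (identity weights for readability).
W = np.array([[1.0, 0.0],
              [0.0, 1.0]])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(x):
    return softmax(W @ x)

x = np.array([2.0, 1.0])          # instance to explain
orig = int(np.argmax(predict(x))) # black-box's original decision (class 0)
target = 1 - orig                 # push toward the other class
lam = 0.1                         # weight of the dissimilarity penalty
m = np.ones(2)                    # mask, initialised to "keep everything"

for _ in range(500):
    p = predict(x * m)
    # gradient of cross-entropy toward the target class w.r.t. the logits,
    # chained through the classifier and the masking operation,
    # plus an L1-style pull of the mask back toward 1 (identity)
    grad_logits = p - np.eye(2)[target]
    grad_m = (W.T @ grad_logits) * x + lam * np.sign(m - 1.0)
    m -= 0.1 * grad_m
    m = np.clip(m, 0.0, 1.0)

# Features the mask had to suppress (small m) are those supporting the
# original decision: they act as the local explanation.
print(orig, int(np.argmax(predict(x * m))), np.round(m, 2))
```

In this toy setting the optimizer zeroes out the first feature (the one driving the original class-0 decision) and leaves the second untouched, so the mask itself reads as a saliency map over the input.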
2021
978-3-030-91607-7
978-3-030-91608-4
Deep learning
eXplainable Artificial Intelligence
Local explanations for machine learning
Files in this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/346000
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: n/a