Toward trustworthy and sustainable clinical decision support by training ensembles of specialized logistic regressors

Cuzzocrea, Alfredo; Folino, Francesco; Pontieri, Luigi; Sabatino, Pietro; Samami, Maryam
2025-01-01

Abstract

This paper presents a novel Mixture of Experts (MoE)-based framework designed to enhance clinical decision-making by balancing predictive accuracy, interpretability, and adaptability. Our approach relies on a set of locally specialized logistic regression models, dynamically selecting the most suitable expert for each instance based on local feature patterns. To enforce sparsity in both the gating mechanism and expert models, we employ the Gumbel-softmax relaxation, enabling end-to-end differentiable selection of both the active expert and the most relevant features for each prediction. By integrating this mechanism, our method improves computational efficiency and generalization while maintaining instance-level interpretability. Unlike black-box models that require post hoc explanation techniques, our solution provides transparency by construction, offering direct insights into feature contributions for each decision. We evaluated our approach on multiple real-world healthcare datasets, spanning both standard clinical classification tasks and process-oriented predictive scenarios. Experimental results demonstrated that our MoE-based framework achieves robust and competitive performance while maintaining lower complexity than black-box methods such as XGBoost and Random Forest and improving generalization over simpler interpretable models, including Decision Trees, Linear Trees, and standard Logistic Regressors. Additionally, our analysis of the trade-off between model complexity and predictive performance shows that our method delivers stable and reliable results across diverse datasets and evaluation metrics. These findings underscore the advantages of an interpretable MoE-based approach in clinical AI, supporting transparent and accountable decision-making.
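The Gumbel-softmax relaxation mentioned in the abstract can be sketched as follows. This is a minimal, hypothetical illustration of differentiable expert selection, not the authors' implementation; the number of experts, the gating logits, and the temperature value are assumed for demonstration only.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Gumbel-softmax relaxation of categorical sampling.

    Adds Gumbel(0, 1) noise to the logits and applies a
    temperature-scaled softmax; as tau -> 0 the output approaches
    a one-hot selection, enabling gradient-based expert gating.
    """
    rng = np.random.default_rng(rng)
    # Sample Gumbel(0, 1) noise via the inverse-CDF transform.
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()          # subtract max for numerical stability
    exp_y = np.exp(y)
    return exp_y / exp_y.sum()

# Hypothetical gating over 4 expert logistic regressors for one instance.
gate_logits = np.array([0.2, 2.5, -1.0, 0.3])
weights = gumbel_softmax(gate_logits, tau=0.1, rng=0)
```

At a low temperature the resulting weight vector is close to one-hot, so effectively a single specialized expert handles the instance while the selection remains differentiable for end-to-end training.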
Keywords: Clinical DSSs; Green AI; Machine learning; Outcome prediction; XAI
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/388160
Warning: the displayed data have not been validated by the university.

Citazioni
  • Scopus: 2