PT Unknown
AU Ahmed M. A. Salih
   Ilaria Boscolo Galazzo
   Zahra Raisi-Estabragh
   Steffen E. Petersen
   Polyxeni Gkontra
   Karim Lekadir
   Gloria Menegaz
   Petia Radeva
TI A new scheme for the assessment of the robustness of Explainable Methods Applied to Brain Age estimation
BT 34th International Symposium on Computer-Based Medical Systems
PY 2021
BP 492
EP 497
DI 10.1109/CBMS52027.2021.00098
AB Deep learning methods show great promise in a range of settings, including the biomedical field. Explainability of these models is important for building end-user trust and facilitating their confident deployment. Although several Machine Learning Interpretability tools have been proposed so far, there is currently no recognized evaluation standard for translating explainability results into a quantitative score. Several measures have been proposed as proxies for the quantitative assessment of explainability methods. However, the robustness of the list of significant features provided by explainability methods has not been addressed. In this work, we propose a new proxy for assessing the robustness of the list of significant features provided by two explainability methods. Our validation is defined at the functionality-grounded level, is based on the rank correlation statistical index, and demonstrates its successful application in the framework of brain age estimation. We assessed our proxy on brain age estimation using neuroscience data. Our results indicate small variability and high robustness in the considered explainability methods under this new proxy.
ER
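The abstract describes a robustness proxy based on rank correlation between the significant-feature lists produced by two explainability methods. A minimal sketch of that idea follows; the feature names, importance values, the two-method pairing, and the helper `spearman_rho` are illustrative assumptions, not details taken from the paper:

```python
# Hedged sketch: Spearman rank correlation between the feature rankings of two
# explainability methods, used as a robustness proxy. All names and values here
# are illustrative, not from the paper.

def spearman_rho(imp_a, imp_b):
    """Spearman's rank correlation between two feature-importance dicts.

    Assumes both dicts cover the same feature set and contain no tied
    importance values (ties would need average ranks).
    """
    features = sorted(imp_a)  # common feature set, in a fixed order

    def rank(imp):
        # rank 1 = most important feature
        order = sorted(features, key=lambda f: -imp[f])
        return {f: i + 1 for i, f in enumerate(order)}

    ra, rb = rank(imp_a), rank(imp_b)
    n = len(features)
    d2 = sum((ra[f] - rb[f]) ** 2 for f in features)
    # Classical no-ties formula: rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical importances from two explainability methods over four features
imp_method_a = {"f1": 0.9, "f2": 0.5, "f3": 0.3, "f4": 0.1}
imp_method_b = {"f1": 0.8, "f2": 0.6, "f3": 0.2, "f4": 0.4}
print(spearman_rho(imp_method_a, imp_method_b))  # 0.8
```

A rho close to 1 would indicate the two methods rank the significant features similarly, i.e. the explanation is robust under this proxy; values near 0 or negative would indicate disagreement.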