Visual Explanations of Deep Convolutional Neural Network for eye blinks detection in EEG-based BCI applications / Giudice, M. L.; Mammone, N.; Ieracitano, C.; Campolo, M.; Bruna, A. R.; Tomaselli, V.; Morabito, F. C. - (2022), pp. 1-8. (Paper presented at the 2022 International Joint Conference on Neural Networks, IJCNN 2022, held in Italy in 2022) [10.1109/IJCNN55064.2022.9892567].
Visual Explanations of Deep Convolutional Neural Network for eye blinks detection in EEG-based BCI applications
Mammone N. (Collaboration Group Member);
Ieracitano C. (Collaboration Group Member);
Campolo M. (Collaboration Group Member);
Morabito F. C. (Collaboration Group Member)
2022-01-01
Abstract
In this study, a Deep Learning (DL)-based Brain-Computer Interface (BCI) system able to automatically detect and decode voluntary eye blinks from the analysis of electroencephalographic (EEG) signals is proposed for controlling, in principle, an external device by means of ocular movements. To this end, a Convolutional Neural Network (CNN) is developed to classify EEG recordings into three categories: natural (or involuntary) blinks, forced (or voluntary) blinks, and baseline (no blinks). The proposed system achieved a high average classification performance, with an accuracy of up to 99.4% +/- 1.3%. The core of the present study, however, was to investigate the explainability and interpretability of the proposed CNN, with the ultimate aim of exploring which segments of the EEG signal are the most relevant in the voluntary/involuntary blink discrimination process. To this end, explainable Artificial Intelligence (xAI) techniques were applied. Specifically, the Gradient-weighted Class Activation Mapping (Grad-CAM) and Local Interpretable Model-Agnostic Explanations (LIME) algorithms were used. xAI allowed us to visually identify the most relevant EEG regions, especially for voluntary and involuntary blink detection. Indeed, limited to the analyzed dataset, for natural blinks the discriminating region was the interval ranging from the instant the eye closed to the instants following its reopening (and vice versa for voluntary blinks). The baseline (no blink), on the other hand, was characterized by low activation throughout the EEG segment.
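To illustrate the Grad-CAM step described in the abstract, the sketch below applies pooled-gradient class activation mapping to a small 1D CNN over multi-channel EEG segments. This is a minimal, hypothetical example: the architecture, layer sizes, channel count (19), segment length (512 samples) and class indices are assumptions for illustration, not the authors' actual model or data.

```python
# Hypothetical sketch: Grad-CAM relevance over time for a 1D CNN that classifies
# EEG segments into natural blink / voluntary blink / baseline.
# Architecture and shapes are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlinkCNN(nn.Module):
    def __init__(self, n_channels=19, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                        # x: (batch, channels, time)
        fmap = self.features(x)                  # last convolutional feature maps
        logits = self.classifier(self.pool(fmap).flatten(1))
        return logits, fmap

def grad_cam_1d(model, x, target_class):
    """Return a (time,) relevance map for one EEG segment x of shape (1, C, T)."""
    model.eval()
    logits, fmap = model(x)
    fmap.retain_grad()                           # keep gradients of the feature maps
    logits[0, target_class].backward()
    weights = fmap.grad.mean(dim=2, keepdim=True)          # pooled gradients per map
    cam = F.relu((weights * fmap).sum(dim=1)).squeeze(0)   # weighted sum of feature maps
    cam = F.interpolate(cam[None, None, :], size=x.shape[-1],
                        mode="linear", align_corners=False).squeeze()
    return cam / (cam.max() + 1e-8)              # normalise to [0, 1] for visualisation

# Usage: highlight which time instants drive the "voluntary blink" decision
model = BlinkCNN()
segment = torch.randn(1, 19, 512)                # one 19-channel EEG segment
relevance = grad_cam_1d(model, segment, target_class=1)
```

LIME, by contrast, explains an individual prediction by perturbing the input segment and fitting a local surrogate model; relevance maps from either method can then be overlaid on the EEG trace to visualise the discriminating interval around eye closure and reopening.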