A Hybrid-Domain Deep Learning-Based BCI for Discriminating Hand Motion Planning from EEG Sources / Ieracitano, C.; Morabito, F. C.; Hussain, A.; Mammone, N. - In: INTERNATIONAL JOURNAL OF NEURAL SYSTEMS. - ISSN 0129-0657. - 31:9 (2021), p. 2150038. [10.1142/S0129065721500386]

A Hybrid-Domain Deep Learning-Based BCI for Discriminating Hand Motion Planning from EEG Sources

Ieracitano C. (Member of the Collaboration Group);
Morabito F. C. (Member of the Collaboration Group);
Mammone N. (Member of the Collaboration Group)
2021-01-01

Abstract

In this paper, a hybrid-domain deep learning (DL)-based neural system is proposed to decode hand movement preparation phases from electroencephalographic (EEG) recordings. The system exploits information extracted from the temporal domain and the time-frequency domain, as part of a hybrid strategy, to discriminate the temporal windows (i.e. EEG epochs) preceding hand sub-movements (open/close) from the resting state. To this end, for each EEG epoch, the associated cortical source signals in the motor cortex and the corresponding time-frequency (TF) maps are estimated via beamforming and the Continuous Wavelet Transform (CWT), respectively. Two Convolutional Neural Networks (CNNs) are designed: the first, referred to as T-CNN, is trained on a dataset of temporal (T) data (i.e. EEG sources); the second, referred to as TF-CNN, is trained on a dataset of TF data (i.e. TF-maps of the EEG sources). Two sets of features, denoted T-features and TF-features and extracted from T-CNN and TF-CNN, respectively, are concatenated into a single feature vector (the TTF-features vector), which is fed to a standard multi-layer perceptron for classification. Experimental results show a significant performance improvement of the proposed hybrid-domain DL approach over temporal-only and time-frequency-only benchmark approaches, achieving an average accuracy of 76.21 ± 3.77%.
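The TF-maps mentioned in the abstract are produced by a Continuous Wavelet Transform of each cortical source signal. As a rough illustration of that step, the sketch below computes a Morlet-based TF magnitude map of a toy "EEG source" epoch with plain NumPy; the sampling rate, frequency grid, wavelet parameter `w`, and the `morlet_cwt` helper are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w=6.0):
    """Continuous Wavelet Transform with a complex Morlet wavelet.

    Returns a (len(freqs), len(x)) time-frequency magnitude map,
    loosely analogous to the TF-maps computed for each EEG source.
    """
    n = len(x)
    tfmap = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        # envelope width (in samples) chosen so the wavelet has ~w cycles
        s = w * fs / (2 * np.pi * f)
        t = np.arange(-n // 2, n // 2) / fs
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-(t * fs) ** 2 / (2 * s ** 2))
        wavelet /= np.sqrt(s)  # rough energy normalisation across scales
        coef = np.convolve(x, np.conj(wavelet)[::-1], mode="same")
        tfmap[i] = np.abs(coef)
    return tfmap

# toy "EEG source" epoch: 1 s at 250 Hz containing a 10 Hz (mu-band) rhythm
np.random.seed(0)
fs = 250
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(fs)

freqs = np.linspace(4, 40, 37)   # illustrative theta-to-gamma grid, 1 Hz steps
tf = morlet_cwt(x, fs, freqs)    # TF-map, shape (37, 250)
```

In the resulting map, energy concentrates in the row closest to 10 Hz, which is the kind of spatial-spectral structure the TF-CNN is trained to exploit.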
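The fusion step described in the abstract (concatenating T-features and TF-features into a TTF-features vector, then classifying with a multi-layer perceptron) can be sketched as follows. All dimensions and weights here are random placeholders for illustration; the paper does not specify them in this record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned features for one EEG epoch (dimensions are
# illustrative, not taken from the paper):
t_feat = rng.standard_normal(64)    # T-features from T-CNN
tf_feat = rng.standard_normal(64)   # TF-features from TF-CNN

# Fusion step: concatenate into a single TTF-features vector
ttf = np.concatenate([t_feat, tf_feat])   # shape (128,)

# Minimal MLP forward pass: one hidden layer, 3 output classes
# (e.g. pre-open, pre-close, rest); weights are untrained placeholders.
W1, b1 = 0.1 * rng.standard_normal((32, 128)), np.zeros(32)
W2, b2 = 0.1 * rng.standard_normal((3, 32)), np.zeros(3)

h = np.maximum(0.0, W1 @ ttf + b1)        # ReLU hidden layer
logits = W2 @ h + b2
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # softmax class scores
```

In the actual system these weights would be learned end-to-end; the point of the sketch is only the shape of the pipeline: two feature vectors in, one fused vector, one class distribution out.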
2021
beamforming
brain-computer interface
Deep learning
electroencephalography
feature fusion
wavelet transform
Algorithms
Electroencephalography
Machine Learning
Neural Networks, Computer
Brain-Computer Interfaces
Deep Learning
Files in this product:
File: A Hybrid-Domain Deep Learning-Based BCI For Discriminating_2021_IJNS_PRE-PROOF.pdf
Access: open access
Type: Post-print document
License: All rights reserved
Size: 9.15 MB
Format: Adobe PDF (View/Open)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12318/129428
Citations
  • PMC: not available
  • Scopus: 53
  • Web of Science: 42