
AutoEncoder Filter Bank Common Spatial Patterns to decode Motor Imagery from EEG

Mammone N.; Ieracitano C.; Morabito F. C. (Collaboration Group Members)
2023-01-01

Abstract

This paper introduces a novel method to decode imagined movement from electroencephalographic (EEG) signals. Decoding imagined movement with good accuracy is a challenging problem in motor imagery (MI) brain-computer interfaces (BCIs); poor accuracy may indeed hinder the application of such systems in practice. The paper introduces an extension of the well-established Filter Bank Common Spatial Patterns (FBCSP) algorithm, named AutoEncoder(AE)-FBCSP, which exploits the ability of an AE to learn a mapping from the feature space onto a latent space where the information relevant for classification is embedded. The proposed method is based on a global (cross-subject) training stage followed by a transfer-learning, subject-specific (intra-subject) stage. A multi-way extension of AE-FBCSP is also introduced. The proposed methodology relies on high-density EEG recordings (64 electrodes). Features are extracted by means of FBCSP and used to train a custom AE, in an unsupervised way, to project the features into a compressed latent space. The latent features are then used to train a supervised classifier (a feed-forward neural network) to decode the imagined movement. The algorithm was tested on EEG from a publicly available database of recordings from 109 subjects. AE-FBCSP was extensively tested in the 3-way classification (right-hand vs left-hand motor imagery vs resting), as well as in the 2-way, 4-way and 5-way ones, in both cross-subject and intra-subject analyses. AE-FBCSP outperformed standard FBCSP in a statistically significant way (p < 0.05) and also outperformed comparable methods in the literature applied to the same dataset, achieving an average accuracy of 89.09% in the 3-way subject-specific classification. With AE-FBCSP, 71.43% of subjects achieved a very high accuracy (> 87.68%), whereas no subject exceeded 87.68% with FBCSP.
One of the most interesting outcomes is that AE-FBCSP remarkably increased the number of subjects that responded with very high accuracy, a fundamental requirement for BCI systems to be applied in practice.
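The pipeline described in the abstract (filter bank → CSP spatial filtering → log-variance features → unsupervised AE compression → supervised classifier) can be sketched as follows. This is an illustrative, simplified implementation on synthetic data: the band edges, number of CSP filters, AE size, learning rate, and the logistic-regression classifier are all placeholders chosen for the sketch, not the paper's actual architecture (which uses a feed-forward neural network and a cross-subject plus transfer-learning scheme).

```python
# Hypothetical AE-FBCSP-style sketch on synthetic 2-class "EEG" data.
# All hyperparameters below are illustrative, not taken from the paper.
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs, n_ch, n_samp, n_trials = 160, 8, 480, 120

# Synthetic trials: class-dependent 10 Hz activity on different channels.
t = np.arange(n_samp) / fs
X = rng.standard_normal((n_trials, n_ch, n_samp))
y = rng.integers(0, 2, n_trials)
for i in range(n_trials):
    ch = (0, 1) if y[i] == 0 else (6, 7)
    X[i, ch, :] += 3.0 * np.sin(2 * np.pi * 10 * t)

bands = [(4, 8), (8, 12), (12, 16), (16, 20)]  # illustrative filter bank

def csp_filters(trials, labels, m=2):
    """CSP via the generalized eigenproblem C0 w = lambda (C0 + C1) w."""
    covs = []
    for c in (0, 1):
        C = np.mean([tr @ tr.T / np.trace(tr @ tr.T)
                     for tr in trials[labels == c]], axis=0)
        covs.append(C)
    _, W = eigh(covs[0], covs[0] + covs[1])   # eigenvalues in ascending order
    return np.hstack([W[:, :m], W[:, -m:]])   # most discriminative filters

def fbcsp_features(X, y, fit_idx, bands, fs):
    """Band-pass each trial, fit CSP on fit_idx only, take log-variance."""
    feats = []
    for lo, hi in bands:
        b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        Xf = filtfilt(b, a, X, axis=-1)
        W = csp_filters(Xf[fit_idx], y[fit_idx])
        S = np.einsum("kc,nct->nkt", W.T, Xf)  # spatially filtered trials
        v = S.var(axis=-1)
        feats.append(np.log(v / v.sum(axis=1, keepdims=True)))
    return np.hstack(feats)

def train_autoencoder(F, h=6, lr=0.01, epochs=800, seed=0):
    """Tiny one-hidden-layer AE (tanh encoder, linear decoder), plain GD."""
    rg = np.random.default_rng(seed)
    n, d = F.shape
    W1 = rg.standard_normal((d, h)) * 0.1; b1 = np.zeros(h)
    W2 = rg.standard_normal((h, d)) * 0.1; b2 = np.zeros(d)
    for _ in range(epochs):
        H = np.tanh(F @ W1 + b1)          # latent code
        R = H @ W2 + b2                   # reconstruction
        dR = 2 * (R - F) / n              # MSE gradient
        dW2 = H.T @ dR; db2 = dR.sum(0)
        dZ = (dR @ W2.T) * (1 - H ** 2)   # backprop through tanh
        dW1 = F.T @ dZ; db1 = dZ.sum(0)
        for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            p -= lr * g
    return lambda F_: np.tanh(F_ @ W1 + b1)  # encoder only

idx_tr, idx_te = train_test_split(np.arange(n_trials), test_size=0.3,
                                  random_state=0, stratify=y)
F = fbcsp_features(X, y, idx_tr, bands, fs)
mu, sd = F[idx_tr].mean(0), F[idx_tr].std(0) + 1e-8
Fz = (F - mu) / sd
encode = train_autoencoder(Fz[idx_tr])     # unsupervised compression
clf = LogisticRegression(max_iter=1000).fit(encode(Fz[idx_tr]), y[idx_tr])
acc = clf.score(encode(Fz[idx_te]), y[idx_te])
print(f"2-way test accuracy on synthetic data: {acc:.2f}")
```

Note that the CSP filters and feature normalization are fit on the training split only, mirroring the abstract's distinction between fitting and evaluation; the AE is trained without labels, and only its latent code is passed to the supervised classifier.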
2023
AutoEncoders
Bioinformatics
Brain Computer Interface
Deep Learning
EEG
Electroencephalography
Feature extraction
Filter banks
Motor Imagery
Transfer learning
Files for this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12318/136889
Citations
  • PMC: ND
  • Scopus: 12
  • Web of Science (ISI): 9