In this paper, a novel Electroencephalography (EEG)-based Brain-Computer Interface (BCI) approach is proposed to decode motion intention from EEG signals collected at the scalp of subjects performing motor execution tasks. Such systems are generally based on the ability to discriminate between the imagination of right/left hand movements, and would greatly benefit from the ability to decode the intention to perform sub-movements of the same limb, such as opening or closing the same hand. In this research, a system for decoding the intention to open or close the same hand is proposed. To this end, a dataset of EEG segments preceding hand open/close movement initiation, as well as segments with no movement preparation (resting), was created from a public database of EEG signals recorded during upper limb motor execution experiments. Time-frequency maps were computed for every EEG signal and used to build channel × frequency × time volumes. A system based on a custom deep Convolutional Neural Network (CNN), named EEGframeNNET, was designed and developed to discriminate between pre-hand-opening, pre-hand-closing, and resting. The proposed system outperformed a comparable method in the literature (TTF-NET), achieving an average accuracy of 86.5% against the 76.3% of TTF-NET. The proposed system offers a novel perspective on the evolution of EEG signals by projecting EEGs into a sequence of channel × frequency frames constructed by means of time-frequency analysis.
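The construction of channel × frequency × time volumes from multichannel EEG can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate (160 Hz), window length, and overlap are illustrative assumptions, and the spectrogram is used as a generic time-frequency transform in place of whatever method the paper actually employs.

```python
import numpy as np
from scipy.signal import spectrogram

def eeg_to_cft_volume(eeg, fs=160.0, nperseg=64, noverlap=48):
    """Build a channel x frequency x time volume from multichannel EEG.

    eeg: array of shape (n_channels, n_samples).
    Returns (volume, freqs, times) where volume has shape
    (n_channels, n_freq, n_time) and holds spectrogram magnitudes.
    Parameters fs/nperseg/noverlap are illustrative assumptions.
    """
    maps = []
    for channel in eeg:
        freqs, times, Sxx = spectrogram(
            channel, fs=fs, nperseg=nperseg, noverlap=noverlap
        )
        maps.append(Sxx)
    return np.stack(maps, axis=0), freqs, times

# Simulated 2-second, 64-channel EEG segment at 160 Hz (random data).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((64, 320))
volume, freqs, times = eeg_to_cft_volume(eeg)

# Viewing the volume time-major yields the sequence of
# channel x frequency frames described in the abstract.
frames = np.moveaxis(volume, -1, 0)  # (n_time, n_channels, n_freq)
print(volume.shape, frames.shape)
```

Each time step of `frames` is one channel × frequency map, so the whole segment becomes a short "video" of spectral activity suitable as CNN input.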