
An Embedded EOG-based Brain Computer Interface System for Robotic Control / Chepyk, O.; Bruna, A.; Campolo, M.; Mammone, N.; Morabito, F. C.; Ruggeri, G.; Tomaselli, V. - (2023). (Paper presented at the 8th International Conference on Smart and Sustainable Technologies, SpliTech 2023, held at the University of Split, Faculty of Electrical Engineering, Mechanical Engineering and Naval Architecture (FESB) and Hotel Elaphusa, Croatia, in 2023) [10.23919/SpliTech58164.2023.10193265].

An Embedded EOG-based Brain Computer Interface System for Robotic Control

Chepyk O.; Bruna A.; Campolo M.; Mammone N.; Morabito F. C.; Ruggeri G.; Tomaselli V.
2023-01-01

Abstract

Brain computer interfaces (BCIs) provide new opportunities for individuals with motor disabilities to interact with their environment. One challenge for BCI systems is to produce control commands for robotics quickly and accurately. This study proposes using four types of electrooculography (EOG) signals (left winks, right winks, voluntary eye blinks, and involuntary eye blinks) to control a remote robotic system. Voluntary and involuntary eye blinks are differentiated to avoid unintended commands. The system uses a TinyML algorithm to analyze and interpret the EOG signals in real time, making it suitable for resource-limited settings. The proposed system includes an event detection algorithm to select signal segments and a 1D CNN for classification. Our BCI solution is entirely embedded on a custom-made board with dedicated components and an STM32L476RG microcontroller unit (MCU), which handles the entire processing chain without the need for any external device. The system enables three-degrees-of-freedom control of a remote device (e.g., a robotic platform) with 99.3% average classification accuracy over the four classes of EOG signals. Multiple users have tested the system, reporting high accuracy and ease of use when controlling a three-wheeled robot.
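
As a rough illustration of the classification stage described above, the sketch below builds a small 1D CNN that maps a fixed-length EOG event segment to one of the four classes. The window length, channel count, layer sizes, and training settings are assumptions for illustration only; the abstract does not report the authors' exact architecture, and a model of this kind would typically be quantized and converted to an MCU-friendly format before deployment on the STM32L476RG.

import numpy as np
from tensorflow.keras import layers, models

WINDOW_SAMPLES = 256   # assumed length of each detected EOG event segment
NUM_CHANNELS = 1       # assumed single EOG channel
NUM_CLASSES = 4        # left wink, right wink, voluntary blink, involuntary blink

def build_eog_cnn():
    # Small 1D CNN, kept compact so it could plausibly fit a low-power MCU
    # after quantization; all hyperparameters here are illustrative.
    model = models.Sequential([
        layers.Input(shape=(WINDOW_SAMPLES, NUM_CHANNELS)),
        layers.Conv1D(8, kernel_size=7, activation="relu"),
        layers.MaxPooling1D(pool_size=4),
        layers.Conv1D(16, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=4),
        layers.Flatten(),
        layers.Dense(32, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random placeholder data, used only to show the expected tensor shapes.
    x = np.random.randn(32, WINDOW_SAMPLES, NUM_CHANNELS).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=(32,))
    model = build_eog_cnn()
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(x[:1]))  # probabilities over the four classes

In the system described by the paper, inference of this kind runs entirely on the MCU, with the event detection step first isolating candidate blink/wink segments from the continuous EOG stream.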
2023
1D-CNN
EOG Blink/Wink Classifier
Low Energy
Low-Cost
MCU
noninvasive BCI
Real-time
Reliable
TinyML

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12318/144557