Enhancing Motor Imagery EEG Classification in Brain-Computer Interfaces via TransformerNet


BAŞPINAR U., Tastan Y.

IEEE Access, vol.13, pp.143291-143301, 2025 (SCI-Expanded, Scopus)

  • Publication Type: Article
  • Volume: 13
  • Publication Date: 2025
  • DOI Number: 10.1109/access.2025.3594083
  • Journal Name: IEEE Access
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, INSPEC, Directory of Open Access Journals
  • Page Numbers: pp.143291-143301
  • Keywords: Brain–computer interface, motor imagery, signal classification, TransformerNet
  • Marmara University Affiliated: Yes

Abstract

Brain-computer interfaces (BCIs) offer promising solutions for assisting individuals with disabilities, supporting neurorehabilitation, and enhancing human capabilities. However, the limited decoding accuracy of EEG-based motor imagery (MI) signals poses a major challenge for the practical deployment of BCI systems. A common approach involves using signals from opposite hemispheres to boost classification accuracy. Yet, to control devices such as robotic hands or prosthetics in a human-like manner, it is essential to accurately classify hand opening and closing tasks using EEG signals from the same motor cortex region. This study introduces TransformerNet, a novel deep learning architecture specifically designed to classify hand open-close MI tasks from the same brain region. The task is particularly challenging due to the high similarity and overlapping nature of the EEG signals. TransformerNet combines a convolutional module, inspired by EEGNet, to extract local spatial features, with a Transformer encoder that captures long-range temporal dependencies. Furthermore, a channel attention mechanism enhances the model's ability to focus on the most informative features. In experimental evaluations, TransformerNet achieved an average classification accuracy of 85.97%, outperforming traditional deep learning methods. The model effectively captures high-level temporal-spectral patterns and uncovers hidden dependencies within the EEG signals. These results demonstrate the potential of integrating attention mechanisms with Transformer-based architectures to improve MI-based BCI performance. This advancement holds promise for real-world applications such as brain-controlled prosthetics, assistive devices, and human-computer interaction, moving BCI technologies closer to practical and reliable implementation.
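The abstract names a channel attention mechanism that re-weights feature channels by informativeness, but this record gives no implementation details. As a minimal illustration only, the sketch below implements a generic squeeze-and-excitation-style channel attention over EEG feature maps in NumPy; the function name, weight shapes, and reduction ratio are all hypothetical, not taken from the paper.

```python
import numpy as np

def channel_attention(features, reduction=4, rng=None):
    """Generic squeeze-and-excitation-style channel attention (illustrative,
    not the paper's exact design).

    features: array of shape (channels, time) -- per-channel feature sequences.
    Returns the features re-weighted by a learned per-channel gate in (0, 1).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    c, _ = features.shape
    hidden = max(c // reduction, 1)
    # Hypothetical "learned" weights; randomly initialized here for illustration.
    w1 = rng.standard_normal((hidden, c)) / np.sqrt(c)
    w2 = rng.standard_normal((c, hidden)) / np.sqrt(hidden)
    # Squeeze: global average pooling over the time axis -> one scalar per channel.
    z = features.mean(axis=1)                                    # shape (c,)
    # Excitation: bottleneck MLP with ReLU, then a sigmoid gate per channel.
    s = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ z, 0.0))))    # shape (c,)
    # Scale: multiply each channel's time series by its attention weight.
    return features * s[:, None]

# Example: 8 feature channels, 128 time samples of synthetic EEG features.
x = np.random.default_rng(1).standard_normal((8, 128))
y = channel_attention(x)
```

Because the sigmoid gate lies strictly in (0, 1), the output preserves the input shape while attenuating less informative channels; in a trained network the two weight matrices would be learned jointly with the convolutional and Transformer modules rather than fixed at random.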