Multi-Scale Spatiotemporal Attention Network for Neuron-based Motor Imagery EEG Classification.
Journal of Neuroscience Methods 2024 March 29
BACKGROUND: The rapid growth of Brain-Computer Interface (BCI) technology in neuroscience, which relies on electroencephalogram (EEG) signals associated with motor imagery, has produced results that rival conventional approaches, largely owing to the success of deep learning. Nevertheless, designing and training a network that captures the underlying characteristics of motor imagery EEG data remains challenging.
NEW METHOD: This paper presents a multi-scale spatiotemporal self-attention (SA) network model. The model classifies motor imagery EEG signals into four classes (left hand, right hand, foot, tongue/rest) by exploiting the temporal and spatial properties of EEG. The attention mechanism automatically assigns larger weights to channels linked to motor activity and smaller weights to movement-unrelated channels, thereby selecting the most informative channels. The network uses parallel multi-scale Temporal Convolutional Network (TCN) layers to extract temporal features at multiple scales, effectively suppressing temporal-domain noise.
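The two ideas above, softmax channel weighting followed by parallel multi-scale temporal filtering, can be sketched in NumPy. This is not the authors' implementation: the energy-based attention scores and moving-average filters below are illustrative stand-ins for the learned self-attention and TCN branches, and the trial shape (22 channels x 1000 samples, roughly BCI IV-2a) is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy EEG trial: 22 channels x 1000 time samples (BCI IV-2a-like shape).
n_channels, n_samples = 22, 1000
x = rng.standard_normal((n_channels, n_samples))

def channel_attention(x):
    """Assign a softmax weight to each channel from its mean power
    (a stand-in for the learned spatial self-attention scores)."""
    scores = (x ** 2).mean(axis=1)            # per-channel energy
    w = np.exp(scores - scores.max())         # numerically stable softmax
    w /= w.sum()
    return x * w[:, None], w                  # reweighted channels, weights

def multiscale_temporal(x, kernel_sizes=(16, 32, 64)):
    """Parallel temporal smoothing at several scales, concatenated along
    the feature axis (a stand-in for the parallel TCN branches)."""
    branches = []
    for k in kernel_sizes:
        kern = np.ones(k) / k                 # simple moving-average filter
        branch = np.stack([np.convolve(ch, kern, mode="same") for ch in x])
        branches.append(branch)
    return np.concatenate(branches, axis=0)   # (len(kernel_sizes)*channels, samples)

weighted, w = channel_attention(x)
features = multiscale_temporal(weighted)
print(features.shape)   # (66, 1000)
```

In the actual model the channel weights and temporal filters would be learned end-to-end; the sketch only shows how per-channel weighting and parallel multi-scale convolutions compose.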
RESULTS: The proposed model achieves accuracies of 79.26%, 85.90%, and 96.96% on the BCI competition datasets IV-2a, IV-2b, and HGD, respectively.
COMPARISON WITH EXISTING METHODS: In single-subject classification accuracy, the proposed method outperforms existing approaches.
CONCLUSION: The results indicate that the proposed method exhibits favourable performance, robustness, and transfer learning capability.