Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
EEG Motor Imagery Classification Using CNN, Transformer, and MLP
- Host: GitHub
- URL: https://github.com/reshalfahsi/eeg-motor-imagery-classification
- Owner: reshalfahsi
- Created: 2023-06-26T10:14:49.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2024-03-13T03:57:40.000Z (10 months ago)
- Last Synced: 2024-03-13T04:43:23.308Z (10 months ago)
- Topics: eeg-classification, mne-python, motor-imagery-classification
- Language: Jupyter Notebook
- Homepage:
- Size: 4.69 MB
- Stars: 12
- Watchers: 3
- Forks: 3
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Citation: CITATION.cff
# EEG Motor Imagery Classification Using CNN, Transformer, and MLP
An illustration of the CNN-Transformer-MLP model.
The electroencephalogram, or EEG for short, is a biosignal that records brain activity in the form of time-series data. EEG can be used to help amputees or paralyzed people move their prosthetic arms via a brain-computer interface (BCI). To identify which limb to control from the EEG signal, this work combines a CNN, a Transformer, and an MLP for motor imagery (MI) classification. The CNN converts the epoched EEG signal into a meaningful representation, accounting for the signal's non-stationary nature. The Transformer then captures global relationships within the representation produced by the CNN. Finally, the MLP classifies which upper limb is intended to move based on the information extracted by the Transformer. To gauge the capability of the CNN-Transformer-MLP model, PhysioNet's EEG Motor Movement/Imagery Dataset is used. The model attains an accuracy of ``76.4%`` on the test set.
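The pipeline above can be sketched in PyTorch as follows. This is a minimal illustration, not the repository's actual implementation: all hyperparameters (channel count, `d_model`, kernel size, layer depths, epoch length) are assumptions for demonstration, and the single-logit head with BCE-with-logits loss simply mirrors the binary left/right-arm setup and BCE loss described in this README.

```python
import torch
import torch.nn as nn


class CNNTransformerMLP(nn.Module):
    """Illustrative CNN -> Transformer -> MLP model for EEG MI classification."""

    def __init__(self, n_channels=64, d_model=32, n_heads=4):
        super().__init__()
        # CNN: turns the epoched EEG (channels x time) into a sequence of
        # local feature vectors, downsampling the time axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm1d(d_model),
            nn.ReLU(),
        )
        # Transformer encoder: models global relationships across the
        # CNN feature sequence via self-attention.
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        # MLP head: a single logit for the binary left/right-arm decision.
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, x):  # x: (batch, n_channels, n_times)
        z = self.cnn(x)                            # (batch, d_model, n_times')
        z = self.transformer(z.permute(0, 2, 1))   # (batch, n_times', d_model)
        z = z.mean(dim=1)                          # average-pool over time
        return self.mlp(z).squeeze(-1)             # (batch,) logits


model = CNNTransformerMLP()
x = torch.randn(8, 64, 321)  # 8 epochs, 64 channels, 321 time samples
logits = model(x)
labels = torch.randint(0, 2, (8,)).float()
loss = nn.BCEWithLogitsLoss()(logits, labels)        # BCE, as in this work
accuracy = ((logits > 0).float() == labels).float().mean()
```

The mean-pooling before the MLP head is one common choice for collapsing the Transformer's sequence output into a fixed-size vector; the actual notebook may aggregate differently.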
## Experiment
To run the experiment, [click here](https://github.com/reshalfahsi/eeg-motor-imagery-classification/blob/master/EEG_Motor_Imagery_Classification_Using_CNN_Transformer_and_MLP.ipynb).
## Result
### Quantitative Result
To quantitatively validate the capability of the CNN-Transformer-MLP model, two evaluation metrics are employed: accuracy and loss. Accuracy measures how often the model makes a correct prediction on a particular split of the dataset. Loss quantifies how close the prediction is to the actual label, and is also used during training. In this work, the binary cross-entropy (BCE) loss is adopted as the loss function.

Dataset Split | Accuracy | Loss
------------- | -------- | ----
Train | 82.9% | **0.201**
Validation | **84.4%** | 0.269
Test | 76.4% | 0.856

### Accuracy and Loss Curve
Accuracy curve on the train set and the validation set.
Loss curve on the train set and the validation set.

### Qualitative Result
Here, the qualitative performance of the model is presented.
Correct prediction on the right arm class.
Correct prediction on the left arm class.
False prediction on the left arm class.

## Citation
If you find this repository helpful for your research, you may cite it:
```
@misc{eegal-fahsi,
  title = {EEG Motor Imagery Classification Using CNN, Transformer, and MLP},
  url = {https://github.com/reshalfahsi/eeg-motor-imagery-classification},
  author = {Resha Dwika Hefni Al-Fahsi},
}
```

## Credit
- [EEG Motor Movement/Imagery Dataset](https://physionet.org/content/eegmmidb/1.0.0/)
- [Electroencephalogram Signal Classification for action identification](https://keras.io/examples/timeseries/eeg_signal_classification/)
- [Timeseries Classification With a Transformer Model](https://keras.io/examples/timeseries/timeseries_classification_transformer/)
- [EEG Classification](https://github.com/DavidSilveraGabriel/EEG-classification/blob/master/Using_mne_and_braindecode.ipynb)
- [Loading EEG Data](https://neuro.inf.unibe.ch/AlgorithmsNeuroscience/Tutorial_files/DataLoading.html)
- [MNE-Python](https://mne.tools/stable/glossary.html)
- [Self-Attention and Positional Encoding](https://d2l.ai/chapter_attention-mechanisms-and-transformers/self-attention-and-positional-encoding.html)