Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/louisguitton/find-ur-rythm
Collect EEG data and use machine learning to validate the impact of different kinds of music on the brain
- Host: GitHub
- URL: https://github.com/louisguitton/find-ur-rythm
- Owner: louisguitton
- Created: 2016-03-09T20:17:49.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2016-03-22T15:47:39.000Z (over 8 years ago)
- Last Synced: 2024-10-09T20:38:52.962Z (about 1 month ago)
- Language: Python
- Homepage:
- Size: 35.7 MB
- Stars: 1
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: Readme.md
Awesome Lists containing this project
README
# Music and Brain Activity
## Papers
This project is based on the papers contained in this folder.

## Data
Summary of the dataset:

- each recording session is 40 min long
- a session is composed of 2 recordings of 20 min
- 10 songs of 2 min make up one 20 min recording

The data is in the [HDF5 format](http://docs.h5py.org/en/latest/quick.html).
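As a minimal sketch of reading one file with h5py (the file name is a placeholder, and the assumption is that the signals are stored as top-level datasets named as in the tables below):

```python
# Minimal sketch: open one recording with h5py. The file name is a placeholder
# and the dataset keys ("signal_0" ... "signal_4") are assumed from the tables below.
import h5py

with h5py.File("left_recording.h5", "r") as f:
    print(list(f.keys()))        # expected: ['signal_0', ..., 'signal_4']
    t3 = f["signal_0"][:]        # T3, auditory cortex in the left recording
    print(t3.shape)
```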
| Left Recording | Zone           | Meaning          |
| -------------- |:--------------:| ----------------:|
| signal_0       | T3             | auditory cortex  |
| signal_1       | C3             | motor cortex     |
| signal_2       | CZ             | vertex           |
| signal_3       | nothing        | nothing          |
| signal_4       | nothing        | nothing          |

| Right Recording | Zone           | Meaning          |
| --------------- |:--------------:| ----------------:|
| signal_0        | T4             | auditory cortex  |
| signal_1        | C4             | motor cortex     |
| signal_2        | F4             | frontal cortex   |
| signal_3        | E2             | eyes             |
| signal_4        | nothing        | nothing          |

## Data Processing in dataset_generator.py
We find the beginning of the music in the data manually with `script.py`:

- Start of music for Louis = 34532
- Start of music for Charles = 21189

Then `dataset_generator.py` cuts the 40 min recordings into the 20 songs.
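The cutting step might look roughly like the following; this is a hypothetical re-implementation, not the actual `dataset_generator.py`, and the sampling rate is an assumption:

```python
# Hypothetical sketch of the cutting step (not the actual dataset_generator.py).
import numpy as np

FS = 250                     # assumed sampling rate (Hz)
SONG_LEN = 2 * 60 * FS       # samples per 2 min song

def cut_into_songs(raw_signals, start, n_songs=10):
    """Cut raw zone signals into a list of n_songs dicts keyed by zone.

    raw_signals: dict mapping zone name (e.g. "T3") to a 1-D numpy array.
    start: index of the start of the music, found manually with script.py.
    """
    songs = []
    for i in range(n_songs):
        begin, end = start + i * SONG_LEN, start + (i + 1) * SONG_LEN
        songs.append({zone: sig[begin:end] for zone, sig in raw_signals.items()})
    return songs

# e.g. recording_louis = cut_into_songs({"T3": t3, "C3": c3, "CZ": cz}, start=34532)
```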
Each 20 min recording is an array of 10 songs.
Each song is a dictionary of the signals; the keys are the zones. You can access the recordings later by calling:
```python
from dataset_generator import *
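# Hypothetical usage (these names are assumptions, not the module's actual API):
# a recording is a list of 10 song dicts keyed by zone, so one song's auditory
# cortex signal might be read as:
#   t3_song_0 = recording_louis[0]["T3"]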
```

## Frequency analysis
We chose to use the multitaper method for the FFT-based spectral analysis of the signals.
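As a rough illustration only (the sampling rate and taper parameters below are assumptions, and this is not the project's implementation), a multitaper power spectral density can be obtained by averaging the periodograms computed with several DPSS tapers:

```python
# Sketch of a multitaper PSD estimate: taper the signal with several DPSS
# windows, FFT each tapered copy, and average the resulting power spectra.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(signal, fs=250.0, nw=4, n_tapers=7):
    """Return (freqs, psd) averaged over n_tapers DPSS tapers. fs is assumed."""
    signal = np.asarray(signal, dtype=float)
    n = signal.size
    tapers = dpss(n, nw, Kmax=n_tapers)              # shape (n_tapers, n)
    spectra = np.fft.rfft(tapers * signal, axis=1)   # one tapered FFT per taper
    psd = np.mean(np.abs(spectra) ** 2, axis=0) / fs
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd
```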
For more details, ask [Charles Masson](https://github.com/CharlesMasson).

## Unsupervised study
See the scripts `unsupervised.py` and `unsupervised_rythmic.py`.
The frequency powers of the 10 songs are plotted in order to look for patterns in the data.
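A hypothetical sketch of that inspection, using `scipy.signal.periodogram` as a stand-in for the project's multitaper estimate (the zone name, data structure, and sampling rate are assumptions):

```python
# Sketch: overlay the spectra of the 10 songs of one recording to eyeball patterns.
import matplotlib.pyplot as plt
from scipy.signal import periodogram

def plot_song_spectra(recording, zone="T3", fs=250.0):
    fig, ax = plt.subplots()
    for i, song in enumerate(recording):      # recording: list of 10 song dicts
        freqs, psd = periodogram(song[zone], fs=fs)
        ax.semilogy(freqs, psd, label=f"song {i}")
    ax.set(xlabel="Frequency (Hz)", ylabel="Power", xlim=(0, 45))
    ax.legend()
    fig.savefig(f"images/spectra_{zone}.png")
```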
The resulting plots are in the folder `images`.

## Supervised study
scikit-learn is used to try to predict whether a song is rhythmic or not.
The script can also be used interactively from the command prompt.
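The exact features and model are not documented here; as a hedged sketch, band-power features and a scikit-learn classifier could be wired up like this (band edges, labels, and the feature choice are assumptions):

```python
# Sketch: classify songs as rhythmic or not from EEG band-power features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def band_power(freqs, psd, lo, hi):
    """Total power of a spectrum between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def song_features(freqs, psd):
    """Classic EEG bands as a small feature vector (band edges are assumptions)."""
    bands = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 45)]  # delta ... gamma
    return np.array([band_power(freqs, psd, lo, hi) for lo, hi in bands])

# X: one feature vector per song, y: 1 if the song is rhythmic, 0 otherwise, e.g.
#   X = np.vstack([song_features(*multitaper_psd(song["T3"])) for song in songs])
#   y = np.array([...])
#   print(cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=3))
```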