
[![System tests](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/system_tests.yml/badge.svg?branch=main)](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/system_tests.yml)
[![Test and coverage](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/test_and_coverage.yml/badge.svg)](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/test_and_coverage.yml)
[![codecov](https://codecov.io/gh/cpp-lln-lab/bidsMReye/branch/main/graph/badge.svg?token=G5fm2kaloM)](https://codecov.io/gh/cpp-lln-lab/bidsMReye)
[![Documentation Status](https://readthedocs.org/projects/bidsmreye/badge/?version=latest)](https://bidsmreye.readthedocs.io/en/latest/?badge=latest)
[![License](https://img.shields.io/badge/license-GPL3-blue.svg)](./LICENSE)
[![PyPI version](https://badge.fury.io/py/bidsmreye.svg)](https://badge.fury.io/py/bidsmreye)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/bidsmreye)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Sourcery](https://img.shields.io/badge/Sourcery-enabled-brightgreen)](https://sourcery.ai)
[![All Contributors](https://img.shields.io/badge/all_contributors-2-orange.svg)](#contributors)
[![paper doi](https://img.shields.io/badge/paper-10.1038%2Fs41593--021--00947--w-blue)](https://doi.org/10.1038/s41593-021-00947-w)
[![zenodo DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.7493322.svg)](https://doi.org/10.5281/zenodo.7493322)

# bidsMReye

BIDS app for decoding gaze position from the eyeball MR-signal using
[deepMReye](https://github.com/DeepMReye/DeepMReye)
([1](https://doi.org/10.1038/s41593-021-00947-w)).

To be used on preprocessed BIDS derivatives (e.g.
[fMRIprep](https://github.com/nipreps/fmriprep) outputs).
No eye-tracking data required.

By default, bidsMReye uses a [pre-trained version](https://osf.io/mrhk9/) of
[deepMReye](https://github.com/DeepMReye/DeepMReye) trained on 5 datasets incl.
guided fixations ([2](https://doi.org/10.1038/sdata.2017.181)), smooth pursuit
([3](https://doi.org/10.1016/j.neuroimage.2018.04.012),[4](https://doi.org/10.1101/2021.08.03.454928),[5](https://doi.org/10.1038/s41593-017-0050-8))
and free viewing ([6](https://doi.org/10.1038/s41593-017-0049-1)). Other
pretrained versions are optional. Dedicated model training is recommended.

The pipeline automatically extracts the eyeball voxels.
These can also be used for other multivariate pattern
analyses in the absence of eye-tracking data.
The decoded gaze positions can then be used to compute eye movements.

Some basic quality control and outlier detection are also performed:

- for each run

![](https://github.com/cpp-lln-lab/bidsMReye/blob/main/docs/source/images/sub-01_task-auditory_space-MNI152NLin6Asym_desc-bidsmreye_eyetrack.png)

- at the group level

![](https://github.com/cpp-lln-lab/bidsMReye/blob/main/docs/source/images/group_eyetrack.png)

For more information, see the
[User Recommendations](https://deepmreye.slite.com/p/channel/MUgmvViEbaATSrqt3susLZ/notes/kKdOXmLqe).
If you have other questions, please reach out to the developer team.

## Install

Using the Docker image is recommended, as there are known installation issues
with deepMReye on some platforms (for example, Apple M1).

### Docker

#### Build

```bash
docker build --tag cpplab/bidsmreye:latest --file docker/Dockerfile .
```

#### Pull

Pull the latest docker image:

```bash
docker pull cpplab/bidsmreye:latest
```
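
Once the image is available, the app can be run through Docker. The sketch below is illustrative only: the bind-mount paths are placeholders for your own data, and it assumes the image entrypoint is the `bidsmreye` CLI.

```bash
# Illustrative only: replace the host paths with your own preprocessed
# derivatives (e.g. an fMRIprep output folder) and output location.
# Assumes the image entrypoint is the bidsmreye CLI.
docker run --rm -it \
    -v /path/to/derivatives/fmriprep:/data:ro \
    -v /path/to/outputs:/out \
    cpplab/bidsmreye:latest \
    /data /out participant all
```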

### Python package

You can also install the package from PyPI:

```bash
pip install bidsmreye
```
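
To check that the command line tool is available after installation:

```bash
bidsmreye --help
```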

#### Conda installation

**NOT TESTED YET**

To encapsulate bidsMReye in a virtual environment, install it with the following commands:

```bash
conda create --name bidsmreye python=3.10
conda activate bidsmreye
conda install pip
pip install bidsmreye
```

The tensorflow dependency supports both CPU and GPU instructions.

Note that you might need to install cuDNN first:

```bash
conda install -c conda-forge cudnn
```
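
If you want to check whether TensorFlow can see your GPU (optional; assumes TensorFlow is installed in the active environment), something like the following should work:

```bash
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```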

### Dev install

Clone this repository.

```bash
git clone https://github.com/cpp-lln-lab/bidsMReye.git
```

Then install the package:

```bash
cd bidsMReye
make install_dev
```
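
If `make` is not available on your system, an editable pip install is a possible fallback (this is an assumption about the packaging setup; the `make install_dev` target may also set up extra development tooling):

```bash
# Fallback sketch: editable install of the cloned package.
pip install -e .
```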

## Usage

## Requirements

bidsmreye requires your input fMRI data:

- to be minimally preprocessed (at least realigned),
- to have filenames and a structure that conform to a BIDS derivative dataset.

Two BIDS apps are available to generate this type of preprocessed data:

- [fmriprep](https://fmriprep.org/en/stable/)
- [bidspm](https://bidspm.readthedocs.io/en/latest/general_information.html)

Obviously, your fMRI data must include the eyes of your participants for bidsmreye to work.
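
As a rough sketch, a minimal input dataset could look like the following (file names are illustrative and follow fMRIprep-style conventions; your task and space entities will differ):

```
bids_dir/
├── dataset_description.json
└── sub-01/
    └── func/
        ├── sub-01_task-auditory_space-MNI152NLin6Asym_desc-preproc_bold.nii.gz
        └── sub-01_task-auditory_space-MNI152NLin6Asym_desc-preproc_bold.json
```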

### CLI

Type the following for more information:

```bash
bidsmreye --help
```

## Preparing the data

`prepare` means that bidsmreye will extract the data coming from the
eyes out of the fMRI images.

If your data is not in MNI space, bidsmreye will also register the data to MNI.

```bash
bidsmreye bids_dir output_dir participant prepare
```

## Computing the eye movements

`generalize` uses the extracted time series to predict the eye movements
using the default pre-trained model of deepMReye.

This will also generate a quality control report of the decoded eye movements.

```bash
bidsmreye bids_dir output_dir participant generalize
```

## Doing it all at once

`all` does "prepare" then "generalize".

```bash
bidsmreye bids_dir output_dir participant all
```

## Group level summary

To generate a group level quality control report:

```bash
bidsmreye bids_dir output_dir group qc
```
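
Putting it together, a full run on an fMRIprep output could look like this (the paths are placeholders for your own dataset):

```bash
# Placeholder paths: adapt to your own dataset layout.
bidsmreye /data/derivatives/fmriprep /data/derivatives/bidsmreye participant all
bidsmreye /data/derivatives/fmriprep /data/derivatives/bidsmreye group qc
```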

## Demo

Please see the [documentation](https://bidsmreye.readthedocs.io/en/latest/demo.html).

## Contributors ✨

Thanks goes to these wonderful people
([emoji key](https://allcontributors.org/docs/en/emoji-key)):

- Pauline Cabee: 💻 🤔 🚇
- Remi Gau: 💻 🤔 ⚠️ 🚧
- Ying Yang: 🐛 📓

This project follows the
[all-contributors](https://github.com/all-contributors/all-contributors)
specification. Contributions of any kind welcome!

If you train [deepMReye](https://github.com/DeepMReye/DeepMReye), or if you have
eye-tracking training labels and the extracted eyeball voxels, consider sharing
them to contribute to the [pretrained model pool](https://osf.io/mrhk9/).