{"id":13501456,"url":"https://github.com/simonarvin/eyeloop","last_synced_at":"2025-03-29T09:30:39.431Z","repository":{"id":56650226,"uuid":"276592908","full_name":"simonarvin/eyeloop","owner":"simonarvin","description":"EyeLoop is a Python 3-based eye-tracker tailored specifically to dynamic, closed-loop experiments on consumer-grade hardware.","archived":true,"fork":false,"pushed_at":"2022-10-20T20:51:11.000Z","size":268580,"stargazers_count":486,"open_issues_count":9,"forks_count":67,"subscribers_count":20,"default_branch":"master","last_synced_at":"2024-10-31T20:40:08.411Z","etag":null,"topics":["eye-tracking","neurology","neuroscience","psychology","senses","video-oculography","visual"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/simonarvin.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-07-02T08:35:39.000Z","updated_at":"2024-10-30T03:22:33.000Z","dependencies_parsed_at":"2023-01-20T16:46:58.646Z","dependency_job_id":null,"html_url":"https://github.com/simonarvin/eyeloop","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/simonarvin%2Feyeloop","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/simonarvin%2Feyeloop/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/simonarvin%2Feyeloop/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/simonarvin%2Feyeloop/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/o
wners/simonarvin","download_url":"https://codeload.github.com/simonarvin/eyeloop/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246166975,"owners_count":20734376,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["eye-tracking","neurology","neuroscience","psychology","senses","video-oculography","visual"],"created_at":"2024-07-31T22:01:38.344Z","updated_at":"2025-03-29T09:30:35.580Z","avatar_url":"https://github.com/simonarvin.png","language":"Python","readme":"# EyeLoop [![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0) [![contributions welcome](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/simonarvin/eyeloop/issues) [![Build Status](https://travis-ci.com/simonarvin/eyeloop.svg?branch=master)](https://travis-ci.com/simonarvin/eyeloop) ![version](https://img.shields.io/badge/version-0.35--beta-brightgreen) ![lab](https://img.shields.io/badge/yonehara-lab-blue) ![beta](https://img.shields.io/badge/-beta-orange)\n\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/logo.svg?raw=true\" width = \"280\"\u003e\n\u003c/p\u003e\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/eyeloop%20overview.svg?raw=true\"\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/sample_1.gif?raw=true\" align=\"center\" 
height=\"150\"\u003e\u0026nbsp; \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/sample_3.gif?raw=true\" align=\"center\" height=\"150\"\u003e\u0026nbsp; \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/sample_4.gif?raw=true\" align=\"center\" height=\"150\"\u003e\n  \u003c/p\u003e\n\nEyeLoop is a Python 3-based eye-tracker tailored specifically to dynamic, closed-loop experiments on consumer-grade hardware. Users are encouraged to contribute to EyeLoop's development.\n\n## Features ##\n- [x] **High-speed** \u003e 1000 Hz on non-specialized hardware (no dedicated processing units necessary).\n- [x] Modular, readable, **customizable**.\n- [x] **Open-source**, and entirely Python 3.\n- [x] **Works on any platform**, easy installation.\n\n## Overview ##\n- [How it works](#how-it-works)\n- [Getting started](#getting-started)\n- [Your first experiment](#designing-your-first-experiment)\n- [Data](#data)\n- [User interface](#graphical-user-interface)\n- [Authors](#authors)\n- [Examples](https://github.com/simonarvin/eyeloop/blob/master/examples)\n- [*EyeLoop Playground*](https://github.com/simonarvin/eyeloop_playground)\n\n## How it works ##\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/software%20logic.svg?raw=true\" width = \"500\"\u003e\n\u003c/p\u003e\n\nEyeLoop consists of two functional domains: the engine and the optional modules. The engine performs the eye-tracking, whereas the modules perform optional tasks, such as:\n\n- Experiments\n- Data acquisition\n- Importing video sequences to the engine\n\n\u003e The modules import or extract data from the engine, and are therefore called *Importers* and *Extractors*, respectively.\n\nOne of EyeLoop's most appealing features is its modularity: Experiments are built simply by combining modules with the core Engine. 
Thus, the Engine has one task only: to compute eye-tracking data based on an *imported* sequence, and offer the generated data for *extraction*.\n\n\u003e How does [the Engine](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/engine/README.md) work?\\\n\u003e How does [the Importer](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/importers/README.md) work?\\\n\u003e How does [the Extractor](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/extractors/README.md) work?\n\n## Getting started ##\n\n### Installation ###\n\nInstall EyeLoop by cloning the repository:\n```\ngit clone https://github.com/simonarvin/eyeloop.git\n```\n\n\u003eDependencies: ```python -m pip install -r requirements.txt```\n\n\u003eUsing pip:\n\u003e ```pip install .```\n\nYou may want to use a Conda or Python virtual environment when\ninstalling `eyeloop`, to avoid conflicts with your system dependencies.\n\n\u003eUsing pip and a virtual environment:\n\n\u003e ```python -m venv venv```\n\n\u003e ```source venv/bin/activate```\n\n\u003e ```(venv) pip install .```\n\nAlternatively, install the core dependencies individually:\n\n\u003e- numpy: ```python -m pip install numpy```\n\u003e- opencv: ```python -m pip install opencv-python```\n\nTo download full examples with footage, check out EyeLoop's playground repository:\n\n```\ngit clone https://github.com/simonarvin/eyeloop_playground.git\n```\n\n---\n\n### Initiation ###\n\nEyeLoop is initiated through the command-line utility `eyeloop`.\n```\neyeloop\n```\nTo access the video sequence, EyeLoop must be connected to an appropriate *importer class* module. Usually, the default opencv importer class (*cv*) is sufficient. 
For some machine vision cameras, however, a vimba-based importer (*vimba*) is necessary.\n```\neyeloop --importer cv/vimba\n```\n\u003e [Click here](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/importers/README.md) for more information on *importers*.\n\nTo perform offline eye-tracking, we pass the ```--video``` argument with the path of the video sequence:\n```\neyeloop --video [file]/[folder]\n```\n\u003cp align=\"right\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/models.svg?raw=true\" align=\"right\" height=\"150\"\u003e\n\u003c/p\u003e\n\nEyeLoop can be used on a multitude of eye types, including rodents, humans, and non-human primates. Specifically, users can tailor their eye-tracking session to any species using the ```--model``` argument.\n\n```\neyeloop --model ellipsoid/circular\n```\n\u003e In general, the ellipsoid pupil model is best suited for rodents, whereas the circular model is best suited for primates.\n\nTo learn how to optimize EyeLoop for your video material, see [*EyeLoop Playground*](https://github.com/simonarvin/eyeloop_playground).\n\nTo see all command-line arguments, pass:\n\n```\neyeloop --help\n```\n\n## Designing your first experiment ##\n\n\u003cp align=\"center\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/setup.svg?raw=true\" align=\"center\" height=\"250\"\u003e\n\u003c/p\u003e\n\nIn EyeLoop, experiments are built by stacking modules. By default, EyeLoop imports two base *extractors*, namely an FPS counter and a data acquisition tool. 
To add custom extractors, e.g., for experimental purposes, use the ```--extractors``` argument:\n\n```\neyeloop --extractors [file_path]/p (where p = file prompt)\n```\n\nInside the *extractor* file, or a composite Python file containing several *extractors*, define the list of *extractors* to be added:\n```python\nextractors_add = [extractor1, extractor2, ...]\n```\n\n*Extractors* are instantiated by EyeLoop at start-up. Then, at every subsequent time-step, the *extractor's* ```fetch()``` function is called by the engine.\n```python\nclass Extractor:\n    def __init__(self) -\u003e None:\n        ...\n    def fetch(self, core) -\u003e None:\n        ...\n```\n```fetch()``` gains access to all eye-tracking data in real-time via the *core* pointer.\n\n\u003e [Click here](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/extractors/README.md) for more information on *extractors*.\n\n### Open-loop example ###\n\nAs an example, we'll design a simple *open-loop* experiment in which the brightness of a PC monitor is linked to the phase of a sine wave. 
We create a new Python file, say \"*test_ex.py*\", and in it define the sine wave frequency and phase in the constructor:\n```python\nclass Experiment:\n    def __init__(self) -\u003e None:\n        self.frequency = ...\n        self.phase = 0\n```\nThen, using ```fetch()```, we shift the phase of the sine wave at every time-step and use it to control the brightness of an OpenCV render.\n```python\n    ...\n    def fetch(self, engine) -\u003e None:\n        # Advance the sine wave and map it to the range [0, 1].\n        self.phase += self.frequency\n        sine = numpy.sin(self.phase) * .5 + .5\n        # height and width are the dimensions of the display window.\n        brightness = numpy.ones((height, width), dtype=float) * sine\n        cv2.imshow(\"Experiment\", brightness)\n```\n\nTo add our test extractor to EyeLoop, we'll need to define an extractors_add array:\n```python\nextractors_add = [Experiment()]\n```\n\nFinally, we test the experiment by running the command:\n```\neyeloop --extractors path/to/test_ex.py\n```\n\n\u003e See [Examples](https://github.com/simonarvin/eyeloop/blob/master/examples) for demo recordings and experimental designs.\n\n\u003e For extensive test data, see [*EyeLoop Playground*](https://github.com/simonarvin/eyeloop_playground).\n\n\n## Data ##\nEyeLoop produces a JSON datalog for each eye-tracking session. The datalog's first column is the timestamp.\nThe next columns define the pupil (if tracked):\n\n```((center_x, center_y), radius1, radius2, angle)```\n\nThe next columns define the corneal reflection (if tracked):\n\n```((center_x, center_y), radius1, radius2, angle)```\n\nThe remaining columns contain any data produced by custom Extractor modules.\n\n\n## Graphical user interface ##\nThe default graphical user interface in EyeLoop is [*minimum-gui*](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/guis/minimum/README.md).\n\n\u003e EyeLoop is compatible with custom graphical user interfaces through its modular logic. 
[Click here](https://github.com/simonarvin/eyeloop/blob/master/eyeloop/guis/README.md) for instructions on how to build your own.\n\n## Running unit tests ##\n\nInstall the testing requirements by running, in a terminal:\n\n`pip install -r requirements_testing.txt`\n\nThen run tox: `tox`\n\nReports and results will be written to `/tests/reports`.\n\n\n## Known issues ##\n- [ ] Respawning/freezing windows when running *minimum-gui* in Ubuntu.\n\n## References ##\nIf you use any of this code or data, please cite [Arvin et al. 2021] ([article](https://www.frontiersin.org/articles/10.3389/fncel.2021.779628/full)).\n```latex\n\n@ARTICLE{Arvin2021-tg,\n  title    = \"{EyeLoop}: An open-source system for high-speed, closed-loop\n              eye-tracking\",\n  author   = \"Arvin, Simon and Rasmussen, Rune and Yonehara, Keisuke\",\n  journal  = \"Front. Cell. Neurosci.\",\n  volume   =  15,\n  pages    = \"494\",\n  year     =  2021\n}\n\n```\n\n## License ##\nThis project is licensed under the GNU General Public License v3.0. 
Note that the software is provided \"as is\", without warranty of any kind, express or implied.\n\n## Authors ##\n\n**Lead Developer:**\nSimon Arvin, sarv@dandrite.au.dk\n\u003cp align=\"right\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/constant.svg?raw=true\" align=\"right\" height=\"180\"\u003e\n    \u003c/p\u003e\n\n**Researchers:**\n\n- Simon Arvin, sarv@dandrite.au.dk\n- Rune Rasmussen, runerasmussen@biomed.au.dk\n- Keisuke Yonehara, keisuke.yonehara@dandrite.au.dk\n\n**Corresponding Author:**\nKeisuke Yonehara, keisuke.yonehara@dandrite.au.dk\u003cbr\u003e\u003cbr\u003e\n\n---\n\u003cp align=\"center\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/aarhusuniversity.svg?raw=true\" align=\"center\" height=\"40\"\u003e\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/dandrite.svg?raw=true\" align=\"center\" height=\"40\"\u003e\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/nordicembl.svg?raw=true\" align=\"center\" height=\"40\"\u003e\n\u003c/p\u003e\n\u003cp align=\"center\"\u003e\n    \u003ca href=\"http://www.yoneharalab.com\"\u003e\n    \u003cimg src=\"https://github.com/simonarvin/eyeloop/blob/master/misc/imgs/yoneharalab.svg?raw=true\" align=\"center\" height=\"18\"\u003e\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\n    \u003c/a\u003e\n    \u003c/p\u003e\n","funding_links":[],"categories":["Python","Psychometrics"],"sub_categories":["Tools for tests and experiments"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsimonarvin%2Feyeloop","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsimonarvin%2Feyeloop","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsimonarvin%2Feyeloop/lists"}