# awesome-gaze-estimation-new

Material about gaze estimation and gaze tracking: code, papers, and demos.

https://github.com/RichardoMrMu/awesome-gaze-estimation-new

## Some journals you may be interested in:
[Journal Home | JOV | ARVO Journals ](https://jov.arvojournals.org/)

[Journal of Eye Movement Research (JEMR)](https://bop.unibe.ch/jemr)

## Some conferences you may want to know about:
[ETRA](http://etra.acm.org/2021/index.html) Eye-tracking Research and Applications

* Focused on computer science, algorithms, and interaction

[Gaze Estimation and Prediction in the Wild - workshop](https://gazeworkshop.github.io/2021/)
* Organized by [Xucong Zhang](https://ait.ethz.ch/people/zhang/) and [Seonwook Park](https://ait.ethz.ch/people/spark/) from ETH, [Hyung Jin Chang](https://hyungjinchang.wordpress.com/) from the University of Birmingham, and others

[COGAIN (Communication by Gaze Interaction)](https://www.cogain.org/)

* Focuses on assistive gaze interaction for people with disabilities; in recent years it has been co-located with ETRA

[VSS (Vision Science Society)](https://www.visionsciences.org/)

* Focused on psychology, physiology, and neuroscience, as well as medicine and ophthalmology.

[ECVP (European Conference on Visual Perception) ](http://ecvp.org/)

* Focused on psychology, neuroscience, and cognitive science

ECEM (European Conference on Eye Movements)

## Some researchers you may be interested in:
**Advanced Interactive Technologies** - Institute for Intelligent Interactive Systems, ETH Zürich - [AIT](https://ait.ethz.ch/index.php)

* [Xucong Zhang](https://ait.ethz.ch/people/zhang/)

* [Yufeng Zheng](https://ait.ethz.ch/people/zhengyuf/)

* [Seonwook Park](https://ait.ethz.ch/people/spark/)

[**Distributed Systems Group**](http://www.vs.inf.ethz.ch/) - ETH Zurich

**Perception & Hybrid Interaction (PHI) for Augmented Intelligence (AI)** - Beihang University - [PHI](http://phi-ai.org/default.htm)

* [Feng Lu](http://phi-ai.org/members/default.htm)

**Cognitive Interaction Technology Center of Excellence** - Uni Bielefeld - [CITEC ](https://www.cit-ec.de/en)

* Dr. rer. nat. Thies Pfeiffer (Dipl. Inform.)

**Usable Security and Privacy** - Bundeswehr University

* [Prof. Dr. Florian Alt ](http://www.florian-alt.org/academic/)

**Medieninformatik** - University of Glasgow

* Dr. Mohamed Khamis

**Human-Computer Interaction and Cognitive Systems** - Institute for Visualization and Interactive Systems (VIS), University of Stuttgart

* Prof. Dr. Andreas Bulling

**Group for Media Informatics** - University of Munich

**Perception Engineering, Department of Computer Science** - University of Tübingen

* Prof. Dr. Enkelejda Kasneci

**Quality and Usability Lab** - TU Berlin

* [Dr.-Ing. Jan-Niklas Voigt-Antons](https://www.qu.tu-berlin.de/menue/team/senior_researchers/antons_jan_niklas/) (diploma in psychology)

**Engineering Psychology and Applied Cognitive Research** - TU Dresden

* Prof. Dr. Sebastian Pannasch

**General Psychology** - University of Ulm

* [Prof. Dr. Anke Huckauf ](https://www.uni-ulm.de/in/psy-allg/team/anke-huckauf/)

## Some studies and tutorials

* [kasprowski/etra2021](https://github.com/kasprowski/etra2021) ETRA 2021 tutorial: deep learning in the eye-tracking world (a minimal model sketch follows below)
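
As a rough illustration of what such tutorials cover, here is a minimal PyTorch sketch (not the tutorial's actual code) of appearance-based gaze estimation: a small CNN that regresses a (pitch, yaw) gaze direction from a normalized grayscale eye patch. The 36×60 input size and the architecture are assumptions chosen only for this example.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Tiny CNN regressing gaze direction (pitch, yaw) from a 36x60 eye image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 9 * 15, 128), nn.ReLU(),
            nn.Linear(128, 2),  # (pitch, yaw) in radians
        )

    def forward(self, x):  # x: (N, 1, 36, 60)
        return self.head(self.features(x))

model = GazeNet()
eyes = torch.randn(8, 1, 36, 60)   # a fake batch of normalized eye patches
gaze = torch.randn(8, 2) * 0.3     # fake (pitch, yaw) labels in radians
loss = nn.functional.mse_loss(model(eyes), gaze)
loss.backward()                    # gradients for one toy training step
print(f"toy loss: {loss.item():.4f}")
```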

## Some companies you may be interested in:

* [Blickshift](https://www.blickshift.com/) in Stuttgart

* [Pupil labs](https://pupil-labs.com/) in Berlin

* [七鑫易维 (7invensun)](https://www.7invensun.com/)

* [青研科技 (Qingtech)](http://www.qingtech.com.cn/jj/index_14.aspx)

## Some interesting web demos

Eye tracking based on the computer's front-facing camera; you can upload your own images to test it: https://app.gazerecorder.com/
### 3d gaze vector

* [清帆科技 (QingFan)](https://www.qingfan.com/resources) gaze direction tracking
* [GazeRecorder](https://gazerecorder.com/download-gazerecorder/) Webcam Eye Tracking
* [PyGaze](http://www.pygaze.org/) use the PyGaze toolbox for gaze tracking
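
To make the split between this subsection and the next concrete, here is a minimal NumPy sketch of how a 3D gaze direction, given as (pitch, yaw) angles, becomes a 2D point of gaze by intersecting the gaze ray with a screen plane. The coordinate convention (camera looking along +z, screen on the plane z = 0) and the numbers are assumptions for illustration only.

```python
import numpy as np

def pitchyaw_to_vector(pitch, yaw):
    """Gaze angles in radians -> 3D unit gaze vector (assumed MPIIGaze-style convention)."""
    return np.array([
        -np.cos(pitch) * np.sin(yaw),  # x: right
        -np.sin(pitch),                # y: down
        -np.cos(pitch) * np.cos(yaw),  # z: toward the camera/screen
    ])

def point_of_gaze(eye_center, gaze_vec, screen_z=0.0):
    """Intersect the gaze ray with the plane z = screen_z to get a 2D point of gaze."""
    t = (screen_z - eye_center[2]) / gaze_vec[2]
    hit = eye_center + t * gaze_vec
    return hit[:2]  # (x, y) on the screen plane, same units as eye_center

# Example: eye 0.6 m in front of the screen plane, looking slightly down and to the side.
eye = np.array([0.0, 0.0, 0.6])
g = pitchyaw_to_vector(np.radians(-5.0), np.radians(10.0))
print(point_of_gaze(eye, g))  # 2D point of gaze in meters on the screen plane
```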

### 2d points of gaze
* [lookie-lookie](https://cpury.github.io/lookie-lookie/) JavaScript demo with TensorFlow.js

* [RealEye demo](https://www.realeye.io/test/172d467f-b8bf-45e4-b11a-d2f24f788d12/run)
* [digital-twins](https://www.edusense.io/digital-twins) classroom eye tracking; here is the [code](https://github.com/edusense/ClassroomDigitialTwins) and a [YouTube video](https://www.youtube.com/watch?v=N2nW7sHL2Ng)
* [WebGazer](https://github.com/brownhci/WebGazer)
* [GazeTracking](https://github.com/antoinelame/GazeTracking)
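
The last item above, [GazeTracking](https://github.com/antoinelame/GazeTracking), is a Python library that tracks pupils and gaze direction from a webcam. A minimal usage sketch, based on the example in its README (install its dependencies per that repository; exact method names may differ between versions):

```python
import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

while True:
    ok, frame = webcam.read()
    if not ok:
        break
    gaze.refresh(frame)             # detect face, eyes, and pupils in this frame
    frame = gaze.annotated_frame()  # frame with pupil positions drawn on it

    if gaze.is_right():
        text = "Looking right"
    elif gaze.is_left():
        text = "Looking left"
    elif gaze.is_center():
        text = "Looking center"
    else:
        text = "Blinking or not detected"

    cv2.putText(frame, text, (30, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("GazeTracking demo", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

webcam.release()
cv2.destroyAllWindows()
```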

## Some competitions you should know about
* [ETH-XGaze Challenge](https://competitions.codalab.org/competitions/28930), based on the [ETH-XGaze](https://ait.ethz.ch/projects/2020/ETH-XGaze/) project; you can find the [code](https://github.com/xucong-zhang/ETH-XGaze) here. The ETH-XGaze dataset comes in several versions (face patch images at 224×224 pixels, about 130 GB; face patch images at 448×448 pixels, about 497 GB; and the full raw images, about 7 TB). The dataset is available on request: please register [here](https://docs.google.com/forms/d/e/1FAIpQLScaGNYTVI7-h8ZHu9y_kQzhC1Ab4fo4fXtRDMNZ5y2wpLx3MA/viewform?usp=sf_link) to gain access to the dataset.
* [EVE Challenge](https://competitions.codalab.org/competitions/28954), based on [Towards End-to-end Video-based Eye-tracking](https://ait.ethz.ch/projects/2020/EVE/); you can find the code [here](https://github.com/swook/EVE/). The EVE dataset is available on request: please fill in [this Google Form](https://docs.google.com/forms/d/e/1FAIpQLSfZtMVpNbWV9yHX5toXVzVpDpOENy-SB7XfMIx5V6u7sITuNg/viewform?usp=sf_link) to gain access to the dataset.
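
A common accuracy metric in these challenges is the angular error between predicted and ground-truth gaze directions. Below is a minimal NumPy sketch of mean angular error in degrees, assuming predictions and labels are given as 3D gaze vectors; it illustrates the metric and is not the challenges' official evaluation code.

```python
import numpy as np

def mean_angular_error_deg(pred, gt):
    """Mean angle (degrees) between rows of two (N, 3) arrays of gaze vectors."""
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    cos_sim = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_sim)).mean())

# Toy example with made-up predictions and ground truth.
pred = np.array([[0.0, 0.0, -1.0], [0.1, 0.0, -1.0]])
gt   = np.array([[0.0, 0.05, -1.0], [0.0, 0.0, -1.0]])
print(f"mean angular error: {mean_angular_error_deg(pred, gt):.2f} deg")
```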