## EYEGESTURES
EyeGestures is an open-source eye-tracking software library that uses native webcams and phone cameras. The aim of the library is to make eye tracking and eye-driven interfaces accessible without requiring expensive dedicated hardware.
Our [Mission](https://github.com/NativeSensors/EyeGestures/blob/main/MISSION.md)!
# EyeGesturesLite
EyeGesturesLite is the JavaScript implementation of the [EyeGestures algorithm](https://github.com/NativeSensors/EyeGestures). If you need the Python version, check the original repository.
### How does it work?
It is a gaze tracker that uses machine learning and built-in cameras (such as a webcam) to provide gaze tracking for the user. It includes a built-in calibration mode that displays 20 red circles for the user to focus on, along with a blue cursor that follows the user's gaze. During calibration, the cursor gradually follows the user's gaze more and more closely; by the end of the 20 points, it should be able to follow the user's gaze on its own.
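The calibration state is exposed through the tracking callback described in the setup below, so a page can show its own progress indicator while calibration is ongoing. A minimal sketch, assuming the `(point, calibration)` callback from the interface code below and an illustrative `status` element:
```JS
// Sketch: react to the calibration flag inside the gaze callback.
// 'status' is an illustrative element id, not part of the library.
function onPoint(point, calibration) {
  const status = document.getElementById('status');
  if (!calibration) {
    status.textContent = 'Calibrating... keep looking at the red circles';
  } else {
    status.textContent = `Gaze at (${Math.round(point[0])}, ${Math.round(point[1])})`;
  }
}
```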
### ⚙️ Try:
[EyeGesturesLite](https://eyegestures.com/tryLite)
### 🔧 Build your own:
#### CDN:
0. External dependency CDNs. The links below are an assumption based on the library's use of MediaPipe FaceMesh; check the repository for the exact URLs:
```html
<!-- Assumed MediaPipe dependencies; verify the exact URLs in the repository -->
<script src="https://cdn.jsdelivr.net/npm/@mediapipe/camera_utils/camera_utils.js" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/face_mesh.js" crossorigin="anonymous"></script>
```
1. You need two CDN links for EyeGesturesLite itself, a script and its stylesheet (the URLs below are indicative; check the repository for the exact links):
```html
<!-- Indicative EyeGesturesLite links; verify the exact URLs in the repository -->
<link rel="stylesheet" href="https://eyegestures.com/eyegestures.css" />
<script src="https://eyegestures.com/eyegestures.js"></script>
```
2. Place a `video` element (which can be hidden) somewhere in the page, together with status and error divs (they can stay invisible). For example (the `status`/`error` ids are only a suggestion; the video id must match the one passed to the constructor in the next step):
```html
<video id="video" autoplay playsinline style="display: none;"></video>
<div id="status">Initializing...</div>
<div id="error" style="display: none;"></div>
```
3. Then add the JavaScript interface code:
```html
<script>
  function onPoint(point, calibration) {
    point[0];    // x
    point[1];    // y
    calibration; // true for calibrated data, false while calibration is ongoing
  }

  const gestures = new EyeGestures('video', onPoint);
  // gestures.invisible(); // to disable the blue tracker
  gestures.start();
</script>
```
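Putting the steps together, a minimal page could look like this (dependency and library URLs as assumed above):
```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Dependency and library URLs as assumed in steps 0 and 1 -->
    <script src="https://cdn.jsdelivr.net/npm/@mediapipe/camera_utils/camera_utils.js" crossorigin="anonymous"></script>
    <script src="https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/face_mesh.js" crossorigin="anonymous"></script>
    <link rel="stylesheet" href="https://eyegestures.com/eyegestures.css" />
    <script src="https://eyegestures.com/eyegestures.js"></script>
  </head>
  <body>
    <video id="video" autoplay playsinline style="display: none;"></video>
    <div id="status">Initializing...</div>
    <script>
      // Log gaze coordinates; calibration flips to true once calibrated.
      const gestures = new EyeGestures('video', (point, calibration) => {
        console.log(point[0], point[1], calibration);
      });
      gestures.start();
    </script>
  </body>
</html>
```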
#### NPM package [WIP]:
1. Install the npm package:
```
npm install eyegestures
```
2. Place a `video` element in your DOM (it can be hidden), together with status and error divs (they can stay invisible), mirroring the markup from the CDN setup:
```html
<video id="video_element_id" autoplay playsinline style="display: none;"></video>
<div id="status">Initializing...</div>
<div id="error" style="display: none;"></div>
```
3. Try the import:
```JS
import EyeGestures from 'eyegestures';

const gestures = new EyeGestures("video_element_id", (point, calibration) => {
  /* point, calibration */
});
// after this, call: gestures.start();
console.log(gestures);
```
> [!WARNING]
> EyeGestures needs the DOM to operate, and its constructor expects the id string of a video element (the camera feed).
>
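Because of that, in a bundled app it is safest to construct and start the tracker only once the DOM is ready. A minimal sketch, with the element id matching the markup above:
```JS
import EyeGestures from 'eyegestures';

// Construct and start only after the DOM (and the video element) exists.
window.addEventListener('DOMContentLoaded', () => {
  const gestures = new EyeGestures('video_element_id', (point, calibration) => {
    console.log('gaze:', point[0], point[1], 'calibrated:', calibration);
  });
  gestures.start();
});
```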
### Rules of use
You can use it free of charge as long as you keep our logo. If you want to remove the logo, contact: contact@eyegestures.com.
### 📇 Find us:
- [discord](https://discord.gg/FvagCX8T4h)
- [twitter](https://twitter.com/PW4ltz)
- [daily.dev](https://dly.to/JEe1Sz6vLey)
- email: contact@eyegestures.com
### Troubleshooting:
### 💻 Contributors
### 💵 Support the project
We will be extremely grateful for your support: it helps keep the server running and fuels our brains with coffee.
Support the project on Polar (in exchange we provide access to alpha versions!).