Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/charliegerard/gaze-detection
👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.
creative-coding eye-detection eye-tracking frontend human-computer-interaction javascript machine-learning tensorflow tensorflowjs tfjs
- Host: GitHub
- URL: https://github.com/charliegerard/gaze-detection
- Owner: charliegerard
- License: gpl-3.0
- Created: 2021-01-30T13:20:15.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2021-06-24T17:01:37.000Z (over 3 years ago)
- Last Synced: 2024-04-14T04:35:30.215Z (8 months ago)
- Topics: creative-coding, eye-detection, eye-tracking, frontend, human-computer-interaction, javascript, machine-learning, tensorflow, tensorflowjs, tfjs
- Language: JavaScript
- Homepage: https://gaze-keyboard.netlify.app/
- Size: 6.32 MB
- Stars: 579
- Watchers: 19
- Forks: 40
- Open Issues: 4
- Metadata Files:
  - Readme: README.md
  - Funding: .github/FUNDING.yml
  - License: LICENSE.md
Awesome Lists containing this project
- awesome-coding-by-voice - Gaze-detection - Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!
- awesome-starred - charliegerard/gaze-detection - 👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences. (tensorflow)
README
# Gaze-detection
Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences!
## Demo
Visit [https://gaze-keyboard.netlify.app/](https://gaze-keyboard.netlify.app/) _(Works well on mobile too!!)_ 😃
![](gaze-demo.gif)
_Inspired by the Android application ["Look to speak"](https://play.google.com/store/apps/details?id=com.androidexperiments.looktospeak)._
Uses TensorFlow.js's [face landmark detection model](https://www.npmjs.com/package/@tensorflow-models/face-landmarks-detection).
## Detection
This tool detects whether the user is looking right, left, up, or straight ahead.
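As a rough orientation, here is a minimal sketch of how those four directions could be mapped to actions in an app. The `handleGaze` helper and the actions in it are hypothetical and purely illustrative; the direction strings match the values returned by `getGazePrediction` shown further down.

```js
// Hypothetical helper: maps a gaze direction ('RIGHT', 'LEFT', 'TOP' or
// 'STRAIGHT') to an application action. The actions here are placeholders.
const handleGaze = (direction) => {
  switch (direction) {
    case "RIGHT":
      // e.g. move a selection to the right
      break;
    case "LEFT":
      // e.g. move a selection to the left
      break;
    case "TOP":
      // e.g. confirm the current selection
      break;
    case "STRAIGHT":
    default:
      // e.g. do nothing while the user looks straight at the screen
      break;
  }
};
```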
## How to use
### Install
As a module:
```bash
npm install gaze-detection --save
```

### Code sample
Start by importing it:
```js
import gaze from "gaze-detection";
```

Load the machine learning model:
```js
await gaze.loadModel();
```

Then, set up the camera feed needed for the detection. The `setUpCamera` method needs a `video` HTML element and, optionally, a camera device ID if you are using a camera other than the default webcam.
```js
const videoElement = document.querySelector("video");

const init = async () => {
  // Using the default webcam
  await gaze.setUpCamera(videoElement);

  // Or, using another camera input device
  const mediaDevices = await navigator.mediaDevices.enumerateDevices();
  const camera = mediaDevices.find(
    (device) =>
      device.kind === "videoinput" &&
      device.label.includes(/* The label from the list of available devices */)
  );
  await gaze.setUpCamera(videoElement, camera.deviceId);
};
```

Run the predictions:
```js
let raf;

const predict = async () => {
  const gazePrediction = await gaze.getGazePrediction();
  console.log("Gaze direction: ", gazePrediction); // returns 'RIGHT', 'LEFT', 'STRAIGHT' or 'TOP'
  if (gazePrediction === "RIGHT") {
    // do something when the user looks to the right
  }
  raf = requestAnimationFrame(predict);
};

predict();
```

Stop the detection:
```js
cancelAnimationFrame(raf);
```
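Putting the steps together, here is a minimal end-to-end sketch. It assumes a bundler resolves the `gaze-detection` import and that the page contains a `<video>` element plus a stop button; the `#stop` button id and the reaction to `"LEFT"` are hypothetical, shown only to illustrate the flow.

```js
import gaze from "gaze-detection";

const videoElement = document.querySelector("video");
let raf;

const start = async () => {
  // Load the model, then hook up the default webcam
  await gaze.loadModel();
  await gaze.setUpCamera(videoElement);

  const predict = async () => {
    const direction = await gaze.getGazePrediction();
    if (direction === "LEFT") {
      // react to the user looking left (placeholder)
    }
    raf = requestAnimationFrame(predict);
  };

  predict();
};

// Hypothetical stop button wiring, for illustration only
document.querySelector("#stop").addEventListener("click", () => {
  cancelAnimationFrame(raf);
});

start();
```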