Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/szymciem8/openleap
Hand gesture Python module
- Host: GitHub
- URL: https://github.com/szymciem8/openleap
- Owner: szymciem8
- License: mit
- Created: 2021-10-11T15:38:18.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2022-01-27T10:10:34.000Z (almost 3 years ago)
- Last Synced: 2024-10-10T08:21:06.363Z (3 months ago)
- Topics: gesture-recognition, library, machine-learning, mediapipe, opencv, python, scikit-learn
- Language: Jupyter Notebook
- Homepage:
- Size: 78.1 MB
- Stars: 2
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# OpenLeap
![python][] ![license][] [![pypi][]](https://pypi.org/project/openleap/)

## Table of contents
- [OpenLeap](#openleap)
- [Table of contents](#table-of-contents)
- [General Info](#general-info)
- [Technologies](#technologies)
- [Setup](#setup)
- [Simple Example](#simple-example)
- [Access Hand Information](#access-hand-information)
- [Example](#example)
- [Another Example](#another-example)

## General Info
OpenLeap is an open source project that allows you to add hand gesture control to your Python projects.

## Technologies
The project was created with the following technologies:
- Python
- OpenCV
- MediaPipe
- scikit-learn

## Setup
OpenLeap can be installed using pip, as shown below.

```
$ pip install openleap
```

## Simple Example
Test the openleap controller with an example program. The code below will create an OpenCV window showing the feed from the camera.
```
import openleap

controller = openleap.OpenLeap(screen_show=True,
                               screeen_type='BLACK',
                               show_data_on_image=True,
                               show_data_in_console=True,
                               gesture_model='sign_language')

controller.loop()
```
OpenLeap returns the relative position of each hand, the distance between the thumb tip and the index finger tip, the rotation angle about the wrist point, and the recognized gesture. There are two models for gesture recognition: the first recognizes whether the hand is open or closed into a fist, while the second recognizes the sign language alphabet.
The OpenLeap object can be created with a couple of options, described below; a short example follows the list.
- **screen_show** - if set to True, a window with the camera feed will be created.
- **screen_type** - background type, "CAM" or "BLACK".
- **show_data_on_image** - if set to True, hand data is drawn on the camera image.
- **show_data_in_console** - if set to True, hand data is printed to the console.
- **gesture_model** - choose the gesture recognition model, either "basic" or "sign_language".
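
For instance, a controller that uses the simpler "basic" model could be created as in the sketch below; it is a minimal illustration that only uses the options documented above.

```
import openleap

# The "basic" model only distinguishes an open hand from a closed fist.
controller = openleap.OpenLeap(screen_show=True,
                               show_data_on_image=True,
                               show_data_in_console=False,
                               gesture_model='basic')

controller.loop()
```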
## Access Hand Information

Recognized gestures, hand position, tilt and so on are stored in a dictionary called `data`, which consists of two dataclass objects, one for the right hand and one for the left. The dataclass has the following structure:
```
@dataclass
class Data:
x : float = 0
y : float = 0
z : float = 0
distance: float = 0.0
angle: float = 0.0
gesture: str = None
```

The dataclass containing all of the data above is continuously updated in the **main()** or **loop()** function, depending on which one is used.
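
For instance, the position and tilt of the right hand can be read directly from these fields; the sketch below only uses the fields of the `Data` dataclass shown above.

```
right = controller.data['right']

# Relative position of the right hand.
print(f'Position: x={right.x}, y={right.y}, z={right.z}')

# Rotation angle around the wrist point and thumb-index distance.
print(f'Angle: {right.angle}, distance: {right.distance}')
```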
### Example
```
if controller.data['right'].gesture == 'open':
print('Right hand is opened!')
elif controller.data['right'].gesture == 'fist':
print('Right hand is closed!')
```

### Another Example
```
if controller.data['right'].distance < 20:
print('Click has been detected!')
```
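
If more control over the frame loop is needed, the checks above can be combined in a loop of your own. The sketch below assumes that **main()** processes a single camera frame and updates the `data` dictionary, as suggested by the note about **main()** and **loop()** above; check this against the library before relying on it.

```
import openleap

controller = openleap.OpenLeap(screen_show=True,
                               show_data_in_console=False,
                               gesture_model='basic')

while True:
    # Assumed to process one frame and refresh controller.data.
    controller.main()

    right = controller.data['right']

    if right.gesture == 'fist':
        print('Right hand is closed!')
    if right.distance < 20:
        print('Click has been detected!')
```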

[python]:https://img.shields.io/pypi/pyversions/openleap
[pypi]:https://img.shields.io/pypi/v/openleap
[license]:https://img.shields.io/github/license/szymciem8/OpenLeap