Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/sn1027/hand-gesture-recognition-using-machine-learning-and-mediapipe
This hand gesture recognition project uses MediaPipe to recognize various hand gestures. The user can custom-train a model on any number of gestures.
computer-vision customization custommodels handgesture-recognition machine-learning mediapipe modeltraining scikit-learn scikitlearn-machine-learning
Last synced: about 1 month ago
JSON representation
- Host: GitHub
- URL: https://github.com/sn1027/hand-gesture-recognition-using-machine-learning-and-mediapipe
- Owner: SN1027
- Created: 2024-11-06T17:28:23.000Z (about 2 months ago)
- Default Branch: main
- Last Pushed: 2024-11-06T18:06:05.000Z (about 2 months ago)
- Last Synced: 2024-11-23T07:07:30.769Z (about 1 month ago)
- Topics: computer-vision, customization, custommodels, handgesture-recognition, machine-learning, mediapipe, modeltraining, scikit-learn, scikitlearn-machine-learning
- Language: Python
- Homepage:
- Size: 1.04 MB
- Stars: 2
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
**Hand Gesture Recognition using Machine Learning**
This project aims to harness the power of computer vision and machine learning to recognize and translate hand gestures in real time. By creating an intuitive and accessible platform, we seek to empower disabled individuals, enabling seamless interaction in everyday situations, whether in educational settings, public services, or personal communication.

**About the repository files and their order of execution:**
**1) Handlandmarker.py**
This program accesses the camera and displays the skeletal landmark coordinates of any detected hands.
This is run to check that the camera is accessed and configured properly.

**2) Handlandmarker_model.py**
This program custom-trains various hand gestures shown to the camera, with custom names entered by the user.
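Each training sample pairs the gesture name typed by the user with the 21 hand landmarks that MediaPipe reports. A sketch of building such a labelled row (the helper names here are hypothetical, not from the repo):

```python
import csv

def landmarks_to_row(label, landmarks):
    """Flatten 21 (x, y, z) landmark tuples into one labelled CSV row."""
    row = [label]
    for x, y, z in landmarks:
        row.extend([x, y, z])
    return row

def append_row(path, row):
    """Append one sample to the locally stored training file."""
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow(row)

# Example: a fake hand with all 21 landmarks at the origin.
row = landmarks_to_row('thumbs_up', [(0.0, 0.0, 0.0)] * 21)
append_row('Handmarks.csv', row)  # 1 label + 21 * 3 coordinates = 64 values
```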
The captured gestures are processed into a CSV file ('Handmarks.csv'), which is stored locally for training a model.

**3) Handlandmarktrain.py**
This program is run to train a local model.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression, RidgeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

pipelines = {
    'lr': make_pipeline(StandardScaler(), LogisticRegression()),
    'rc': make_pipeline(StandardScaler(), RidgeClassifier()),
    'rf': make_pipeline(StandardScaler(), RandomForestClassifier()),
    'gb': make_pipeline(StandardScaler(), GradientBoostingClassifier())
}
```

This program develops four models based on the algorithms above.
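The `fit_models` dictionary used below is not shown in the README; presumably it is produced by fitting each pipeline on a training split of Handmarks.csv. A sketch with synthetic data standing in for the real landmarks:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression, RidgeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for Handmarks.csv: 40 samples of 63 features
# (21 landmarks x 3 coordinates) across two gesture labels.
rng = np.random.default_rng(0)
X = rng.random((40, 63))
y = np.array(['open_palm', 'fist'] * 20)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

pipelines = {
    'lr': make_pipeline(StandardScaler(), LogisticRegression()),
    'rc': make_pipeline(StandardScaler(), RidgeClassifier()),
    'rf': make_pipeline(StandardScaler(), RandomForestClassifier()),
    'gb': make_pipeline(StandardScaler(), GradientBoostingClassifier()),
}

# Fit every pipeline on the same training split.
fit_models = {algo: pipe.fit(X_train, y_train) for algo, pipe in pipelines.items()}
```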
The user can then compare the accuracy score of each trained model and choose the best fit.
```python
from sklearn.metrics import accuracy_score
import pickle

for algo, model in fit_models.items():
    pred = model.predict(X_test)
    print(algo, accuracy_score(y_test, pred))

with open('Hand_test_new.pkl', 'wb') as f:
    pickle.dump(fit_models['rf'], f)
```

The program then creates a pickle file (the trained model), which can be used to recognize gestures in real time.
**4) handmarkeroutput.py**
This program detects and recognizes, in real time, the gestures that were previously trained into the model.

**I hope this code is useful to developers as well as students. Drop a like as an appreciation. Thanks in advance ;)**