Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/brianseong99/vrdance
VR hardware device & AI model that tracks a user's movement and maps it onto a VR avatar's movement.
- Host: GitHub
- URL: https://github.com/brianseong99/vrdance
- Owner: BrianSeong99
- Created: 2023-04-17T03:56:04.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-10-22T04:31:39.000Z (over 1 year ago)
- Last Synced: 2024-11-08T13:56:20.623Z (3 months ago)
- Language: Jupyter Notebook
- Homepage:
- Size: 72.9 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# VRDance (Let's TAICHI)
The intent was to develop a motion-tracking hardware device and train an LSTM model for mapping physical movement onto VR avatar movement. (But because of delays, we scoped down from dance to Tai Chi, lol.)
The hardware uses an MPU9250 for motion detection and an ESP32 for data transmission. After calibration, motion data is read live from the MPU9250 and sent to the Firebase cloud every 100 ms.
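The 100 ms read-and-send loop described above can be sketched as follows. This is a minimal Python illustration, not the actual ESP32 firmware: `read_imu` is a hypothetical stand-in for a calibrated sensor read, and the transport is injected as a callback (on the real device it would be, e.g., a PUT to the Firebase Realtime Database REST API):

```python
import time
from typing import Callable

def read_imu() -> dict:
    """Stand-in for a calibrated MPU9250 read (accel + gyro axes).
    On the real device this would come over I2C; values here are dummies."""
    return {"ax": 0.0, "ay": 0.0, "az": 9.8, "gx": 0.0, "gy": 0.0, "gz": 0.0}

def stream_samples(send: Callable[[dict], None],
                   n_samples: int,
                   period_s: float = 0.1) -> None:
    """Read one sample and hand it to `send` every `period_s` seconds (100 ms)."""
    for _ in range(n_samples):
        t0 = time.monotonic()
        sample = read_imu()
        sample["t"] = t0          # timestamp each reading
        send(sample)              # e.g. upload to Firebase
        # sleep out whatever remains of the 100 ms window
        time.sleep(max(0.0, period_s - (time.monotonic() - t0)))
```

Injecting `send` keeps the timing loop testable without any network access; swapping in a real Firebase call does not change the loop.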
I trained a simple LSTM model that takes the hardware's movement data as input, with target outputs produced by a webcam-based pose-detection model driving a Unity avatar (ThreeDPoseUnityBarracuda).
The result is a rough but working model that translates the physical movement of a user wearing the hardware devices into VR avatar movements.
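Training a sequence model like this requires pairing windows of IMU readings with the pose targets recorded at the same time. A minimal sketch of that windowing step, assuming hypothetical shapes (6 IMU features per reading, an arbitrary number of pose dimensions; the repo's actual notebook may slice differently):

```python
import numpy as np

def make_sequences(imu: np.ndarray, pose: np.ndarray, window: int = 25):
    """Slice time-synchronized arrays into LSTM-ready training pairs.

    imu:  (T, n_imu_features)  raw hardware readings
    pose: (T, n_pose_dims)     avatar pose targets from the webcam model
    Returns X of shape (T - window, window, n_imu_features) and
    y of shape (T - window, n_pose_dims): each window of IMU readings
    predicts the pose at the step right after the window ends.
    """
    X = np.stack([imu[i:i + window] for i in range(len(imu) - window)])
    y = pose[window:]
    return X, y
```

The resulting `X`/`y` pair can be fed directly to any LSTM implementation (e.g. a Keras `LSTM` layer followed by a `Dense` output of size `n_pose_dims`).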
![](./pics/01.jpg)
![](./pics/02.jpg)
![](./pics/03.jpg)
![](./pics/04.jpg)
![](./pics/05.jpg)
![](./pics/06.jpg)
![](./pics/07.jpg)
![](./pics/08.jpg)
![](./pics/09.jpg)
![](./pics/10.jpg)
![](./pics/11.jpg)