https://github.com/tonysy/radar_project
"Use 3d-cnn to recognize gesture with lidar"
- Host: GitHub
- URL: https://github.com/tonysy/radar_project
- Owner: tonysy
- Created: 2017-03-12T16:11:53.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2017-05-02T08:52:10.000Z (about 8 years ago)
- Last Synced: 2025-01-06T02:56:58.366Z (5 months ago)
- Topics: deep-learning
- Language: Python
- Homepage:
- Size: 128 MB
- Stars: 5
- Watchers: 3
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
# Radar Gesture Recognition
## Requirements
- Keras
- pydot (`pip install pydot==1.1.0`)
- TensorFlow

## Dataset
We created a lidar gesture image dataset with 4 classes (backward, forward, static, rotate); it will be released soon.

## Prepare
- `mkdir data` and put the data in this folder
- `mkdir logs`, which stores the log files read by TensorBoard (a minimal callback sketch follows this list)
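For orientation, here is a minimal, self-contained sketch of pointing Keras's `TensorBoard` callback at that `logs` folder; the tiny dense model and random data are placeholders, not the project's actual 3D-CNN:

```python
# Minimal sketch: wiring Keras's TensorBoard callback to the logs/ folder.
# The toy model and random data below are placeholders only.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import TensorBoard

model = Sequential([Dense(4, activation='softmax', input_shape=(8,))])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

x = np.random.rand(32, 8)
y = np.eye(4)[np.random.randint(0, 4, 32)]  # one-hot labels for 4 classes

# Event files land in logs/, so `tensorboard --logdir logs` can display them.
model.fit(x, y, epochs=2, validation_split=0.25,
          callbacks=[TensorBoard(log_dir='logs')])
```

## Train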
- `python net/train.py`; the trained network weights will be saved in the `model` folder, which already contains a model trained by us. The network architecture is stored in `network.json`, which can be loaded directly when testing new data (a hedged architecture sketch follows this list).
- After training stops, a PNG file showing the `val_acc` and `val_loss` curves will be written to `model`.
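As a rough illustration of the 3D-CNN approach, and of saving the architecture to `network.json` as described above, here is a sketch; the layer sizes and the (frames, height, width, channels) input shape are assumptions, not the configuration in `net/train.py`:

```python
# A hedged sketch of a small 3D-CNN for 4 gesture classes. All layer sizes
# and the (frames, height, width, channels) input shape are illustrative
# assumptions; see net/train.py for the real architecture.
from keras.models import Sequential
from keras.layers import Conv3D, MaxPooling3D, Flatten, Dense, Dropout

NUM_CLASSES = 4  # backward, forward, static, rotate

model = Sequential([
    # Treat a gesture as a short clip: e.g. 16 lidar frames of 32x32 images.
    Conv3D(16, (3, 3, 3), activation='relu', input_shape=(16, 32, 32, 1)),
    MaxPooling3D(pool_size=(2, 2, 2)),
    Conv3D(32, (3, 3, 3), activation='relu'),
    MaxPooling3D(pool_size=(2, 2, 2)),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),
    Dense(NUM_CLASSES, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Store the architecture as JSON, in the spirit of the network.json the
# README mentions, so test code can rebuild the model without train.py.
with open('network.json', 'w') as f:
    f.write(model.to_json())
```

Keeping the architecture (JSON) separate from the weights is what lets the test code rebuild the model without importing the training script.

## Test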
Please `from net.testor import Gesture_Testor` when you want to predict on new data without running the training process. We have already packed the test procedure into a class, so you can directly use
`label = Gesture_Testor().test(test_data)` to get the label for `test_data`. The `test_data` in `test_new.py` is just the first sample of the training dataset.
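If you prefer to bypass the wrapper class, the same flow looks roughly like this; `model/weights.h5` is an assumed filename, so check the `model` folder for the file training actually produced:

```python
# A hedged sketch of the load-and-predict flow outside Gesture_Testor.
# 'network.json' matches the README; the weights filename is an assumption.
import numpy as np
from keras.models import model_from_json

with open('network.json') as f:
    model = model_from_json(f.read())
model.load_weights('model/weights.h5')  # assumed filename

# Build one dummy sample matching the model's expected input shape;
# replace it with a real sample, e.g. the one used in test_new.py.
sample = np.zeros((1,) + tuple(model.input_shape[1:]))
probs = model.predict(sample)
label = int(np.argmax(probs, axis=-1)[0])
print('predicted class:', label)  # one of the 4 gesture classes
```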
## Contact
If you have any problems, feel free to contact me via an issue or email.