Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/um-arm-lab/visual_servoing
position based visual servoing library with PnP pose estimation
- Host: GitHub
- URL: https://github.com/um-arm-lab/visual_servoing
- Owner: UM-ARM-Lab
- Created: 2022-01-14T22:52:17.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2023-10-08T20:43:30.000Z (about 1 year ago)
- Last Synced: 2023-10-08T21:31:27.407Z (about 1 year ago)
- Language: Python
- Homepage:
- Size: 46.3 MB
- Stars: 5
- Watchers: 11
- Forks: 3
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Position Based Visual Servoing
This package contains code to do position based visual servoing in the PyBullet simulator.
The methods are explained in these two reports:
- https://drive.google.com/file/d/1N3Cuwtxr-NA0eG1iCCvZUsygbrnd9Mni/view
- https://drive.google.com/file/d/1H7afL3Rfg1lv1DpsQ0mXd_qmte1Gj2ud/view?usp=sharing

A demo of this in action can be seen here: https://drive.google.com/file/d/1BPksbPRiTzz8pHh8DmfJRQZ9ZBchlqJN/view?usp=sharing
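The core idea of position based visual servoing can be sketched as a simple proportional law on the pose error between the current and desired camera-relative poses. This is a minimal illustration in plain NumPy; the function names and gain are assumptions for illustration, not this repo's API:

```python
import numpy as np

def rotation_log(R):
    """Rotation vector (axis * angle) of a rotation matrix via the matrix log."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-8:
        return np.zeros(3)
    # Off-diagonal antisymmetric part encodes the rotation axis
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def pbvs_twist(R_cur, t_cur, R_des, t_des, lam=1.0):
    """Proportional PBVS law: velocity command driving the current pose
    toward the desired pose. Returns (v, omega)."""
    v = -lam * (t_cur - t_des)
    omega = -lam * rotation_log(R_cur @ R_des.T)
    return v, omega
```

In a closed loop, the pose would be re-estimated from the camera each step (e.g. via PnP on marker corners) and the resulting twist sent to the robot until the error converges.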
To try out PBVS on Val with ArUco markers, run `scripts/marker_pbvs_demo`. For ICP PBVS, run `scripts/evaluation.py`. Trajectories generated by `evaluation.py` can also be played back in rviz via the code in `playback.py`: first `roslaunch launch/rviz_victor.launch`, then run `playback.py` and select the result file generated by `evaluation.py`.
Note that for the code to work, the working directory must be the top level of this repository. VS Code configurations are included.
Dependencies:
- numpy
- PyBullet
- OpenCV + OpenCV extra modules
- rospy
- TensorFlow (working on removing this)

You will need to run: `rosdep install -y -r --from-paths . --ignore-src`
**Credits**
- Uses transformation functions from PyTorch3D in `utils.py`