Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/um-arm-lab/contact_shape_completion
Shape completion using contact measurements. Successor of PSSNet
- Host: GitHub
- URL: https://github.com/um-arm-lab/contact_shape_completion
- Owner: UM-ARM-Lab
- Created: 2020-11-10T15:18:22.000Z (about 4 years ago)
- Default Branch: main
- Last Pushed: 2021-10-24T23:05:24.000Z (about 3 years ago)
- Last Synced: 2023-08-24T16:14:16.330Z (about 1 year ago)
- Language: Python
- Size: 6.58 MB
- Stars: 0
- Watchers: 10
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# CLASP
Constrained LAtent Shape Projection (CLASP) combines a shape completion neural network with contact measurements from a robot. This repo also contains the [full paper](https://github.com/UM-ARM-Lab/contact_shape_completion/blob/main/CLASP_full.pdf) and [additional plots](https://github.com/UM-ARM-Lab/contact_shape_completion/blob/main/CLASP_additional_details.pdf).
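As a rough, runnable illustration of that idea (all names and the toy one-dimensional "decoder" below are invented for this sketch, and a simple rejection filter stands in for the paper's latent projection):

```python
import random

def decode(latent):
    # Toy stand-in for the shape-completion network's decoder: maps a
    # latent code to a set of occupied 1-D cells. The real network decodes
    # latent vectors into 3-D voxel grids.
    rng = random.Random(latent)
    return {i for i in range(10) if rng.random() > 0.5}

def consistent(shape, contact_cells, free_cells):
    # A completion agrees with the measurements if every contact point is
    # occupied and every cell the robot swept without contact is free.
    return contact_cells <= shape and not (free_cells & shape)

def filter_completions(latents, contact_cells, free_cells):
    # Keep only latent samples whose decoded shapes agree with all contact
    # measurements so far (rejection here; CLASP projects latents instead).
    return [z for z in latents if consistent(decode(z), contact_cells, free_cells)]
```

Each new contact measurement shrinks the set of plausible completions, which is the intuition behind conditioning the shape-completion network on touch.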
# Quick Start
1. Set up ROS
2. Clone this repo into your ROS workspace, rebuild (e.g. `catkin build`), and re-source your workspace
3. Install dependencies
4. Download datasets and pretrained models by running `shape_completion_training/scripts/download_pretrained.py`

### Data Analysis
Trial results are stored in `./evaluations`.
To recreate the plots from the paper using the pre-run trials, run `contact_completion_evaluation.py --plot`.

### Rerun shape completion experiments and visualize performance
To rerun the shape completion experiments using the robot motion and contacts recorded for the paper, run the following in separate terminals:
1. `roslaunch shape_completion_visualization live_shape_completion.launch` (Rviz will start)
2. `roslaunch shape_completion_visualization transforms.launch` (Sets transforms between robot and shape objects)
3. `contact_completion_evaluation.py --plot --regenerate` (You will see many shapes being generated and updated. This will take over an hour to complete.)

# Full Stack
The full experimental setup requires running a simulated or real robot, which moves and contacts objects.
To build the software stack used in the experiments, first set up the dependencies. Detailed instructions are in the [contact_shape_completion subfolder](https://github.com/UM-ARM-Lab/contact_shape_completion/tree/main/contact_shape_completion).
Then run:
1. `roslaunch shape_completion_visualization live_shape_completion.launch` (Rviz will start)
2. Launch the robot stack (see detailed instructions in subfolder)
3. `store_simulation_examples --trial [PRETRAINED_NETWORK_NAME] --scene [SCENE_NAME] --store`
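Steps 1 and 3 above could be driven from one hypothetical helper script like the sketch below (the wrapper itself is invented, not part of the repo; it assumes a sourced ROS workspace so that `roslaunch` and `store_simulation_examples` are on your PATH):

```python
import subprocess

def run_full_stack(trial, scene, dry_run=True):
    # Hypothetical convenience wrapper for the steps above. With
    # dry_run=True it only prints the commands it would start.
    commands = [
        ["roslaunch", "shape_completion_visualization", "live_shape_completion.launch"],
        # Step 2, launching the robot stack, depends on your setup; see the
        # detailed instructions in the subfolder.
        ["store_simulation_examples", "--trial", trial, "--scene", scene, "--store"],
    ]
    procs = []
    for cmd in commands:
        if dry_run:
            print("+", " ".join(cmd))
        else:
            procs.append(subprocess.Popen(cmd))
    return procs
```

Keeping the `Popen` handles lets a caller terminate the launched processes once a trial finishes.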