# CLASP

Constrained LAtent Shape Projection (CLASP) combines a shape completion neural network with contact measurements from a robot. This repo also contains the [full paper](https://github.com/UM-ARM-Lab/contact_shape_completion/blob/main/CLASP_full.pdf) and [additional plots](https://github.com/UM-ARM-Lab/contact_shape_completion/blob/main/CLASP_additional_details.pdf).
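The core idea, projecting a latent shape code so the decoded shape agrees with contact and free-space measurements, can be illustrated with a toy sketch. This is not the CLASP implementation: the "decoder" here is a hypothetical signed-distance function for a sphere of radius `z`, and `project_latent` is a simple penalty-driven update standing in for the paper's projection step.

```python
import math

# Toy "decoder": maps a 1-D latent code z to a sphere of radius z centred
# at the origin, returned as a signed-distance function (negative inside).
# A stand-in for the shape-completion network's decoder.
def decode(z):
    return lambda p: math.sqrt(sum(c * c for c in p)) - z

def project_latent(z, contact_points, free_points, step=0.01, iters=500):
    """Nudge z until the decoded shape is consistent with measurements:
    contact points should lie on or inside the surface, while observed
    free-space points must stay outside it."""
    for _ in range(iters):
        grad = 0.0
        for p in contact_points:   # penalty if surface misses a contact
            grad += max(0.0, decode(z)(p))
        for p in free_points:      # penalty if surface covers free space
            grad -= max(0.0, -decode(z)(p))
        if grad == 0.0:
            break
        z += step * grad
    return z

# A contact at distance 1 pulls the radius toward 1; the free-space
# point at distance 2 keeps the shape from growing past it.
z = project_latent(0.5, contact_points=[(1.0, 0.0, 0.0)],
                   free_points=[(2.0, 0.0, 0.0)])
```

In the actual system the latent code is high-dimensional and the decoder is a neural network, but the constraint structure is the same: contacts and observed free space restrict which completions are admissible.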

# Quick Start
1. Set up ROS
2. Clone this repo into your ROS workspace, rebuild (e.g. `catkin build`), and re-source
3. Install dependencies
4. Download datasets and pretrained models by running `shape_completion_training/scripts/download_pretrained.py`

### Data Analysis
Trial results are in `./evaluations`.
To recreate the plots from the paper using the pre-run trials, run `contact_completion_evaluation.py --plot`.

### Rerun shape completion experiments and visualize performance
To rerun shape completion experiments using the robot motion and contacts as recorded from the paper, in separate terminals run:
1. `roslaunch shape_completion_visualization live_shape_completion.launch` (Rviz will start)
2. `roslaunch shape_completion_visualization transforms.launch` (Sets transforms between robot and shape objects)
3. `contact_completion_evaluation.py --plot --regenerate` (You will see many shapes being generated and updated. This will take over an hour to complete)
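The three terminals above can also be driven from one script. A minimal sketch, assuming the commands exactly as listed (the helper below only assembles the argument lists; in practice you would hand each one to `subprocess.Popen` once ROS is sourced):

```python
import shlex

def evaluation_commands(regenerate=True):
    """Build the three commands for rerunning the shape completion
    experiments, in the order they should be started."""
    cmds = [
        "roslaunch shape_completion_visualization live_shape_completion.launch",
        "roslaunch shape_completion_visualization transforms.launch",
        "contact_completion_evaluation.py --plot"
        + (" --regenerate" if regenerate else ""),
    ]
    # shlex.split turns each command string into an argv list suitable
    # for subprocess.Popen without invoking a shell.
    return [shlex.split(c) for c in cmds]
```

Launching the two `roslaunch` commands first matters: the evaluation script expects Rviz and the transform publishers to be up before it starts generating shapes.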

# Full Stack
The full experimental setup requires running a simulated or real robot, which moves and contacts objects.
To build the software stack used in the experiments, first set up the dependencies.

Detailed instructions are in the [contact_shape_completion subfolder](https://github.com/UM-ARM-Lab/contact_shape_completion/tree/main/contact_shape_completion).

Then run
1. `roslaunch shape_completion_visualization live_shape_completion.launch` (Rviz will start)
2. Launch the robot stack (see detailed instructions in subfolder)
3. `store_simulation_examples --trial [PRETRAINED_NETWORK_NAME] --scene [SCENE_NAME] --store`
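As a small sketch of the final step, the storing command can be assembled programmatically. The trial and scene names are left as parameters, mirroring the `[PRETRAINED_NETWORK_NAME]` and `[SCENE_NAME]` placeholders above; see the repo for valid values.

```python
import shlex

def store_examples_command(trial, scene):
    """Build the argv list for storing simulation examples for a given
    pretrained network (trial) and scene."""
    return shlex.split(
        f"store_simulation_examples --trial {trial} --scene {scene} --store"
    )
```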