# SURFMNet-pytorch
A PyTorch implementation of "Unsupervised Deep Learning for Structured Shape Matching" [[link](http://openaccess.thecvf.com/content_ICCV_2019/papers/Roufosse_Unsupervised_Deep_Learning_for_Structured_Shape_Matching_ICCV_2019_paper.pdf)]

## Installation
This implementation requires Python >= 3.7. Use pip to install the dependencies:
```bash
pip3 install -r requirements.txt
```
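
Optionally, you can install inside a virtual environment. Below is a minimal sketch using Python's built-in `venv`; this is a convenience, not something the repository requires:
```bash
# Create and activate an isolated environment, then install the dependencies.
python3 -m venv .venv
source .venv/bin/activate
pip3 install -r requirements.txt
```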

## Download data & preprocessing
Download the desired dataset and put it in the `data` folder. Multiple datasets are available [here](https://github.com/pvnieo/datasets-zoo).

An example with the faust-remeshed dataset is provided.
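
One way to put a dataset in place is sketched below; the `faust-remeshed` folder name is an assumption, and the actual layout inside `datasets-zoo` may differ:
```bash
# Clone the dataset collection and copy the desired dataset into data/.
# "faust-remeshed" is an assumed folder name; adjust it to the actual layout.
git clone https://github.com/pvnieo/datasets-zoo
mkdir -p data
cp -r datasets-zoo/faust-remeshed data/
```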

Build the SHOT descriptor calculator:
```bash
cd fmnet/utils/shot
cmake .
make
```
If you encounter errors while compiling SHOT, please see [here](https://github.com/pvnieo/3d-utils/tree/master/shot).

Use `fmnet/preprocess.py` to compute the Laplacian eigendecomposition, the geodesic distances (using Dijkstra's algorithm), and the SHOT descriptors of the input shapes. The processed data are saved in `.mat` format:
```bash
usage: preprocess.py [-h] [-d DATAROOT] [-sd SAVE_DIR] [-ne NUM_EIGEN] [-nj NJOBS] [--nn NN] [--geo]

Preprocess data for FMNet training. Compute Laplacian eigen decomposition, shot features, and geodesic distance for each shape.

optional arguments:
  -h, --help            show this help message and exit
  -d DATAROOT, --dataroot DATAROOT
                        root directory of the dataset
  -sd SAVE_DIR, --save-dir SAVE_DIR
                        root directory to save the processed dataset
  -ne NUM_EIGEN, --num-eigen NUM_EIGEN
                        number of eigenvectors kept.
  -nj NJOBS, --njobs NJOBS
                        Number of parallel processes to use.
  --nn NN               Number of neighbors to consider when computing the geodesic matrix.
  --geo                 Compute geodesic distances.
```
**NB**: if the shapes have many vertices, computing the geodesic distances consumes a lot of memory and time.
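
For instance, a hypothetical preprocessing run on the faust-remeshed example could look like this (the paths and values are illustrative, not repository defaults):
```bash
# 120 eigenvectors, 8 parallel workers, geodesic distances enabled
# (memory- and time-intensive for large meshes, see the note above).
python3 fmnet/preprocess.py -d data/faust-remeshed -sd data/processed -ne 120 -nj 8 --geo
```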

## Usage
Use the `train.py` script to train the SURFMNet network.
```bash
usage: train.py [-h] [--lr LR] [--b1 B1] [--b2 B2] [-bs BATCH_SIZE] [--n-epochs N_EPOCHS] [--dim-basis DIM_BASIS] [-nv N_VERTICES] [-nb NUM_BLOCKS] [--wb WB] [--wo WO] [--wl WL] [--wd WD]
[--sub-wd SUB_WD] [-d DATAROOT] [--save-dir SAVE_DIR] [--n-cpu N_CPU] [--no-cuda] [--checkpoint-interval CHECKPOINT_INTERVAL] [--log-interval LOG_INTERVAL]

Launch the training of the SURFMNet model.

optional arguments:
  -h, --help            show this help message and exit
  --lr LR               adam: learning rate
  --b1 B1               adam: decay of first order momentum of gradient
  --b2 B2               adam: decay of second order momentum of gradient
  -bs BATCH_SIZE, --batch-size BATCH_SIZE
                        size of the batches
  --n-epochs N_EPOCHS   number of epochs of training
  --dim-basis DIM_BASIS
                        number of eigenvectors used for representation.
  -nv N_VERTICES, --n-vertices N_VERTICES
                        Number of vertices used per shape
  -nb NUM_BLOCKS, --num-blocks NUM_BLOCKS
                        number of resnet blocks
  --wb WB               Bijectivity penalty weight
  --wo WO               Orthogonality penalty weight
  --wl WL               Laplacian commutativity penalty weight
  --wd WD               Descriptor preservation via commutativity penalty weight
  --sub-wd SUB_WD       Percentage of subsampled vertices used to compute descriptor preservation commutativity penalty
  -d DATAROOT, --dataroot DATAROOT
                        root directory of the dataset
  --save-dir SAVE_DIR   root directory to save model checkpoints
  --n-cpu N_CPU         number of cpu threads to use during batch generation
  --no-cuda             Disable GPU computation
  --checkpoint-interval CHECKPOINT_INTERVAL
                        interval between model checkpoints
  --log-interval LOG_INTERVAL
                        interval between logging train information
```

### Example
```bash
python3 train.py -bs 4 --n-epochs 20
```