# Rethink Rotation

Official PyTorch implementation of "Rethinking Rotation Invariance with Point Cloud Registration" (AAAI 2023)

[[Paper]]() [[Supp.]]() [[Video]]() [[Project Page]](https://rotation3d.github.io/)

![img](docs/teaser.png)

## Requirements

To build the CUDA kernel for farthest point sampling (FPS):
```
pip install pointnet2_ops_lib/.
```
NOTE: If you encounter problems while building the kernel,
you can refer to [Pointnet2_PyTorch](https://github.com/erikwijmans/Pointnet2_PyTorch) for solutions.
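The `pointnet2_ops` kernel installed above runs FPS on the GPU. As a rough illustration of what it computes (not the repo's code), here is a minimal CPU-only NumPy sketch of farthest point sampling; like the CUDA op, it greedily picks the point farthest from the set chosen so far:

```python
import numpy as np

def farthest_point_sample(xyz, npoint):
    """Greedy farthest point sampling.

    xyz:    (N, 3) array of point coordinates
    npoint: number of points to sample
    Returns an (npoint,) array of indices into xyz.
    """
    n = xyz.shape[0]
    idx = np.zeros(npoint, dtype=np.int64)
    # dist[i] = squared distance from point i to the nearest selected point
    dist = np.full(n, np.inf)
    farthest = 0  # conventionally seeded with the first point
    for i in range(npoint):
        idx[i] = farthest
        d = np.sum((xyz - xyz[farthest]) ** 2, axis=1)
        dist = np.minimum(dist, d)
        farthest = int(np.argmax(dist))  # next pick: farthest from the selected set
    return idx
```

This reference version is O(N * npoint) and only meant to clarify the semantics; the CUDA build is the fast path the training scripts rely on.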

## Code

This repo contains the PyTorch implementation of the following modules:
- [x] ModelNet40 Classification under rotations
```
bash scripts/modelnet_cls.sh
```
- [x] ScanObjectNN Classification under rotations
```
bash scripts/scanobject_cls.sh
```
- [ ] ShapeNetPart Segmentation under rotations

## Performance

* State-of-the-art accuracy on ModelNet40 under rotation: 91.0% (z/z), 91.0% (z/SO(3)).
* State-of-the-art accuracy on ScanObjectNN OBJ_BG classification under rotation: 86.6% (z/z), 86.3% (z/SO(3)).
* State-of-the-art micro and macro mAP on ShapeNetCore55 under rotation: 0.715, 0.510.
* ShapeNetPart segmentation under rotation: 80.3% (z/z), 80.4% (z/SO(3)).
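In the results above, the x/y notation denotes the train/test rotation settings: "z" means random rotations about the vertical (z) axis only, while "SO(3)" means arbitrary 3D rotations, so z/SO(3) trains with z-axis rotations but tests under full 3D rotation. As a hedged sketch (not the repo's code), the two rotation samplers used in this evaluation protocol are commonly written as:

```python
import numpy as np

def random_z_rotation(rng):
    """Random rotation about the z axis (the 'z' setting)."""
    t = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def random_so3_rotation(rng):
    """Uniform random rotation over SO(3) (the 'SO(3)' setting).

    Uses the standard QR-of-a-Gaussian trick: orthogonalize a random
    Gaussian matrix, fix column signs so the result is Haar-uniform,
    then flip one column if needed to get det = +1 (a proper rotation).
    """
    m = rng.normal(size=(3, 3))
    q, r = np.linalg.qr(m)
    q = q * np.sign(np.diag(r))  # sign fix for uniformity
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1.0  # reflect one axis to land in SO(3)
    return q
```

A test-time point cloud `pts` of shape (N, 3) is then rotated as `pts @ R.T` before being fed to the model.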

## Citation

If you find this repo useful in your work or research, please cite:

## Acknowledgement

Our code borrows heavily from:
- [DGCNN](https://github.com/WangYueFt/dgcnn)
- [DGCNN.pytorch](https://github.com/AnTao97/dgcnn.pytorch)
- [PointContrast](https://github.com/facebookresearch/PointContrast)