Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Depth Completion technique agnostic to input depth pattern sparsity, WACV23
- Host: GitHub
- URL: https://github.com/andreaconti/sparsity-agnostic-depth-completion
- Owner: andreaconti
- Created: 2022-12-27T14:09:58.000Z (almost 2 years ago)
- Default Branch: master
- Last Pushed: 2023-11-23T10:26:46.000Z (12 months ago)
- Last Synced: 2024-10-24T15:56:42.022Z (27 days ago)
- Topics: computer-vision, deep-learning, depth-completion, depth-estimation, depth-map
- Language: Python
- Homepage: https://andreaconti.github.io/projects/sparsity_agnostic_depth_completion/
- Size: 5.4 MB
- Stars: 27
- Watchers: 1
- Forks: 1
- Open Issues: 2
Metadata Files:
- Readme: README.md
# Sparsity Agnostic Depth Completion
This repository provides the evaluation code for our WACV 2023 [paper](https://openaccess.thecvf.com/content/WACV2023/html/Conti_Sparsity_Agnostic_Depth_Completion_WACV_2023_paper.html).
We present a novel depth completion approach agnostic to the sparsity of depth points, which is very likely to vary across practical applications. State-of-the-art approaches yield accurate results only when processing a specific density and distribution of input points, i.e. the one observed during training, which narrows their deployment in real use cases. In contrast, our solution is robust to uneven distributions and to extremely low densities never witnessed during training. Experimental results on standard indoor and outdoor benchmarks highlight the robustness of our framework: it achieves accuracy comparable to state-of-the-art methods when tested with the same density and distribution as at training time, while being far more accurate in the other cases.
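To make "sparsity of depth points" concrete, the sketch below samples sparse depth hints from a dense depth map at several densities, mimicking how the number of input points can vary at test time. This is an illustrative NumPy snippet, not code from this repository; `sample_hints` and the density values are hypothetical.

```python
import numpy as np

def sample_hints(depth, n_hints, rng=None):
    """Sample n_hints random sparse depth points from a dense depth map.

    Returns a map of the same shape that is zero everywhere except at the
    sampled locations, a common encoding for sparse depth hints.
    """
    rng = np.random.default_rng(rng)
    valid = np.flatnonzero(depth > 0)  # sample only where depth is valid
    idx = rng.choice(valid, size=min(n_hints, valid.size), replace=False)
    hints = np.zeros_like(depth)
    hints.flat[idx] = depth.flat[idx]
    return hints

dense = np.random.default_rng(0).uniform(0.5, 10.0, size=(240, 320))
for n in (500, 100, 5):  # densities a model may never have seen at training time
    hints = sample_hints(dense, n, rng=0)
    print(n, int((hints > 0).sum()))
```

A sparsity-agnostic model is expected to behave gracefully across all of these hint maps, whereas a model trained at a fixed density typically degrades on the others.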
## Citation
```
@InProceedings{Conti_2023_WACV,
author = {Conti, Andrea and Poggi, Matteo and Mattoccia, Stefano},
title = {Sparsity Agnostic Depth Completion},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2023},
pages = {5871-5880}
}
```

## Qualitative Results
To better visualize the performance of our proposal, we provide a simple [streamlit](https://streamlit.io) application, which can be run as follows:
```bash
$ git clone https://github.com/andreaconti/sparsity-agnostic-depth-completion
$ cd sparsity-agnostic-depth-completion
$ mamba env create -f environment.yml
$ mamba activate sparsity-agnostic-depth-completion
$ streamlit run visualize.py
```

![](https://github.com/andreaconti/sparsity-agnostic-depth-completion/blob/master/readme_assets/visualize-demo.gif)
Changing the dataset or the hints density may take a while to display, since the app has to download and unpack the data.
## Quantitative Results
We provide precomputed depth maps for [KITTI Depth Completion](https://github.com/andreaconti/sparsity-agnostic-depth-completion/releases/download/v0.1.0/kitti-official.tar) and [NYU Depth V2](https://github.com/andreaconti/sparsity-agnostic-depth-completion/releases/download/v0.1.0/nyu-depth-v2-ma-downsampled.tar), with different sparsity patterns.
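KITTI depth completion maps are conventionally stored as 16-bit PNGs where metric depth equals the pixel value divided by 256 and a value of 0 marks invalid pixels. Assuming the precomputed maps follow this convention (an assumption; check the released archives), decoding looks like:

```python
import numpy as np

def decode_kitti_depth(png_u16):
    """Convert a KITTI-style 16-bit depth image array to metric depth.

    Per the KITTI depth completion convention, depth_m = value / 256;
    a raw value of 0 marks pixels without ground truth.
    """
    depth = png_u16.astype(np.float32) / 256.0
    depth[png_u16 == 0] = np.nan  # flag invalid pixels
    return depth

raw = np.array([[0, 256, 5120]], dtype=np.uint16)
# depth in metres: nan (invalid), 1.0, 20.0
print(decode_kitti_depth(raw))
```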
Moreover, we provide a simple evaluation script to compute the metrics:
```bash
$ git clone https://github.com/andreaconti/sparsity-agnostic-depth-completion
$ cd sparsity-agnostic-depth-completion
$ mamba env create -f environment.yml
$ mamba activate sparsity-agnostic-depth-completion
$ python evaluate.py
```

For instance:
```bash
# KITTI evaluation
$ python evaluate.py kitti-official lines64
# NYU Depth V2 evaluation
$ python evaluate.py nyu-depth-v2-ma-downsampled 500
```
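Assuming `evaluate.py` reports standard depth completion metrics such as MAE and RMSE over valid ground-truth pixels (an assumption; the script's exact metric set may differ), the core computation can be sketched as:

```python
import numpy as np

def depth_metrics(pred, gt):
    """MAE and RMSE over pixels with valid ground truth (gt > 0)."""
    mask = gt > 0                 # 0 marks missing ground truth
    err = pred[mask] - gt[mask]
    return {
        "mae": float(np.abs(err).mean()),
        "rmse": float(np.sqrt((err ** 2).mean())),
    }

gt = np.array([[1.0, 2.0, 0.0]])    # last pixel has no ground truth
pred = np.array([[1.5, 2.0, 9.9]])  # its error is therefore ignored
print(depth_metrics(pred, gt))      # mae = 0.25; rmse = sqrt(0.125)
```

Masking out invalid pixels before averaging is essential: on KITTI, ground truth is itself sparse, so unmasked means would be dominated by meaningless zeros.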