Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
FCNR: Fast Compressive Neural Representation of Visualization Images
- Host: GitHub
- URL: https://github.com/mikelovesolivia/fcnr
- Owner: mikelovesolivia
- License: MIT
- Created: 2024-07-01T07:38:07.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-07-24T03:20:03.000Z (6 months ago)
- Last Synced: 2024-10-17T00:58:17.789Z (3 months ago)
- Language: Python
- Homepage:
- Size: 2.15 MB
- Stars: 5
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# FCNR: Fast Compressive Neural Representation of Visualization Images
### [Paper](http://arxiv.org/abs/2407.16369)
Yunfei Lu, Pengfei Gu, Chaoli Wang

This is the official PyTorch implementation for the paper "FCNR: Fast Compressive Neural Representation of Visualization Images".
![image](./figures/overview.png "Overview")
## Get Started
Set up a conda environment with Python 3.9 and install all dependencies:
```
pip install -r requirements.txt
```
## Data
You can generate customized visualization images with different viewpoints and timesteps from your own dataset via volume or isosurface rendering. Here is a link to download the vortex dataset (direct volume rendering images included) we use: vortex.
## Training & Inference
Specify ``, `` and `` to start training and inference:
```
python train.py --config ./configs/
```
An example of the configuration file we use is `./configs/cfg.json`. You can follow it to adapt the code to your customized dataset.
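The real schema of `./configs/cfg.json` is defined by this repository; as a hedged sketch of the general pattern (the field names `epochs`, `lr`, and `batch_size` below are hypothetical illustrations, not taken from the repo), a JSON config can be loaded and merged with defaults like this:

```python
import json

# Hypothetical default values -- consult ./configs/cfg.json for the real schema.
DEFAULTS = {"epochs": 100, "lr": 1e-4, "batch_size": 16}

def load_config(path, defaults=DEFAULTS):
    """Load a JSON config file and fill in any missing keys from defaults."""
    with open(path) as f:
        cfg = json.load(f)
    # Keys present in the file override the defaults.
    return {**defaults, **cfg}
```

A training script would then read the merged dictionary (e.g. `cfg = load_config("./configs/cfg.json")`) and pass its values to the model and data loader.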
## Results
Here is a comparison between the results of FCNR and existing baselines:
## Citation
```
@inproceedings{lu2024fcnr,
  title={{FCNR}: Fast Compressive Neural Representation of Visualization Images},
  author={Lu, Yunfei and Gu, Pengfei and Wang, Chaoli},
  booktitle={Proceedings of IEEE VIS Conference (Short Papers)},
  year={2024},
  note={Accepted}
}
```