Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/hms-dbmi/vizarr
A minimal Zarr image viewer based on Viv.
- Host: GitHub
- URL: https://github.com/hms-dbmi/vizarr
- Owner: hms-dbmi
- License: mit
- Created: 2020-06-26T14:24:04.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2025-01-24T18:12:21.000Z (18 days ago)
- Last Synced: 2025-01-28T19:44:20.828Z (14 days ago)
- Topics: gehlenborglab, imjoy, jupyter-notebooks, viv, zarr
- Language: TypeScript
- Homepage: https://hms-dbmi.github.io/vizarr/?source=https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr
- Size: 31.1 MB
- Stars: 134
- Watchers: 9
- Forks: 18
- Open Issues: 30
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Citation: CITATION.cff
Awesome Lists containing this project
- awesome-scientific-image-analysis: vizarr (💻 Visualization)
README
View multiscale Zarr images online and in notebooks.

Standalone App · Python API · Open in Colab
## About
**Vizarr** is a minimal, purely client-side program for viewing zarr-based images.
- ⚡ **GPU accelerated rendering** with [Viv](https://github.com/hms-dbmi/viv)
- 💻 Purely **client-side** zarr access with [zarrita.js](https://github.com/manzt/zarrita.js)
- 🌐 A **standalone [web app](https://hms-dbmi.github.io/vizarr)** for viewing entirely in the browser.
- 🐍 An [anywidget](https://github.com/manzt/anywidget) **Python API** for programmatic control in notebooks.
- 📦 Supports any `zarr-python` [store](https://zarr.readthedocs.io/en/stable/api/storage.html) as a backend.
## Getting started
**Vizarr** provides two primary interfaces for interacting with the core viewer:
### 1. Standalone Web App
You can use the [standalone web app](https://hms-dbmi.github.io/vizarr) by passing the URL of a Zarr store as the `?source` query parameter.
For example, to view [this dataset](https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr) from the IDR, navigate to the following URL:
```
https://hms-dbmi.github.io/vizarr/?source=https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr
```
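The web app accepts the raw URL as shown above. If you generate viewer links programmatically, percent-encoding the source is a safe default. A minimal sketch using only the Python standard library (the dataset URL is the IDR example from above):

```python
# Sketch: building a Vizarr web-app link in Python. Standard library only;
# the dataset URL is the IDR example used elsewhere in this README.
from urllib.parse import urlencode

BASE = "https://hms-dbmi.github.io/vizarr/"
source = (
    "https://minio-dev.openmicroscopy.org/idr/v0.3/"
    "idr0062-blin-nuclearsegmentation/6001240.zarr"
)

viewer_url = f"{BASE}?{urlencode({'source': source})}"
print(viewer_url)
```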
### 2. Python API

The Python API is an [anywidget](https://github.com/manzt/anywidget), allowing programmatic control of the viewer in computational notebooks like Jupyter, JupyterLab, Colab, and VS Code. The easiest way to get started is to open a Zarr store and load it into the viewer.
```python
import vizarr
import zarr

store = zarr.open("./path/to/ome.zarr")
viewer = vizarr.Viewer()
viewer.add_image(store)
viewer
```

To learn more, see the [getting started](./python/notebooks/getting_started.ipynb) notebook.
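Because the viewer accepts any `zarr-python` store, the same pattern works for remote data. A hedged sketch, assuming `fsspec` and `aiohttp` are installed so that `zarr.open` can read over HTTP; the dataset is the IDR example used by the web app above:

```python
# Sketch: viewing a remote OME-Zarr instead of a local path. Assumes
# zarr-python with fsspec + aiohttp installed so zarr.open can read over
# HTTP; the URL is the IDR example dataset from this README.
import vizarr
import zarr

store = zarr.open(
    "https://minio-dev.openmicroscopy.org/idr/v0.3/"
    "idr0062-blin-nuclearsegmentation/6001240.zarr",
    mode="r",
)

viewer = vizarr.Viewer()
viewer.add_image(store)
viewer
```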
## Data types
**Vizarr** supports viewing 2D slices of n-dimensional Zarr arrays, allowing
users to choose a single channel or blended composites of multiple channels
during analysis. It has special support for the developing OME-NGFF format for
multiscale and multimodal images. Currently, Viv supports `int8`, `int16`,
`int32`, `uint8`, `uint16`, `uint32`, `float32`, and `float64` arrays, but
contributions are welcome to support more NumPy dtypes!
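For instance, an array with an unsupported dtype such as `int64` can be cast to a supported one before it is handed to the viewer. A small sketch (the array here is synthetic, purely for illustration):

```python
# Sketch: casting an unsupported dtype (int64) to one Viv can render
# (float32) before loading it into vizarr. The array is synthetic and
# stands in for real data.
import numpy as np
import vizarr
import zarr

data = np.arange(2 * 256 * 256, dtype="int64").reshape(2, 256, 256)
z = zarr.array(data.astype("float32"))  # float32 is on Viv's supported list

viewer = vizarr.Viewer()
viewer.add_image(z)
```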
## Limitations

`vizarr` was built to support a registration use case in which multiple pyramidal OME-Zarr images are viewed within a Jupyter notebook. Other Zarr arrays are supported but not as well tested. More information on viewing generic Zarr arrays can be found in the example notebooks.

## Citation
If you are using Vizarr in your research, please cite our paper:
> Trevor Manz, Ilan Gold, Nathan Heath Patterson, Chuck McCallum, Mark S Keller, Bruce W Herr II, Katy BΓΆrner, Jeffrey M Spraggins, Nils Gehlenborg,
> "[Viv: multiscale visualization of high-resolution multiplexed bioimaging data on the web](https://www.nature.com/articles/s41592-022-01482-7)."
> **Nature Methods** (2022), [doi:10.1038/s41592-022-01482-7](https://doi.org/10.1038/s41592-022-01482-7)