# Jim - A JAX-based gravitational-wave inference toolkit


Jim comprises a set of tools for estimating parameters of gravitational-wave sources through Bayesian inference.
At its core, Jim relies on the JAX-based sampler [flowMC](https://github.com/kazewong/flowMC),
which leverages normalizing flows to enhance the convergence of a gradient-based MCMC sampler.
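
To see why gradients matter here, below is a minimal, self-contained sketch of a single Metropolis-adjusted Langevin (MALA) step, the kind of gradient-based move that flowMC builds on. This is an illustration of the idea only, not flowMC's or Jim's actual interface; the function name and signature are hypothetical.

```
import jax
import jax.numpy as jnp

def mala_step(key, x, log_prob, step_size=1e-2):
    # JAX gives us the gradient of any differentiable log-density for free.
    grad_log_prob = jax.grad(log_prob)
    key_prop, key_acc = jax.random.split(key)

    # Langevin proposal: drift along the gradient plus Gaussian noise.
    noise = jax.random.normal(key_prop, x.shape)
    x_new = x + 0.5 * step_size * grad_log_prob(x) + jnp.sqrt(step_size) * noise

    # Log-density of the (asymmetric) proposal q(a | b).
    def log_q(a, b):
        diff = a - b - 0.5 * step_size * grad_log_prob(b)
        return -jnp.sum(diff**2) / (2.0 * step_size)

    # Metropolis-Hastings accept/reject keeps the chain exact.
    log_alpha = (log_prob(x_new) - log_prob(x)
                 + log_q(x, x_new) - log_q(x_new, x))
    accept = jnp.log(jax.random.uniform(key_acc)) < log_alpha
    return jnp.where(accept, x_new, x)
```

For example, with `log_prob = lambda x: -0.5 * jnp.sum(x**2)` (a standard Gaussian), repeatedly applying `mala_step` with fresh keys draws samples from that distribution.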

Since it is based on JAX, Jim can also leverage hardware acceleration to achieve significant speedups on GPUs. Jim also takes advantage of likelihood heterodyning ([Cornish 2010](https://arxiv.org/abs/1007.4820), [Cornish 2021](https://arxiv.org/abs/2109.02728)) to compute the gravitational-wave likelihood more efficiently.
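
The idea behind heterodyning (also known as relative binning) is that the ratio of a trial waveform to a fixed reference waveform is smooth in frequency, so the likelihood inner products can be precomputed on coarse bins and evaluated cheaply per trial. Below is a simplified sketch of that scheme under the usual linear-in-frequency approximation within each bin; it is illustrative only, not Jim's internal implementation, and all names are hypothetical.

```
import jax.numpy as jnp

def summary_data(freqs, data, h_ref, psd, df, bin_edges):
    # Assign each fine-grid frequency to a coarse bin.
    idx = jnp.clip(jnp.searchsorted(bin_edges, freqs, side="right") - 1,
                   0, len(bin_edges) - 2)
    fm = 0.5 * (bin_edges[:-1] + bin_edges[1:])       # bin centres
    w = 4.0 * data * jnp.conj(h_ref) / psd * df       # <d|h_ref> integrand
    v = 4.0 * jnp.abs(h_ref) ** 2 / psd * df          # <h_ref|h_ref> integrand
    nbins = len(bin_edges) - 1
    A0 = jnp.zeros(nbins, dtype=complex).at[idx].add(w)
    A1 = jnp.zeros(nbins, dtype=complex).at[idx].add(w * (freqs - fm[idx]))
    B0 = jnp.zeros(nbins).at[idx].add(v)
    B1 = jnp.zeros(nbins).at[idx].add(v * (freqs - fm[idx]))
    return A0, A1, B0, B1

def heterodyned_log_likelihood(h_coarse, h_ref_coarse, bin_edges, summary):
    # The ratio r = h / h_ref is smooth, so evaluating it only at the
    # (few) bin edges and linearising within each bin is enough.
    A0, A1, B0, B1 = summary
    r = h_coarse / h_ref_coarse
    r0 = 0.5 * (r[1:] + r[:-1])                             # value per bin
    r1 = (r[1:] - r[:-1]) / (bin_edges[1:] - bin_edges[:-1])  # slope per bin
    d_h = jnp.sum(A0 * jnp.conj(r0) + A1 * jnp.conj(r1)).real
    h_h = jnp.sum(B0 * jnp.abs(r0) ** 2
                  + 2.0 * B1 * (r0 * jnp.conj(r1)).real)
    # Log-likelihood up to a waveform-independent constant.
    return d_h - 0.5 * h_h
```

The expensive sums over the fine frequency grid happen once, in `summary_data`; each likelihood evaluation afterwards only touches the coarse bins.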

See the accompanying paper, [Wong, Isi, Edwards (2023)](https://github.com/kazewong/TurboPE/), for details.

> [!WARNING]
> Jim is under heavy development, so the API is constantly changing. Use at your own risk!
> One way to mitigate this inconvenience is to fork the repository and pin it to a specific version for now.
> We expect to reach a stable version this year. Stay tuned.

_[Documentation and examples are a work in progress]_

## Installation

You may install the latest released version of Jim through pip by doing
```
pip install jimGW
```
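
To check that the installation succeeded, try importing the package. Note that the import name differs from the PyPI name (assuming here that the package is imported as `jimgw`, matching the repository's source tree):
```
python -c "import jimgw"
```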

You may install the bleeding-edge version by cloning this repo, or by doing
```
pip install git+https://github.com/kazewong/jim
```

If you would like to take advantage of CUDA, you will additionally need to install a specific version of JAX by doing
```
pip install --upgrade "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
```
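
You can verify that JAX actually sees the GPU by listing the devices it detects; after a successful CUDA install, this should report GPU devices rather than only the CPU:
```
python -c "import jax; print(jax.devices())"
```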

_NOTE:_ Jim is currently only compatible with Python 3.10.

## Performance

The performance of Jim will vary depending on the hardware available. Under optimal conditions, the CUDA installation can achieve parameter estimation in ~1 min on an Nvidia A100 GPU for a binary neutron star (see the [paper](https://github.com/kazewong/TurboPE/) for details). If a GPU is not available, JAX will fall back to the CPU, and you will see a message like this on execution:

```
No GPU/TPU found, falling back to CPU.
```

## Directory

Parameter estimation examples are in `example/ParameterEstimation`.

## Attribution

Please cite the accompanying paper, [Wong, Isi, Edwards (2023)](https://github.com/kazewong/TurboPE/).