Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/google/jaxopt

Hardware accelerated, batchable and differentiable optimizers in JAX.
- Host: GitHub
- URL: https://github.com/google/jaxopt
- Owner: google
- License: apache-2.0
- Created: 2021-07-12T17:16:53.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2024-09-17T17:09:51.000Z (3 months ago)
- Last Synced: 2024-09-18T01:56:26.066Z (3 months ago)
- Topics: bi-level, deep-learning, differentiable-programming, jax, optimization
- Language: Python
- Homepage: https://jaxopt.github.io
- Size: 3.29 MB
- Stars: 919
- Watchers: 17
- Forks: 64
- Open Issues: 137
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING.md
  - License: LICENSE
Awesome Lists containing this project
- awesome-soms - JAXopt - deterministic second-order methods (e.g., Gauss-Newton, Levenberg-Marquardt) and stochastic first-order methods (PolyakSGD, ArmijoSGD) (Implementation in JAX / Other)
- awesome-jax - JAXopt - Hardware accelerated (GPU/TPU), batchable and differentiable optimizers in JAX. (Libraries / New Libraries)
README
# JAXopt
[**Installation**](#installation)
| [**Documentation**](https://jaxopt.github.io)
| [**Examples**](https://github.com/google/jaxopt/tree/main/examples)
| [**Cite us**](#citeus)

## ⚠️ We are in the process of merging JAXopt into [Optax](https://github.com/google-deepmind/optax). Because of this, JAXopt is now in maintenance mode and we will not be implementing new features ⚠️
Hardware accelerated, batchable and differentiable optimizers in
[JAX](https://github.com/google/jax).

- **Hardware accelerated:** our implementations run on GPU and TPU, in addition
to CPU.
- **Batchable:** multiple instances of the same optimization problem can be
automatically vectorized using JAX's vmap.
- **Differentiable:** optimization problem solutions can be differentiated with
respect to their inputs, either implicitly or via autodiff of unrolled
algorithm iterations (see the sketch below).
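
For illustration, here is a minimal sketch of the batching and differentiation features, assuming JAXopt is installed; the toy quadratic objective and all variable names are illustrative, not part of the library API:

```python
import jax
import jax.numpy as jnp
from jaxopt import GradientDescent

# Toy objective: f(x; b) = 0.5 * ||x - b||^2, minimized at x = b.
def objective(x, b):
    return 0.5 * jnp.sum((x - b) ** 2)

solver = GradientDescent(fun=objective, maxiter=100)

def solve(b):
    # run(init_params, ...) returns an OptStep with .params and .state.
    return solver.run(jnp.zeros_like(b), b=b).params

# Batchable: vmap solves a whole batch of problem instances at once.
batch_b = jnp.arange(6.0).reshape(3, 2)
batch_solutions = jax.vmap(solve)(batch_b)  # shape (3, 2)

# Differentiable: gradient of the solution w.r.t. the problem data,
# computed via implicit differentiation (the solver's default).
grads = jax.grad(lambda b: solve(b).sum())(jnp.array([1.0, 2.0]))
```

In this sketch, `jax.vmap` handles the batching and `jax.grad` treats the solver's solution as a differentiable function of the problem data `b`.
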
## Installation

To install the latest release of JAXopt, use the following command:

```bash
$ pip install jaxopt
```

To install the **development** version, use the following command instead:
```bash
$ pip install git+https://github.com/google/jaxopt
```

Alternatively, it can be installed from source with the following command:
```bash
$ python setup.py install
```

## <a name="citeus"></a>Cite us

Our implicit differentiation framework is described in this
[paper](https://arxiv.org/abs/2105.15183). To cite it:

```bibtex
@article{jaxopt_implicit_diff,
  title={Efficient and Modular Implicit Differentiation},
  author={Blondel, Mathieu and Berthet, Quentin and Cuturi, Marco and Frostig, Roy
          and Hoyer, Stephan and Llinares-L{\'o}pez, Felipe and Pedregosa, Fabian
          and Vert, Jean-Philippe},
  journal={arXiv preprint arXiv:2105.15183},
  year={2021}
}
```
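
As a complement to the paper above, here is a minimal sketch of differentiating a solver's solution with respect to a hyperparameter via implicit differentiation; the ridge-regression setup, data, and variable names are illustrative only:

```python
import jax
import jax.numpy as jnp
from jaxopt import GradientDescent

# Ridge regression: the solution depends smoothly on the regularization
# strength l2reg, so the solver output can be differentiated w.r.t. it.
def ridge_objective(params, l2reg, X, y):
    residuals = X @ params - y
    return jnp.mean(residuals ** 2) + 0.5 * l2reg * jnp.sum(params ** 2)

X = jnp.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = jnp.array([1.0, 2.0, 3.0])
init_params = jnp.zeros(2)

solver = GradientDescent(fun=ridge_objective, maxiter=500, implicit_diff=True)

def solution_sqnorm(l2reg):
    params = solver.run(init_params, l2reg=l2reg, X=X, y=y).params
    return jnp.sum(params ** 2)

# Gradient of a function of the solution w.r.t. the hyperparameter,
# obtained through implicit differentiation rather than unrolling.
hypergrad = jax.grad(solution_sqnorm)(0.1)
```
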
## Disclaimer

JAXopt is an open source project maintained by a dedicated team in Google Research, but it is not an official Google product.