Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jonasrauber/eagerpy
PyTorch, TensorFlow, JAX and NumPy — all of them natively using the same code
- Host: GitHub
- URL: https://github.com/jonasrauber/eagerpy
- Owner: jonasrauber
- License: mit
- Created: 2019-10-23T14:21:50.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2023-03-25T01:30:23.000Z (over 1 year ago)
- Last Synced: 2024-10-02T05:01:32.131Z (about 1 month ago)
- Topics: eager-execution, jax, numpy, python, pytorch, tensorflow, tensorflow2
- Language: Python
- Homepage: https://eagerpy.jonasrauber.de
- Size: 365 KB
- Stars: 695
- Watchers: 20
- Forks: 40
- Open Issues: 21
- Metadata Files:
  - Readme: README.rst
  - Funding: .github/FUNDING.yml
  - License: LICENSE
README
.. image:: https://badge.fury.io/py/eagerpy.svg
   :target: https://badge.fury.io/py/eagerpy

.. image:: https://codecov.io/gh/jonasrauber/eagerpy/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/jonasrauber/eagerpy

.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
   :target: https://github.com/ambv/black

==================================================================================
EagerPy: Writing Code That Works Natively with PyTorch, TensorFlow, JAX, and NumPy
==================================================================================

`EagerPy <https://eagerpy.jonasrauber.de>`_ is a **Python framework** that lets you write code that automatically works natively with PyTorch, TensorFlow, JAX, and NumPy. EagerPy is **also great when you work with just one framework** but prefer a clean and consistent API that is fully chainable, provides extensive type annotations, and lets you write beautiful code.
🔥 Design goals
-----------------

- **Native Performance**: EagerPy operations get directly translated into the corresponding native operations.
- **Fully Chainable**: All functionality is available as methods on the tensor objects and as EagerPy functions.
- **Type Checking**: Catch bugs before running your code thanks to EagerPy's extensive type annotations (see the sketch below).
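As a quick illustration of the type-checking goal, here is a minimal sketch. It is not part of the original README and assumes ``ep.Tensor`` is used as the annotated tensor type; it only relies on methods shown in the example further below.

.. code-block:: python

   import numpy as np
   import eagerpy as ep

   def normalize(x: ep.Tensor) -> ep.Tensor:
       # every method in the chain returns an ep.Tensor, so a static type
       # checker such as mypy can follow the whole expression
       return x / x.square().sum().sqrt()

   x = ep.astensor(np.array([3.0, 4.0]))
   print(normalize(x).raw)  # array([0.6, 0.8])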
📖 Documentation
-----------------

Learn more about EagerPy in the `documentation <https://eagerpy.jonasrauber.de>`_.
🚀 Quickstart
--------------

.. code-block:: bash

   pip install eagerpy

EagerPy requires Python 3.6 or newer. Besides that, all essential dependencies are automatically installed. To use it with PyTorch, TensorFlow, JAX, or NumPy, the respective framework needs to be installed separately. These frameworks are not declared as dependencies because not everyone wants to use (and thus install) all of them, and because some of these packages have different builds for different architectures and CUDA versions.
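Because the backend frameworks are optional, a quick way to confirm the installation is to wrap whichever native tensor type happens to be available. This is a minimal sketch, not part of the original README; it only uses ``ep.astensor`` and ``.raw`` as shown in the example below.

.. code-block:: python

   import eagerpy as ep

   # Fall back to NumPy if PyTorch is not installed on this machine
   try:
       import torch
       native = torch.tensor([1.0, 2.0, 3.0])
   except ImportError:
       import numpy as np
       native = np.array([1.0, 2.0, 3.0])

   x = ep.astensor(native)   # wrap the native tensor
   print(type(x).__name__)   # e.g. PyTorchTensor or NumPyTensor
   print(x.sum().raw)        # unwrap back to a native value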
🎉 Example
-----------

.. code-block:: python

   # Use whichever framework you prefer; each of these creates a native tensor
   import torch
   x = torch.tensor([1., 2., 3., 4., 5., 6.])

   import tensorflow as tf
   x = tf.constant([1., 2., 3., 4., 5., 6.])

   import jax.numpy as np
   x = np.array([1., 2., 3., 4., 5., 6.])

   import numpy as np
   x = np.array([1., 2., 3., 4., 5., 6.])

   # No matter which framework you use, you can use the same code
   import eagerpy as ep

   # Just wrap a native tensor using EagerPy
   x = ep.astensor(x)

   # All of EagerPy's functionality is available as methods
   x = x.reshape((2, 3))
   x.flatten(start=1).square().sum(axis=-1).sqrt()
   # or just: x.flatten(1).norms.l2()

   # and as functions (yes, gradients are also supported!)
   loss, grad = ep.value_and_grad(loss_fn, x)
   ep.clip(x + eps * grad, 0, 1)

   # You can even write functions that work transparently with
   # PyTorch tensors, TensorFlow tensors, JAX arrays, NumPy arrays

   def my_universal_function(a, b, c):
       # Convert all inputs to EagerPy tensors
       a, b, c = ep.astensors(a, b, c)

       # Perform some computations
       result = (a + b * c).square()

       # ... and return a native tensor
       return result.raw
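In the example above, ``loss_fn`` and ``eps`` are placeholders. The following is a hedged, self-contained sketch (assuming PyTorch is installed) that fills them in with toy values so the gradient step actually runs; it only uses ``ep.value_and_grad``, ``ep.clip``, and ``.raw`` as documented above.

.. code-block:: python

   import torch
   import eagerpy as ep

   def loss_fn(x: ep.Tensor) -> ep.Tensor:
       # toy scalar loss: sum of squares
       return x.square().sum()

   eps = 0.1
   x = ep.astensor(torch.tensor([0.2, 0.4, 0.6]))

   # evaluate loss_fn and its gradient with respect to x
   loss, grad = ep.value_and_grad(loss_fn, x)

   # one gradient-ascent step, clipped to the valid range [0, 1]
   x_new = ep.clip(x + eps * grad, 0, 1)
   print(loss.raw, x_new.raw)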
🗺 Use cases
------------

Foolbox Native, the latest version of Foolbox, a popular adversarial attacks library, has been rewritten from scratch using EagerPy instead of NumPy to achieve native performance on models developed in PyTorch, TensorFlow, and JAX, all with one code base. EagerPy is also used by other frameworks to reduce code duplication (e.g. GUDHI) or to compare the performance of different frameworks.
📄 Citation
------------

If you use EagerPy, please cite our `paper <https://arxiv.org/abs/2008.04175>`_ using this BibTeX entry:

.. code-block::

   @article{rauber2020eagerpy,
     title={{EagerPy}: Writing Code That Works Natively with {PyTorch}, {TensorFlow}, {JAX}, and {NumPy}},
     author={Rauber, Jonas and Bethge, Matthias and Brendel, Wieland},
     journal={arXiv preprint arXiv:2008.04175},
     year={2020},
     url={https://eagerpy.jonasrauber.de},
   }

🐍 Compatibility
-----------------

We currently test with the following versions:
* PyTorch 1.4.0
* TensorFlow 2.1.0
* JAX 0.1.57
* NumPy 1.18.1