https://github.com/BlackHC/tfpyth
Putting TensorFlow back in PyTorch, back in TensorFlow (differentiable TensorFlow PyTorch adapters).
machine-learning pytorch tensorflow
- Host: GitHub
- URL: https://github.com/BlackHC/tfpyth
- Owner: BlackHC
- License: MIT
- Created: 2019-07-05T15:08:50.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2020-11-30T19:46:39.000Z (over 5 years ago)
- Last Synced: 2025-03-13T02:09:23.261Z (about 1 year ago)
- Topics: machine-learning, pytorch, tensorflow
- Language: Python
- Homepage:
- Size: 14.6 KB
- Stars: 644
- Watchers: 24
- Forks: 97
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-computer-vision-resources
README
# TfPyTh
[Build Status](https://travis-ci.com/BlackHC/tfpyth) [codecov](https://codecov.io/gh/BlackHC/tfpyth)
> Putting TensorFlow back in PyTorch, back in TensorFlow (with differentiable TensorFlow PyTorch adapters).
Do you have a codebase that uses TensorFlow and one that uses PyTorch and want to train a model that uses both end-to-end?
This library makes it possible without having to rewrite either codebase!
It allows you to wrap a TensorFlow graph to make it callable (and differentiable) through PyTorch, and vice-versa, using simple functions.
The only caveat is that tensors have to be copied and routed through the CPU until TensorFlow supports `__cuda_array_interface__` (please star the [GitHub issue](https://github.com/tensorflow/tensorflow/issues/29039)).
## Install
```
pip install tfpyth
```
### Example
```python
import tensorflow as tf
import torch as th
import numpy as np
import tfpyth
session = tf.Session()
def get_torch_function():
    # Build a small TensorFlow graph: c = 3a + 4b^2.
    a = tf.placeholder(tf.float32, name='a')
    b = tf.placeholder(tf.float32, name='b')
    c = 3 * a + 4 * b * b

    # Wrap the graph as a PyTorch-callable, differentiable function.
    f = tfpyth.torch_from_tensorflow(session, [a, b], c).apply
    return f
f = get_torch_function()
a = th.tensor(1, dtype=th.float32, requires_grad=True)
b = th.tensor(3, dtype=th.float32, requires_grad=True)
x = f(a, b)
assert x == 39.
x.backward()
assert np.allclose((a.grad, b.grad), (3., 24.))
```
## What it's got
### `torch_from_tensorflow`
Creates a differentiable PyTorch function that evaluates a TensorFlow output tensor given its input placeholders.
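Conceptually, such an adapter behaves like a custom autograd function: the forward pass evaluates the wrapped graph on the input values, and the backward pass asks the other framework for gradients with respect to its inputs, scaled by the incoming gradient. A framework-free sketch of that round trip (the class and its methods are illustrative, not part of tfpyth's API), using the same toy graph as the example above:

```python
class ExternalFunction:
    """Toy stand-in for a wrapped external graph: f(a, b) = 3a + 4b^2."""

    def forward(self, a, b):
        # Cache the inputs so the backward pass can evaluate gradients at them.
        self.a, self.b = a, b
        return 3 * a + 4 * b * b

    def backward(self, grad_output):
        # Analytic gradients of the external graph w.r.t. each input,
        # scaled by the incoming gradient (chain rule), analogous to
        # what tf.gradients would return for the placeholders.
        return 3 * grad_output, 8 * self.b * grad_output


f = ExternalFunction()
y = f.forward(1.0, 3.0)    # 3*1 + 4*3^2 = 39.0
da, db = f.backward(1.0)   # df/da = 3.0, df/db = 8*3 = 24.0
```

The gradients match those in the PyTorch example above, which is exactly the contract the adapter has to uphold on the backward pass.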
### `eager_tensorflow_from_torch`
Creates an eager TensorFlow function from a PyTorch function.
### `tensorflow_from_torch`
Creates a TensorFlow op/tensor from a PyTorch function.
## Future work
- [ ] support JAX
- [ ] support higher-order derivatives