Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/cgarciae/tensorbuilder
TensorBuilder is a TensorFlow library that enables you to easily create complex deep neural networks by leveraging the Phi DSL to help define their structure.
- Host: GitHub
- URL: https://github.com/cgarciae/tensorbuilder
- Owner: cgarciae
- License: MIT
- Created: 2016-05-31T20:07:45.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2017-01-27T21:54:09.000Z (almost 8 years ago)
- Last Synced: 2024-09-30T12:42:45.722Z (3 months ago)
- Language: Python
- Homepage: https://cgarciae.gitbooks.io/tensorbuilder/content/
- Size: 2.33 MB
- Stars: 91
- Watchers: 11
- Forks: 11
- Open Issues: 2
- Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
README
# Tensor Builder
TensorBuilder had a major refactoring and is now based on [Phi](https://github.com/cgarciae/phi). Updates to the README coming soon!
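For context, Phi is the function-composition DSL that TensorBuilder now builds on. A minimal sketch of its style on plain Python values, based on Phi's own README:

```python
from phi import P

# Pipe a value through a chain of operations: (1 + 1) * 2 == 4.
assert 4 == P.Pipe(
    1,
    P + 1,  # P builds lambda expressions: this one adds 1
    P * 2   # ...and this one multiplies by 2
)
```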
### Goals
Coming Soon!

## Installation
TensorBuilder assumes you have a working `tensorflow` installation. We don't include it in the `requirements.txt` since the installation of TensorFlow varies depending on your setup.

#### From PyPI
```
pip install tensorbuilder
```
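A quick sanity check that the install worked (TensorFlow must already be present, as noted above):

```python
# Both imports should succeed without error after installation.
import tensorflow as tf
import tensorbuilder
```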
#### From GitHub
For the latest development version:
```
pip install git+https://github.com/cgarciae/tensorbuilder.git@develop
```

## Getting Started
Create a neural network with a [5, 10, 3] architecture, with a `softmax` output layer and a `tanh` hidden layer, through a Builder, and then get back its tensor:
```python
import tensorflow as tf
from tensorbuilder import T

x = tf.placeholder(tf.float32, shape=[None, 5])
keep_prob = tf.placeholder(tf.float32)

h = T.Pipe(
    x,
    T.tanh_layer(10)      # tanh(x * w + b)
    .dropout(keep_prob)   # dropout(x, keep_prob)
    .softmax_layer(3)     # softmax(x * w + b)
)
```
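Continuing the snippet above, `h` is an ordinary TensorFlow tensor, so you evaluate it in a session like any other graph. A minimal sketch, assuming the TF 1.x-era API this library targets (the random input is purely illustrative):

```python
import numpy as np

with tf.Session() as sess:
    # On very old TF versions this was tf.initialize_all_variables().
    sess.run(tf.global_variables_initializer())
    out = sess.run(h, feed_dict={x: np.random.randn(8, 5), keep_prob: 0.5})
    print(out.shape)  # (8, 3): softmax over the 3 output units
```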
## Features
Coming Soon!

## Documentation
Coming Soon!

## The Guide
Coming Soon!

## Full Example
Next is an example using all the features of TensorBuilder, including the DSL, branching, and scoping. It creates a branched computation where each branch is executed on a different device. All branches are then reduced to a single layer, and the computation is branched again to obtain both the activation function and the trainer.

```python
import tensorflow as tf
from tensorbuilder import T

x = tf.placeholder(tf.float32, shape=[None, 10])
y = tf.placeholder(tf.float32, shape=[None, 5])

[activation, trainer] = T.Pipe(
    x,
    [
        T.With( tf.device("/gpu:0"),
            T.relu_layer(20)
        )
    ,
        T.With( tf.device("/gpu:1"),
            T.sigmoid_layer(20)
        )
    ,
        T.With( tf.device("/cpu:0"),
            T.tanh_layer(20)
        )
    ],
    T.linear_layer(5),
    [
        T.softmax()  # activation
    ,
        T
        .softmax_cross_entropy_with_logits(y)  # loss
        .minimize(tf.train.AdamOptimizer(0.01))  # trainer
    ]
)
```
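Both outputs are plain TensorFlow ops: `trainer` is the Adam minimize op and `activation` the softmax output, so training is the usual feed-and-run loop. A minimal sketch, with random arrays standing in for a real dataset:

```python
import numpy as np

batch_x = np.random.randn(32, 10).astype(np.float32)
batch_y = np.random.randn(32, 5).astype(np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(trainer, feed_dict={x: batch_x, y: batch_y})  # one optimization step
    preds = sess.run(activation, feed_dict={x: batch_x})       # forward pass only
```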