flexinet: flexible torch neural network composition
https://github.com/mvinyard/flexinet
- Host: GitHub
- URL: https://github.com/mvinyard/flexinet
- Owner: mvinyard
- License: MIT
- Created: 2022-05-15T04:41:22.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2022-08-21T03:34:03.000Z (over 3 years ago)
- Last Synced: 2025-02-01T03:07:06.699Z (about 1 year ago)
- Topics: dimension-reduction, neural-networks, pytorch, torch, vae
- Language: Python
- Homepage: https://pypi.org/project/flexinet/
- Size: 65.4 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# flexinet
A flexible API for instantiating PyTorch neural networks composed of sequential linear layers ([`torch.nn.Linear`](https://pytorch.org/docs/stable/generated/torch.nn.Linear.html#torch.nn.Linear)). It also makes use of other elements of the [`torch.nn`](https://pytorch.org/docs/stable/nn.html) module.
## Test implementation 1: Sequential linear neural network
```python
import flexinet
nn = flexinet.models.NN()
```
```python
# example
from torch.nn import Tanh

nn = flexinet.models.compose_nn_sequential(
    in_dim=50,
    out_dim=50,
    activation_function=Tanh(),
    hidden_layer_nodes={1: [500, 500], 2: [500, 500]},
    dropout=True,
    dropout_probability=0.1,
)
```
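To make the call above concrete, here is a hedged sketch of the kind of `torch.nn.Sequential` network such a composition plausibly produces: linear layers joined by the activation function and dropout, with no activation after the output layer. The helper name, layer ordering, and sizes here are illustrative assumptions, not flexinet's actual composition logic.

```python
import torch
import torch.nn as nn

def compose_sequential(in_dim, out_dim, hidden, activation, dropout_p):
    """Sketch of a sequential linear-network builder (assumed behavior)."""
    dims = [in_dim, *hidden, out_dim]
    layers = []
    for i, (d_in, d_out) in enumerate(zip(dims[:-1], dims[1:])):
        layers.append(nn.Linear(d_in, d_out))
        if i < len(dims) - 2:  # no activation/dropout after the output layer
            layers.append(activation)
            layers.append(nn.Dropout(dropout_p))
    return nn.Sequential(*layers)

# 50 -> 500 -> 500 -> 50, Tanh activations, dropout between hidden layers
net = compose_sequential(50, 50, [500, 500], nn.Tanh(), 0.1)
x = torch.randn(8, 50)
y = net(x)  # shape: (8, 50)
```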
## Test implementation 2: vanilla linear VAE

## Installation
To install the latest distribution from [PyPI](https://pypi.org/project/flexinet/):
```bash
pip install flexinet
```
Alternatively, one can install the development version:
```bash
git clone https://github.com/mvinyard/flexinet.git; cd flexinet;
pip install -e .
```
### Example
```python
import flexinet as fn
import torch

X = torch.load("X_data.pt")     # load a saved data tensor
X_data = fn.pp.random_split(X)  # split into train / validation / test sets
X_data.keys()
```
>`dict_keys(['test', 'valid', 'train'])`
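The split fractions used by `fn.pp.random_split` are not documented here; as a rough illustration of what such a preprocessing step does, here is a pure-Python sketch with an assumed 80/10/10 split (the real function operates on a torch tensor).

```python
import random

def random_split(items, fractions=(0.8, 0.1, 0.1), seed=0):
    """Hypothetical train/valid/test split; fractions are an assumption."""
    idx = list(range(len(items)))
    random.Random(seed).shuffle(idx)
    n_train = int(fractions[0] * len(items))
    n_valid = int(fractions[1] * len(items))
    return {
        "train": [items[i] for i in idx[:n_train]],
        "valid": [items[i] for i in idx[n_train:n_train + n_valid]],
        "test":  [items[i] for i in idx[n_train + n_valid:]],
    }

splits = random_split(list(range(100)))
sorted(splits.keys())  # ['test', 'train', 'valid']
```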
```python
from torch.nn import LeakyReLU

model = fn.models.LinearVAE(
    X_data,
    latent_dim=20,
    hidden_layers=5,
    power=2,
    dropout=0.1,
    activation_function_dict={"LeakyReLU": LeakyReLU(negative_slope=0.01)},
    optimizer=torch.optim.Adam,
    reconstruction_loss_function=torch.nn.BCELoss(),
    reparameterization_loss_function=torch.nn.KLDivLoss(),
    device="cuda:0",
)
```
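The `LinearVAE` combines a reconstruction loss with a reparameterization loss. The core mechanism of any VAE is the reparameterization trick, which makes the latent sample differentiable with respect to the encoder outputs; a minimal torch sketch follows (variable names are illustrative, not flexinet's internal API).

```python
import torch

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps with eps ~ N(0, I), so gradients
    can flow through mu and log_var (standard VAE trick)."""
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + eps * std

mu = torch.zeros(4, 20)
log_var = torch.zeros(4, 20)  # sigma = 1
z = reparameterize(mu, log_var)  # shape: (4, 20)
```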

```python
model.train(epochs=10_000, print_frequency=50, lr=1e-4)
```
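For intuition, here is a hedged sketch of the kind of loop `model.train` plausibly runs: forward pass, loss, backward pass, optimizer step, and logging every `print_frequency` epochs. The helper, the toy autoencoder, and the loss bookkeeping are all illustrative assumptions, not flexinet's implementation.

```python
import torch
import torch.nn as nn

def train_loop(model, optimizer, loss_fn, data, epochs, print_frequency):
    """Minimal full-batch training loop (assumed behavior)."""
    losses = []
    for epoch in range(1, epochs + 1):
        optimizer.zero_grad()
        reconstruction = model(data)
        loss = loss_fn(reconstruction, data)
        loss.backward()
        optimizer.step()
        losses.append(loss.item())
        if epoch % print_frequency == 0:
            print(f"epoch {epoch}: loss = {loss.item():.4f}")
    return losses

torch.manual_seed(0)
toy_model = nn.Sequential(nn.Linear(10, 4), nn.Linear(4, 10))  # toy autoencoder
data = torch.randn(32, 10)
optimizer = torch.optim.Adam(toy_model.parameters(), lr=5e-2)
losses = train_loop(toy_model, optimizer, nn.MSELoss(), data,
                    epochs=50, print_frequency=10)
```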

```python
model.plot_loss()
```

## Contact
If you have suggestions, questions, or comments, please reach out to Michael Vinyard via [email](mailto:mvinyard@broadinstitute.org).