#+TITLE: Condense
#+AUTHOR: Lucas Sas
#+EMAIL: [email protected]
#+DATE: August 25, 2020
#+STARTUP: inlineimages nofold
#+SETUPFILE: https://fniessen.github.io/org-html-themes/setup/theme-readtheorg.setup
#+EXPORT_FILE_NAME: documentation/index.html

[[https://pypi.org/project/condense/][file:https://pypip.in/v/condense/badge.png]]
[[https://pypi.org/project/condense/#files ][file:https://pypip.in/d/condense/badge.png]]
[[https://lbesson.mit-license.org/][file:https://img.shields.io/badge/License-MIT-blue.svg]]
[[https://www.python.org/][file:https://img.shields.io/badge/Made%20with-Python-1f425f.svg]]
[[https://sirbubbls.github.io/condense][file:https://img.shields.io/badge/Documentation-Online-Green.svg]]

* Description
This Python package is the result of my bachelor's thesis (https://github.com/SirBubbls/pruning-ba).
It provides implementations of pruning operations for artificial neural network models.

** Maintainers
+ @SirBubbls

* Installation
- ~pip install condense~
- ~python -m pip install git+https://github.com/SirBubbls/condense.git~
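
To verify the installation, a quick import check is enough (nothing condense-specific is assumed beyond the package name):

#+BEGIN_SRC python
# Minimal smoke test: the import succeeding means the package is installed
import condense
print('condense imported from', condense.__file__)
#+END_SRC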

* Usage
#+begin_quote
Please refer to the official docsify [[https://sirbubbls.github.io/condense/#/quick_start][documentation]] for a detailed guide.

[[https://sirbubbls.github.io/condense/#/pdoc/condense/index.html][API documentation]] is also available.
#+end_quote

** Using the Keras Compatibility Module (~condense.keras~)
#+BEGIN_SRC python
import condense
import keras

# Load your model
model = keras.models.load_model('...')

# Apply the PruningWrapper automatically to all possible layers of the model,
# with a constant 50% sparsity target
pruned = condense.keras.wrap_model(model,
                                   condense.optimizer.sparsity_functions.Constant(0.5))

# You need to recompile your model after pruning it
pruned.compile('adam', 'mse')

# Either train your model from scratch or one-shot prune it.
# Both approaches require a call to fit(): fit() triggers the
# PruningCallback, which in turn calls each layer's pruning operation.
pruned.fit(data_generator,
           epochs=30,  # use 1 for a 'kind of one-shot' approach
           steps_per_epoch=1,
           callbacks=[condense.keras.PruningCallback()])  # important

# The weights are now pruned.
#+END_SRC
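
To check how much sparsity was actually reached, you can count zero-valued weights directly; this sketch uses only standard Keras and NumPy calls and is not part of the condense API:

#+BEGIN_SRC python
import numpy as np

def weight_sparsity(model):
    """Return the fraction of weights that are exactly zero."""
    total = sum(w.size for w in model.get_weights())
    zeros = sum(np.count_nonzero(w == 0) for w in model.get_weights())
    return zeros / total

# After pruning, this should be close to the configured sparsity target
print(f'Sparsity: {weight_sparsity(pruned):.2%}')
#+END_SRC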

** Simple ~one_shot~ pruning

#+BEGIN_SRC python
import condense
import keras

# Load your model
model = keras.models.load_model('...')

# Prune the model with a 30% sparsity target
pruned = condense.one_shot(model, 0.3)

# The weights are now pruned.
#+END_SRC
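
Since ~one_shot~ prunes without retraining, it is worth comparing performance before and after; ~evaluate()~ is standard Keras, and ~test_generator~ is an assumed held-out data source (recompile the pruned model first if needed, as in the wrapper example above):

#+BEGIN_SRC python
# Compare held-out loss before and after one-shot pruning
baseline_loss = model.evaluate(test_generator, verbose=0)
pruned_loss = pruned.evaluate(test_generator, verbose=0)
print(f'loss before pruning: {baseline_loss}, after: {pruned_loss}')
#+END_SRC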

** Automated pruning with ~condense.keras.Trainer~

A more sophisticated approach to pruning is to first train and prune the model M.
After this first training run, the model is reset to its initial parameter configuration and the sparsity mask from step one is applied.
This smaller network P \subset M is then trained on the same training data and should yield better results than the original network.

#+BEGIN_QUOTE
This is an implementation of the lottery ticket hypothesis ([[https://arxiv.org/abs/1803.03635][arXiv.org]]).
#+END_QUOTE
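
Conceptually, the Trainer automates roughly the following loop. This is a minimal sketch in plain Keras/NumPy; the magnitude-based mask criterion and the helper name are assumptions for illustration, not condense internals:

#+BEGIN_SRC python
import numpy as np

def lottery_ticket_sketch(model, data, sparsity, epochs):
    # 1. Remember the initial parameters theta_0
    initial_weights = [w.copy() for w in model.get_weights()]

    # 2. Train the full model M
    model.fit(data, epochs=epochs)

    # 3. Derive a binary mask from the trained weights
    #    (per-tensor magnitude threshold, as an assumption)
    masks = [(np.abs(w) >= np.quantile(np.abs(w), sparsity)).astype(w.dtype)
             for w in model.get_weights()]

    # 4. Reset to theta_0 and apply the mask -> subnetwork P
    model.set_weights([w0 * m for w0, m in zip(initial_weights, masks)])

    # 5. Retrain the pruned subnetwork on the same data
    #    (a full implementation would re-apply the masks after every
    #    update so that pruned weights stay at zero)
    model.fit(data, epochs=epochs)
    return model, masks
#+END_SRC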

#+BEGIN_SRC python
import keras
import condense

# Print out information about the training process
import logging
condense.logger.setLevel(logging.INFO)

model = ...
train_generator = ...  # Training data
test_generator = ...   # Test data

# The target sparsity for training is 80%
trainer = condense.keras.Trainer(model, 0.8)

trainer.train(train_generator,
              epochs=50,
              steps_per_epoch=2,
              eval_data=test_generator)

pruned_model = trainer.training_model  # Training model
masks = trainer.mask
#+END_SRC
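
The returned masks can be inspected to see how pruning is distributed across the model. This assumes each entry of ~trainer.mask~ is a 0/1 array matching a weight tensor, which the snippet above does not spell out:

#+BEGIN_SRC python
import numpy as np

# Per-tensor pruning ratio: fraction of mask entries set to zero
for i, m in enumerate(masks):
    print(f'tensor {i}: {1 - float(np.mean(m)):.1%} pruned')
#+END_SRC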