Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ludwigwinkler/pytorch_ProbabilisticLayers
Bayesian Neural Networks with Parallelized Sampling of LogLikelihood
- Host: GitHub
- URL: https://github.com/ludwigwinkler/pytorch_ProbabilisticLayers
- Owner: ludwigwinkler
- License: MIT
- Created: 2020-05-28T17:11:18.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2021-03-30T08:54:56.000Z (over 3 years ago)
- Last Synced: 2024-07-04T02:14:01.398Z (4 months ago)
- Topics: bayesianneuralnetwork, bnn, pytorch
- Language: Python
- Homepage:
- Size: 11 MB
- Stars: 5
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# torch-ProbabilisticLayers
This repository implements **parallelized** Bayesian Neural Networks in PyTorch via Variational Inference.
Training a Bayesian neural network requires evaluating the evidence lower bound (ELBO) as the cost function, which includes an expectation over the data log-likelihood.
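Concretely, under the standard variational-inference setup (a sketch; the repository's exact KL weighting is not shown here), the objective and its Monte Carlo approximation are:

```latex
\mathcal{L}(\theta)
= \mathbb{E}_{q_\theta(w)}\big[\log p(\mathcal{D} \mid w)\big]
- \mathrm{KL}\big(q_\theta(w)\,\|\,p(w)\big),
\qquad
\mathbb{E}_{q_\theta(w)}\big[\log p(\mathcal{D} \mid w)\big]
\approx \frac{1}{S}\sum_{s=1}^{S} \log p(\mathcal{D} \mid w_s),
\quad w_s \sim q_\theta(w)
```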
Sampling the data log-likelihood is implemented in a parallel fashion to avoid the slow Python loops found in other repositories. To achieve this, the most common layers are implemented from scratch so that the Monte Carlo samples of the data log-likelihood are processed in parallel.
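A minimal sketch of the idea, assuming a factorized Gaussian posterior and the reparameterization trick (the class name and parameterization here are illustrative, not the repository's actual API): one weight tensor is drawn per Monte Carlo sample, and all samples are applied with a single batched matmul instead of a loop.

```python
import torch
import torch.nn as nn

class VariationalLinear(nn.Module):
    """Hypothetical Bayesian linear layer with a leading MC dimension.

    q(w) = N(mu, sigma^2); one weight sample per Monte Carlo sample,
    applied in parallel via torch.bmm rather than a Python loop.
    """
    def __init__(self, in_features, out_features):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_logsigma = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # x: (MonteCarloSamples, BatchSize, in_features)
        mc = x.shape[0]
        eps = torch.randn(mc, *self.w_mu.shape)        # (MC, out, in)
        w = self.w_mu + self.w_logsigma.exp() * eps    # reparameterization trick
        # Batched matmul over the MC dimension: (MC, B, in) @ (MC, in, out)
        return torch.bmm(x, w.transpose(1, 2)) + self.bias

x = torch.randn(10, 32, 4)      # 10 MC samples, batch of 32, 4 features
layer = VariationalLinear(4, 8)
print(layer(x).shape)           # torch.Size([10, 32, 8])
```

Each of the 10 weight samples multiplies its own copy of the batch in one fused kernel call, which is the parallelization the README describes.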
The data tensors fed into the Bayesian neural network are extended with an additional first dimension, `input.shape=(MonteCarloSamples, BatchSize, Features...)`, referred to as the MC (Monte Carlo) dimension. Caveat emptor: the memory footprint grows linearly with the number of Monte Carlo samples, but the gradients are greatly stabilized even with more than 5 MC samples.
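Preparing an input with the MC dimension can be sketched as follows (an assumption about the intended usage, not a call into the repository's code). `expand` creates a broadcasted view, so the input itself is not copied; the per-sample memory cost only materializes once layers produce distinct outputs per sample.

```python
import torch

num_mc = 20                                   # number of Monte Carlo samples
x = torch.randn(32, 4)                        # (BatchSize, Features)

# Add a leading MC dimension and broadcast the batch across all samples.
x_mc = x.unsqueeze(0).expand(num_mc, -1, -1)  # (MC, BatchSize, Features), a view
print(x_mc.shape)                             # torch.Size([20, 32, 4])
```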