Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pythoncharmers/maxentropy
Maximum entropy and minimum divergence models in Python
- Host: GitHub
- URL: https://github.com/pythoncharmers/maxentropy
- Owner: PythonCharmers
- License: other
- Created: 2017-07-03T05:33:00.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2024-04-20T05:36:46.000Z (9 months ago)
- Last Synced: 2024-06-27T13:01:02.130Z (7 months ago)
- Topics: bayesian-inference, kullback-leibler-divergence, maximum-entropy, minimum-divergence, prior-distribution, python, scikit-learn
- Language: Jupyter Notebook
- Size: 9.84 MB
- Stars: 39
- Watchers: 5
- Forks: 23
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE.md
README
# maxentropy: Maximum entropy and minimum divergence models in Python
## Purpose
This package helps you to construct a probability distribution
(Bayesian prior) from prior information that you encode as
generalized moment constraints. You can use it to either:

1. find the flattest distribution that meets your constraints, using the
   maximum entropy principle (discrete distributions only), as illustrated
   in the sketch below; or
2. find the "closest" model to a given prior model (in a KL-divergence
   sense) that also satisfies your additional constraints.
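As a minimal from-scratch sketch of the first use case, the following solves the classic loaded-die problem (find the flattest distribution over the faces 1–6 whose mean is 4.5) using only NumPy and SciPy. It illustrates the underlying principle, not this package's estimator API; see the notebooks linked below for the latter.

```python
import numpy as np
from scipy.optimize import minimize

# Sample space: the six faces of a die, with one linear moment
# constraint: the mean must be 4.5 (a "loaded" die).
x = np.arange(1, 7)
f = x.astype(float)        # feature function f(x) = x
target = 4.5               # required expectation E[f(X)]

def dual(lam):
    # Dual objective log Z(lambda) - lambda * target; its minimizer
    # yields the maximum entropy distribution
    # p(x) = exp(lambda * f(x)) / Z(lambda).
    return np.log(np.exp(lam * f).sum()) - lam * target

lam = minimize(lambda v: dual(v[0]), x0=[0.0]).x[0]
p = np.exp(lam * f)
p /= p.sum()

print(p)       # the flattest distribution with mean 4.5
print(p @ f)   # ~4.5: the constraint holds
```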
## Background

The maximum entropy principle has been shown [Cox 1982, Jaynes 2003] to be the unique consistent approach to
constructing a discrete probability distribution from prior information that is available as "testable information".

If the constraints have the form of linear moment constraints, then
the principle gives rise to a unique probability distribution of
**exponential form**. Most well-known probability distributions are
special cases of maximum entropy distributions. This includes
uniform, geometric, exponential, Pareto, normal, von Mises, Cauchy,
and others: see
[here](https://en.wikipedia.org/wiki/Maximum_entropy_probability_distribution).
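Concretely (a standard result, stated here for reference): with feature functions $f_i$ and moment constraints $\mathbb{E}[f_i(X)] = b_i$, the maximum entropy distribution takes the form

```math
p(x) = \frac{1}{Z(\lambda)} \exp\left(\sum_i \lambda_i f_i(x)\right),
\qquad
Z(\lambda) = \sum_{x \in \mathcal{X}} \exp\left(\sum_i \lambda_i f_i(x)\right),
```

where the parameters $\lambda_i$ are chosen so that the constraints are satisfied.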
## Examples: constructing a prior subject to known constraints

See the [notebooks folder](https://github.com/PythonCharmers/maxentropy/tree/master/notebooks).
### Quickstart guide
This is a good place to start: [Loaded die example (scikit-learn estimator API)](https://github.com/PythonCharmers/maxentropy/blob/master/notebooks/Loaded%20die%20example%20-%20skmaxent.ipynb)
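The second use case (the model closest in KL divergence to a given prior) admits the same from-scratch treatment: the solution has the form $p(x) \propto q(x)\exp(\lambda f(x))$. The sketch below again illustrates the mathematics rather than this package's API, and the prior `q` is a made-up example:

```python
import numpy as np
from scipy.optimize import minimize

x = np.arange(1, 7)                              # die faces
f = x.astype(float)                              # feature: f(x) = x
q = np.array([0.3, 0.2, 0.2, 0.1, 0.1, 0.1])     # assumed prior model
target = 4.5                                     # required mean under the new model

def dual(lam):
    # log Z(lambda) - lambda * target, whose minimizer
    # matches E_p[f] to the target moment.
    return np.log((q * np.exp(lam * f)).sum()) - lam * target

lam = minimize(lambda v: dual(v[0]), x0=[0.0]).x[0]
p = q * np.exp(lam * f)
p /= p.sum()

print(p)       # minimum divergence probabilities
print(p @ f)   # ~4.5: the constraint holds
```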
## History

This package previously lived in SciPy
(http://scipy.org) as ``scipy.maxentropy`` from versions v0.5 to v0.10.
It was under-maintained and removed in SciPy v0.11. It has since been
resurrected and refactored to use the scikit-learn Estimator interface.

## Copyright
(c) Ed Schofield, 2024