https://github.com/gragusa/divergences.jl
A Julia package for evaluation of divergences between distributions
- Host: GitHub
- URL: https://github.com/gragusa/divergences.jl
- Owner: gragusa
- License: other
- Created: 2014-06-17T00:30:49.000Z (almost 11 years ago)
- Default Branch: master
- Last Pushed: 2025-02-25T12:51:24.000Z (3 months ago)
- Last Synced: 2025-03-28T07:11:18.190Z (about 2 months ago)
- Topics: divergence, julia
- Language: Julia
- Homepage:
- Size: 595 KB
- Stars: 11
- Watchers: 1
- Forks: 4
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE.md
# Divergences.jl
[Code coverage](https://codecov.io/gh/gragusa/Divergences.jl)
`Divergences.jl` is a Julia package that makes evaluating divergence measures between two vectors easy. The package allows for calculating the *gradient* and the diagonal of the *Hessian* of several divergences.
## Supported divergences
The package defines an abstract `Divergence` type with the following subtypes:
* Kullback-Leibler divergence `KullbackLeibler`
* Chi-square distance `ChiSquared`
* Reverse Kullback-Leibler divergence `ReverseKullbackLeibler`
* Cressie-Read divergences `CressieRead`

These divergences differ from the equivalent ones defined in the `Distances` package because they are normalized. The package also provides methods for calculating their gradient and the (diagonal elements of the) Hessian matrix.
The constructors for the types above are straightforward:
```julia
KullbackLeibler()
ChiSquared()
ReverseKullbackLeibler()
```
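As a quick sanity check, each constructor returns an instance of a concrete subtype of the abstract `Divergence` type (a minimal sketch, assuming the package is installed and loaded):

```julia
using Divergences

# Each constructor produces a concrete subtype of the abstract Divergence type.
divs = [KullbackLeibler(), ChiSquared(), ReverseKullbackLeibler()]
all(d -> d isa Divergence, divs)
```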
The `CressieRead` type defines a family of divergences indexed by a parameter `alpha`. The constructor for `CressieRead` is
```julia
CR(::Real)
```
The Hellinger divergence is obtained by `CR(-1/2)`. For certain values of `alpha`, `CressieRead` corresponds to a divergence with its own dedicated type. For instance, `CR(1)` is equivalent to `ChiSquared`, although the underlying code for evaluation and for calculating the gradient and Hessian is different.

Three versions of each divergence in the above list are currently implemented: a vanilla version, a modified version, and a fully modified version. These modifications extend the domain of the divergence.
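The equivalence between `CR(1)` and `ChiSquared` can be checked numerically; the sketch below (assuming the exported names `evaluate`, `CR`, and `ChiSquared` behave as described in this README) compares the two on a pair of probability vectors:

```julia
using Divergences

x = [0.2, 0.3, 0.5]
y = [0.25, 0.25, 0.5]

# CR(1) and ChiSquared implement the same divergence through different code
# paths, so the two evaluations should agree up to floating-point error.
isapprox(evaluate(CR(1), x, y), evaluate(ChiSquared(), x, y))
```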
The **modified** version takes an additional argument that specifies the point at which a convex extension modifies the divergence.
```julia
ModifiedKullbackLeibler(theta::Real)
ModifiedReverseKullbackLeibler(theta::Real)
ModifiedCressieRead(alpha::Real, theta::Real)
```

Similarly, the **fully modified** version takes two additional arguments that specify the points at which a convex extension modifies the divergence.
```julia
FullyModifiedKullbackLeibler(phi::Real, theta::Real)
FullyModifiedReverseKullbackLeibler(phi::Real, theta::Real)
FullyModifiedCressieRead(alpha::Real, phi::Real, theta::Real)
```

## Basic usage
### Divergence between two vectors
Each divergence corresponds to a *divergence type*. You can always compute a certain divergence between two vectors using the following syntax
```julia
d = evaluate(div, x, y)
```

Here, `div` is an instance of a divergence type. For example, the type for the Kullback-Leibler divergence is `KullbackLeibler` (more divergence types are described in some detail in what follows), so the Kullback-Leibler divergence between `x` and `y` can be computed as
```julia
d = evaluate(KullbackLeibler(), x, y)
```

We can also calculate the divergence between the vector `x` and the unit vector
```julia
r = evaluate(KullbackLeibler(), x)
```

The `Divergence` type is a subtype of `PreMetric` defined in the `Distances` package. As such, the divergences can be evaluated row-wise and column-wise for `X::Matrix` and `Y::Matrix`.
```julia
rowwise(div, X, Y)
```

```julia
colwise(div, X, Y)
```

### Gradient of the divergence
To calculate the gradient of `div::Divergence` with respect to `x::AbstractArray{Float64, 1}`, the `gradient` method can be used
```julia
g = gradient(div, x, y)
```
or through its in-place version
```julia
gradient!(Array{Float64}(undef, size(x)), div, x, y)
```

### Hessian of the divergence
The `hessian` method calculates the Hessian of the divergence with respect to ``x``
```julia
h = hessian(div, x, y)
```
Its in-place variant is also defined
```julia
hessian!(Array{Float64}(undef, size(x)), div, x, y)
```

Notice that the divergence's Hessian is diagonal: the diagonal entries are the only ones different from zero. For this reason, `hessian(div, x, y)` returns an `Array{Float64,1}` containing only the diagonal entries of the Hessian.
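Putting the pieces together, a minimal end-to-end sketch (assuming the package is installed and exports the names used throughout this README) might look like:

```julia
using Divergences

x = [0.1, 0.4, 0.5]
y = [0.2, 0.3, 0.5]

div = KullbackLeibler()

d = evaluate(div, x, y)   # scalar divergence between x and y
g = gradient(div, x, y)   # gradient with respect to x
h = hessian(div, x, y)    # diagonal entries of the Hessian, as a vector

# Both derivative objects have one entry per component of x.
length(g) == length(x) && length(h) == length(x)
```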
## List of divergences
[To be added]