kullback_leibler_divergence

Kullback-Leibler divergence


Description

Computes the Kullback-Leibler divergence between two probability vectors.

Usage

kullback_leibler_divergence(x, y)

Arguments

x, y

Numeric vectors representing probabilities

Details

Kullback-Leibler divergence is a non-symmetric measure of difference between two probability vectors. In general, KL(x, y) is not equal to KL(y, x).

Because this measure is defined for probabilities, the vectors x and y are normalized in the function so they sum to 1.
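With x and y normalized to sum to 1, the value computed corresponds to the standard definition, consistent with the term-by-term conventions listed under Value below:

KL(x, y) = \sum_i x_i \log (x_i / y_i)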

Value

The Kullback-Leibler divergence between x and y. We adopt the following conventions when elements of x or y are zero: 0 \log (0 / y_i) = 0, 0 \log (0 / 0) = 0, and x_i \log (x_i / 0) = \infty. As a result, elements of x that are zero do not contribute to the sum. If an element of y is zero where the corresponding element of x is nonzero, the result is Inf. If either x or y sums to zero, the proportions cannot be computed, and the function returns NaN.
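
Examples

A minimal usage sketch based on the behaviour described above (normalization, non-symmetry, and the zero conventions); it assumes the abdiv package is installed and attached.

# Load the package providing kullback_leibler_divergence()
library(abdiv)

x <- c(0.2, 0.5, 0.3)
y <- c(0.1, 0.4, 0.5)

# The measure is not symmetric: KL(x, y) differs from KL(y, x)
kullback_leibler_divergence(x, y)
kullback_leibler_divergence(y, x)

# Inputs need not sum to 1; they are normalized inside the function,
# so these counts give the same result as the proportions above
kullback_leibler_divergence(c(2, 5, 3), c(1, 4, 5))

# If y is zero where x is nonzero, the result is Inf
kullback_leibler_divergence(c(0.5, 0.5), c(1, 0))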


abdiv: Alpha and Beta Diversity Measures

Version: 0.2.0 (initial release)
License: MIT + file LICENSE
Author: Kyle Bittinger [aut, cre]
