kl_div

Kullback-Leibler Divergence


Description

The elementwise Kullback-Leibler divergence, x log(x/y) - x + y, defined for x &gt; 0 and y &gt; 0. This atom is convex in (x, y), so it can be used in disciplined convex programs.
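The formula can be checked numerically in plain base R. The sketch below (no CVXR required; `kl_div_num` is a hypothetical helper, not the CVXR atom) evaluates the elementwise expression for positive numeric vectors:

```r
# Numeric sketch of the elementwise quantity x*log(x/y) - x + y,
# valid for x > 0, y > 0. This is NOT the CVXR atom, just the formula.
kl_div_num <- function(x, y) {
  stopifnot(all(x > 0), all(y > 0))
  x * log(x / y) - x + y
}

kl_div_num(c(1, 2), c(1, 2))  # zero when x == y
kl_div_num(1, 2)              # 1*log(1/2) - 1 + 2 = 1 - log(2)
```

Note the value is zero exactly when x equals y elementwise, and positive otherwise.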

Usage

kl_div(x, y)

Arguments

x

An Expression, vector, or matrix.

y

An Expression, vector, or matrix.

Value

An Expression representing the elementwise KL divergence of the inputs.
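When x and y are probability vectors (each summing to 1), the -x + y terms cancel in the sum, so summing the elementwise values recovers the classical KL divergence, sum(x*log(x/y)). A minimal base-R check of this identity (no CVXR required):

```r
p <- c(0.2, 0.3, 0.5)
q <- c(0.25, 0.25, 0.5)

elementwise <- p * log(p / q) - p + q   # the quantity kl_div computes
classical   <- sum(p * log(p / q))      # standard KL divergence

sum(elementwise)  # equals `classical`, since sum(p) == sum(q) == 1
```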

Examples

library(CVXR)

# Per-channel gain parameters and resource budgets
n <- 5
alpha <- seq(10, n-1+10)/n
beta <- seq(10, n-1+10)/n
P_tot <- 0.5   # total power budget
W_tot <- 1.0   # total bandwidth budget

# Decision variables: power P and bandwidth W allocated to each channel
P <- Variable(n)
W <- Variable(n)

# Elementwise objective terms; kl_div keeps the problem convex
R <- kl_div(alpha*W, alpha*(W + beta*P)) - alpha*beta*P
obj <- sum(R)
constr <- list(P >= 0, W >= 0, sum(P) == P_tot, sum(W) == W_tot)
prob <- Problem(Minimize(obj), constr)
result <- solve(prob)

result$value
result$getValue(P)
result$getValue(W)

CVXR: Disciplined Convex Optimization
Version: v1.0-10
License: Apache License 2.0 | file LICENSE
Authors: Anqi Fu [aut, cre], Balasubramanian Narasimhan [aut], David W Kang [aut], Steven Diamond [aut], John Miller [aut], Stephen Boyd [ctb], Paul Kunsberg Rosenfield [ctb]
