Kullback-Leibler Divergence
The elementwise Kullback-Leibler divergence, x log(x/y) - x + y.
kl_div(x, y)
Arguments

x: An Expression, vector, or matrix.
y: An Expression, vector, or matrix.
An Expression representing the elementwise KL divergence of the inputs.
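As a numerical cross-check of the definition above (not part of CVXR), SciPy's scipy.special.kl_div implements the same elementwise formula x*log(x/y) - x + y; a minimal sketch:

```python
import numpy as np
from scipy.special import kl_div  # elementwise x*log(x/y) - x + y

x = np.array([1.0, 2.0, 0.5])
y = np.array([1.0, 1.0, 1.0])

# Direct evaluation of the formula, elementwise
manual = x * np.log(x / y) - x + y

# SciPy's kl_div agrees with the manual computation
assert np.allclose(kl_div(x, y), manual)
```

Note that the divergence of a pair of identical values is zero, e.g. kl_div(1, 1) = 1*log(1/1) - 1 + 1 = 0.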
n <- 5
alpha <- seq(10, n - 1 + 10)/n
beta <- seq(10, n - 1 + 10)/n
P_tot <- 0.5
W_tot <- 1.0
P <- Variable(n)
W <- Variable(n)
R <- kl_div(alpha*W, alpha*(W + beta*P)) - alpha*beta*P
obj <- sum(R)
constr <- list(P >= 0, W >= 0, sum(P) == P_tot, sum(W) == W_tot)
prob <- Problem(Minimize(obj), constr)
result <- solve(prob)
result$value
result$getValue(P)
result$getValue(W)