Gibbs Sampler for Normal Mixtures
Description

rnmixGibbs implements a Gibbs Sampler for normal mixtures.
Usage

rnmixGibbs(Data, Prior, Mcmc)
Arguments

  Data:   list(y)
  Prior:  list(Mubar, A, nu, V, a, ncomp)
  Mcmc:   list(R, keep, nprint, LogLike)
Details

Model:
  y_i ~ N(μ_{ind_i}, Σ_{ind_i})
  ind_i ~ iid multinomial(p), with p an ncomp x 1 vector of probabilities

Priors:
  μ_j ~ N(mubar, Σ_j ⊗ A^{-1}), with mubar = vec(Mubar)
  Σ_j ~ IW(nu, V)
  p ~ Dirichlet(a)
  Note: this is the natural conjugate prior -- a special case of multivariate regression.
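For intuition, here is a minimal base-R sketch (not part of the package; sizes and parameter values are purely illustrative) of the data-generating process above:

set.seed(1)
ncomp = 2;  k = 2;  n = 100                       # illustrative sizes
p     = c(0.4, 0.6)                               # mixture probabilities
mu    = list(c(0,0), c(3,3))                      # component means
Sigma = list(diag(k), 0.5*diag(k) + 0.5)          # component covariance matrices
ind   = sample(1:ncomp, n, replace=TRUE, prob=p)  # ind_i ~ multinomial(p)
y     = t(sapply(ind, function(j)                 # y_i ~ N(mu_{ind_i}, Sigma_{ind_i})
           mu[[j]] + t(chol(Sigma[[j]])) %*% rnorm(k)))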
Argument Details

Data  = list(y)
  y:      n x k matrix of data (rows are obs)

Prior = list(Mubar, A, nu, V, a, ncomp)   [only ncomp required]
  Mubar:  1 x k vector with prior mean of normal component means (def: 0)
  A:      1 x 1 precision parameter for prior on mean of normal components (def: 0.01)
  nu:     d.f. parameter for prior on Sigma (normal component cov matrix) (def: k+3)
  V:      k x k location matrix of IW prior on Sigma (def: nu*I)
  a:      ncomp x 1 vector of Dirichlet prior parameters (def: rep(5, ncomp))
  ncomp:  number of normal components to be included
Mcmc = list(R, keep, nprint, LogLike)   [only R required]
  R:       number of MCMC draws
  keep:    MCMC thinning parameter -- keep every keepth draw (def: 1)
  nprint:  print the estimated time remaining for every nprint'th draw (def: 100; set to 0 for no print)
  LogLike: logical flag for whether to compute the log-likelihood (def: FALSE)
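For illustration, a minimal sketch of a call in which the optional elements are written out explicitly; the values simply restate the defaults listed above, and only ncomp and R are actually required (y stands for an n x k data matrix):

library(bayesm)
k     = 2                                        # dimension of y
ncomp = 3                                        # number of normal components
Prior = list(Mubar = matrix(0, nrow=1, ncol=k),  # prior mean of component means
             A     = 0.01,                       # prior precision on component means
             nu    = k + 3,                      # IW degrees of freedom
             V     = (k + 3) * diag(k),          # IW location matrix (nu*I)
             a     = rep(5, ncomp),              # Dirichlet prior parameters
             ncomp = ncomp)
Mcmc  = list(R = 1000, keep = 1, nprint = 100, LogLike = FALSE)
## out = rnmixGibbs(Data = list(y = y), Prior = Prior, Mcmc = Mcmc)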
nmix Details

nmix is a list with 3 components. Several functions in the bayesm package that involve a Dirichlet Process or mixture-of-normals return nmix. Across these functions, a common structure is used for nmix in order to utilize generic summary and plotting functions.

  probdraw:  ncomp x R/keep matrix that reports the probability that each draw came from a particular component
  zdraw:     R/keep x nobs matrix that indicates which component each draw is assigned to
  compdraw:  a list of R/keep lists of ncomp lists; each of the inner-most lists has 2 elements: a vector of draws for mu and a matrix of draws for the Cholesky root of Sigma
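A minimal sketch of how these pieces might be accessed, assuming out is an object returned by rnmixGibbs (as in the Examples below) and that, as in the Examples, each inner list stores the mean as mu and the inverse of the upper Cholesky root of Sigma as rooti:

nmix   = out$nmix
dim(nmix$zdraw)                                  # component assignments, one row per kept draw
last   = nmix$compdraw[[length(nmix$compdraw)]]  # ncomp parameter sets from the final kept draw
mu1    = last[[1]]$mu                            # mean vector of component 1
Sigma1 = crossprod(solve(last[[1]]$rooti))       # recover Sigma if rooti = solve(chol(Sigma))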
Value

A list containing:
  ll:    R/keep x 1 vector of log-likelihood values
  nmix:  a list of MCMC draws with components probdraw, zdraw, and compdraw (see nmix Details above)
Note

In this model, the component normal parameters are not identified due to label-switching. However, the fitted mixture-of-normals density is identified because it is invariant to label-switching. See Chapter 5 of Rossi et al. (below) for details. Use eMixMargDen or momMix to compute the posterior expectation or distribution of various identified parameters.
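For instance, a minimal sketch of summarizing identified quantities with momMix, assuming out is an object returned by rnmixGibbs and that the probdraw and compdraw elements of nmix can be passed to momMix in the same form as in the Examples below:

pmom = momMix(out$nmix$probdraw, out$nmix$compdraw)  # moments of the fitted mixture, draw by draw
pmom$mu   # posterior expectation of the mixture mean vector
pmom$sd   # posterior expectation of the marginal standard deviations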
Author(s)

Peter Rossi, Anderson School, UCLA, perossichi@gmail.com.
References

For further discussion, see Chapter 3 of Bayesian Statistics and Marketing by Rossi, Allenby, and McCulloch.
http://www.perossi.org/home/bsm-1
Examples

if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=2000} else {R=10}
set.seed(66)

dim = 5;  k = 3    # dimension of simulated data and number of "true" components
sigma = matrix(rep(0.5,dim^2), nrow=dim)
diag(sigma) = 1
sigfac = c(1,1,1)
mufac = c(1,2,3)
compsmv = list()
for(i in 1:k) compsmv[[i]] = list(mu=mufac[i]*1:dim, sigma=sigfac[i]*sigma)

# change to "rooti" scale
comps = list()
for(i in 1:k) comps[[i]] = list(mu=compsmv[[i]][[1]], rooti=solve(chol(compsmv[[i]][[2]])))

pvec = (1:k) / sum(1:k)
nobs = 500
dm = rmixture(nobs, pvec, comps)

Data1 = list(y=dm$x)
ncomp = 9
Prior1 = list(ncomp=ncomp)
Mcmc1 = list(R=R, keep=1)

out = rnmixGibbs(Data=Data1, Prior=Prior1, Mcmc=Mcmc1)

cat("Summary of Normal Mixture Distribution", fill=TRUE)
summary(out$nmix)

tmom = momMix(matrix(pvec,nrow=1), list(comps))
mat = rbind(tmom$mu, tmom$sd)
cat(" True Mean/Std Dev", fill=TRUE)
print(mat)

## plotting examples
if(0){plot(out$nmix, Data=dm$x)}