MCMCregress

Markov Chain Monte Carlo for Gaussian Linear Regression


Description

This function generates a sample from the posterior distribution of a linear regression model with Gaussian errors using Gibbs sampling (with a multivariate Gaussian prior on the beta vector, and an inverse Gamma prior on the conditional error variance). The user supplies data and priors, and a sample from the posterior distribution is returned as an mcmc object, which can be subsequently analyzed with functions provided in the coda package.

Usage

MCMCregress(
  formula,
  data = NULL,
  burnin = 1000,
  mcmc = 10000,
  thin = 1,
  verbose = 0,
  seed = NA,
  beta.start = NA,
  b0 = 0,
  B0 = 0,
  c0 = 0.001,
  d0 = 0.001,
  sigma.mu = NA,
  sigma.var = NA,
  marginal.likelihood = c("none", "Laplace", "Chib95"),
  ...
)

Arguments

formula

Model formula.

data

Data frame.

burnin

The number of burn-in iterations for the sampler.

mcmc

The number of MCMC iterations after burnin.

thin

The thinning interval used in the simulation. The number of MCMC iterations must be divisible by this value.
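
For example (using the line data from the Examples section below; the specific settings are placeholders), the number of retained draws is mcmc/thin:

## 1,000 burn-in iterations discarded, then 50,000 iterations thinned by 5
posterior <- MCMCregress(Y ~ X, data = line, burnin = 1000, mcmc = 50000, thin = 5)
nrow(posterior)   # 10000 retained draws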

verbose

A switch which determines whether or not the progress of the sampler is printed to the screen. If verbose is greater than 0, the iteration number, the β vector, and the error variance are printed to the screen every verbose-th iteration.

seed

The seed for the random number generator. If NA, the Mersenne Twister generator is used with default seed 12345; if an integer is passed, it is used to seed the Mersenne Twister. The user can also pass a list of length two to use the L'Ecuyer random number generator, which is suitable for parallel computation. The first element of the list is the L'Ecuyer seed, which is a vector of length six or NA (if NA, a default seed of rep(12345, 6) is used). The second element of the list is a positive substream number. See the MCMCpack specification for more details.
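
For illustration, the three seed forms described above might be written as follows (mydata and the seed values are placeholders):

## Mersenne Twister with a user-supplied integer seed
MCMCregress(Y ~ X, data = mydata, seed = 42)
## L'Ecuyer RNG: default seed rep(12345, 6), substream 2 (useful for parallel chains)
MCMCregress(Y ~ X, data = mydata, seed = list(NA, 2))
## L'Ecuyer RNG with an explicit length-six seed and substream 1
MCMCregress(Y ~ X, data = mydata, seed = list(rep(67890, 6), 1))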

beta.start

The starting values for the β vector. This can either be a scalar or a column vector with dimension equal to the number of betas. The default value of NA will use the OLS estimate of β as the starting value. If this is a scalar, that value will serve as the starting value for all of the betas.

b0

The prior mean of β. This can either be a scalar or a column vector with dimension equal to the number of betas. If this takes a scalar value, then that value will serve as the prior mean for all of the betas.

B0

The prior precision of β. This can either be a scalar or a square matrix with dimensions equal to the number of betas. If this takes a scalar value, then that value times an identity matrix serves as the prior precision of beta. Default value of 0 is equivalent to an improper uniform prior for beta.
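
For illustration (y, x1, x2, and mydata are placeholders), the scalar shorthand and the explicit vector/matrix forms of b0 and B0 might look like:

## scalar shorthand: prior mean 0 and prior precision 0.01 for every coefficient
MCMCregress(y ~ x1 + x2, data = mydata, b0 = 0, B0 = 0.01)
## explicit forms: a length-3 mean vector and a 3 x 3 precision matrix
## (intercept plus two slopes); larger precisions mean tighter priors
MCMCregress(y ~ x1 + x2, data = mydata,
            b0 = c(0, 1, -1), B0 = diag(c(0.01, 0.1, 0.1)))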

c0

c_0/2 is the shape parameter for the inverse Gamma prior on σ^2 (the variance of the disturbances). The amount of information in the inverse Gamma prior is something like that from c_0 pseudo-observations.

d0

d_0/2 is the scale parameter for the inverse Gamma prior on σ^2 (the variance of the disturbances). In constructing the inverse Gamma prior, d_0 acts like the sum of squared errors from the c_0 pseudo-observations.

sigma.mu

The mean of the inverse Gamma prior on σ^2. sigma.mu and sigma.var allow users to choose the inverse Gamma prior by choosing its mean and variance.

sigma.var

The variance of the inverse Gamma prior on σ^2. sigma.mu and sigma.var allow users to choose the inverse Gamma prior by choosing its mean and variance.
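
The (sigma.mu, sigma.var) parameterization maps back to (c0, d0) through the standard inverse Gamma moment formulas. A small sketch of that conversion (the ig_hyperparams helper is illustrative, not part of MCMCpack):

## Inverse Gamma(shape = c0/2, scale = d0/2) has
##   mean     = (d0/2) / (c0/2 - 1)                   for c0 > 2
##   variance = (d0/2)^2 / ((c0/2 - 1)^2 (c0/2 - 2))  for c0 > 4
## Solving for (c0, d0) given a desired mean and variance:
ig_hyperparams <- function(sigma.mu, sigma.var) {
  c0 <- 2 * (sigma.mu^2 / sigma.var + 2)
  d0 <- 2 * sigma.mu * (c0 / 2 - 1)
  c(c0 = c0, d0 = d0)
}
ig_hyperparams(5, 25)   # c0 = 6, d0 = 20, i.e. the prior used in the Examples below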

marginal.likelihood

How should the marginal likelihood be calculated? Options are: none in which case the marginal likelihood will not be calculated, Laplace in which case the Laplace approximation (see Kass and Raftery, 1995) is used, and Chib95 in which case the method of Chib (1995) is used.
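
When a marginal likelihood is requested, the fitted models can be compared with MCMCpack's BayesFactor() function. A sketch with placeholder data and placeholder prior values (note the proper priors; an improper B0 = 0 prior does not yield a finite marginal likelihood):

m1 <- MCMCregress(y ~ x1,      data = mydata, b0 = 0, B0 = 0.1,
                  c0 = 5, d0 = 5, marginal.likelihood = "Chib95")
m2 <- MCMCregress(y ~ x1 + x2, data = mydata, b0 = 0, B0 = 0.1,
                  c0 = 5, d0 = 5, marginal.likelihood = "Chib95")
BF <- BayesFactor(m1, m2)   # MCMCpack's Bayes factor comparison
summary(BF)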

...

further arguments to be passed.

Details

MCMCregress simulates from the posterior distribution using standard Gibbs sampling (a multivariate Normal draw for the betas, and an inverse Gamma draw for the conditional error variance). The simulation proper is done in compiled C++ code to maximize efficiency. Please consult the coda documentation for a comprehensive list of functions that can be used to analyze the posterior sample.
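
For example, the returned mcmc object can be passed directly to coda's diagnostic and summary functions (posterior denotes any object returned by MCMCregress; a few common calls are shown):

library(coda)
effectiveSize(posterior)   # effective sample size per parameter
geweke.diag(posterior)     # Geweke convergence diagnostic
autocorr.plot(posterior)   # within-chain autocorrelation
HPDinterval(posterior)     # highest posterior density intervals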

The model takes the following form:

y_i = x_i' \beta + \varepsilon_i

Where the errors are assumed to be Gaussian:

\varepsilon_i \sim \mathcal{N}(0, \sigma^2)

We assume standard, semi-conjugate priors:

\beta \sim \mathcal{N}(b_0, B_0^{-1})

And:

\sigma^{-2} \sim \mathcal{G}amma(c_0/2, d_0/2)

Where β and σ^{-2} are assumed a priori independent. Note that only starting values for β are allowed because simulation is done using Gibbs sampling with the conditional error variance as the first block in the sampler.
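
To make the blocking concrete, the following is a minimal pure-R sketch of this two-block Gibbs sampler (illustrative only; the packaged sampler runs in compiled C++ and also handles thinning, verbose output, and marginal likelihoods). Here b0 is a length-k prior mean vector and B0 a k x k prior precision matrix:

gibbs_regress <- function(y, X, b0, B0, c0, d0, burnin = 1000, mcmc = 10000) {
  k <- ncol(X); n <- length(y)
  beta  <- solve(crossprod(X), crossprod(X, y))   # OLS starting values for beta
  draws <- matrix(NA_real_, mcmc, k + 1)
  for (iter in seq_len(burnin + mcmc)) {
    ## Block 1: sigma^{-2} | beta, y ~ Gamma((c0 + n)/2, rate = (d0 + SSE)/2)
    resid    <- y - X %*% beta
    sig2.inv <- rgamma(1, (c0 + n) / 2, rate = (d0 + sum(resid^2)) / 2)
    ## Block 2: beta | sigma^2, y ~ N(bn, Bn^{-1}) with
    ##   Bn = B0 + sigma^{-2} X'X  and  bn = Bn^{-1} (B0 b0 + sigma^{-2} X'y)
    Bn   <- B0 + sig2.inv * crossprod(X)
    bn   <- solve(Bn, B0 %*% b0 + sig2.inv * crossprod(X, y))
    beta <- bn + backsolve(chol(Bn), rnorm(k))
    if (iter > burnin) draws[iter - burnin, ] <- c(beta, 1 / sig2.inv)
  }
  colnames(draws) <- c(paste0("beta.", seq_len(k)), "sigma2")
  coda::mcmc(draws, start = burnin + 1)
}

With the line data from the Examples below, a call such as gibbs_regress(line$Y, cbind(1, line$X), b0 = rep(0, 2), B0 = diag(0.1, 2), c0 = 6, d0 = 20) should give draws comparable to the corresponding MCMCregress call.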

Value

An mcmc object that contains the posterior sample. This object can be summarized by functions provided by the coda package.

References

Andrew D. Martin, Kevin M. Quinn, and Jong Hee Park. 2011. “MCMCpack: Markov Chain Monte Carlo in R.” Journal of Statistical Software. 42(9): 1-21. https://www.jstatsoft.org/v42/i09/.

Siddhartha Chib. 1995. “Marginal Likelihood from the Gibbs Output.” Journal of the American Statistical Association. 90: 1313-1321.

Robert E. Kass and Adrian E. Raftery. 1995. “Bayes Factors.” Journal of the American Statistical Association. 90: 773-795.

Daniel Pemstein, Kevin M. Quinn, and Andrew D. Martin. 2007. Scythe Statistical Library 1.0. http://scythe.lsa.umich.edu.

Martyn Plummer, Nicky Best, Kate Cowles, and Karen Vines. 2006. “Output Analysis and Diagnostics for MCMC (CODA).” R News. 6(1): 7-11. https://CRAN.R-project.org/doc/Rnews/Rnews_2006-1.pdf.

See Also

Examples

## Not run: 
library(MCMCpack)   # also attaches coda

line      <- list(X = c(-2, -1, 0, 1, 2), Y = c(1, 3, 3, 3, 5))
posterior <- MCMCregress(Y ~ X, b0 = 0, B0 = 0.1,
                         sigma.mu = 5, sigma.var = 25,
                         data = line, verbose = 1000)

plot(posterior)           # trace and density plots
raftery.diag(posterior)   # Raftery-Lewis run-length diagnostic
summary(posterior)        # posterior summaries

## End(Not run)

MCMCpack

Markov Chain Monte Carlo (MCMC) Package

Version: 1.5-0
License: GPL-3
Initial release: 2021-01-19
Authors: Andrew D. Martin [aut], Kevin M. Quinn [aut], Jong Hee Park [aut, cre], Ghislain Vieilledent [ctb], Michael Malecki [ctb], Matthew Blackwell [ctb], Keith Poole [ctb], Craig Reed [ctb], Ben Goodrich [ctb], Ross Ihaka [cph], The R Development Core Team [cph], The R Foundation [cph], Pierre L'Ecuyer [cph], Makoto Matsumoto [cph], Takuji Nishimura [cph]
