gibbs.lcm {lca}    R Documentation

Gibbs Sampling for Posterior of Latent Class Model

Description

Use Gibbs sampling to explore the posterior of a Latent Class Model with Dirichlet priors.

Usage

gibbs.lcm(dat, H, N = 10000, n.thin = 1, n.burn = 1000,
          prior.theta = rep(1, H), prior.eta = NULL,
          start.theta = NULL, start.eta = NULL,
          na.ignore = FALSE, lls = TRUE, verbose = TRUE)

Arguments

dat an object of class freq.table containing the data to be analysed.
H number of latent classes.
N number of iterations in the main run.
n.thin thinning factor for the main run; only every n.thin-th iteration is saved, which reduces memory use for large N.
n.burn number of burn-in iterations, discarded before the main run.
prior.theta numeric vector of length H containing Dirichlet prior parameters for latent class proportions.
prior.eta array containing Dirichlet prior parameters for the other parameters (the class-conditional response probabilities for each item).
start.theta numeric vector of length H containing initial parameter values for latent class proportions.
start.eta numeric array containing initial values for the other parameters (the class-conditional response probabilities).
na.ignore logical - should missing values be ignored?
lls logical - should log-likelihood at each iteration be recorded?
verbose logical - should progress be sent to stdout?

Details

The function runs a Gibbs sampler on the posterior of a Latent Class Model, assuming Dirichlet priors on all parameters; since these priors are conjugate, the full conditional distributions are also Dirichlet, given the other variables. All prior parameters default to 1 if unspecified, so the posterior is proportional to the likelihood of the latent class model.
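
As an illustration of the conjugacy noted above (a sketch only, not the package's internal code), the full conditional for the class proportions is Dirichlet with the prior parameters incremented by the current class counts. The helper rdirichlet1 and the label vector z below are hypothetical:

## draw a single sample from a Dirichlet(a) distribution via independent gammas
rdirichlet1 <- function(a) {
  g <- rgamma(length(a), shape = a, rate = 1)
  g / sum(g)
}

## conditional update for the class proportions theta:
## prior Dirichlet(alpha) plus the counts of the current class labels z in 1..H
update.theta <- function(z, H, alpha = rep(1, H)) {
  rdirichlet1(alpha + tabulate(z, nbins = H))
}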

If na.ignore is FALSE (the default) then missing values are imputed as additional unknown quantities (much like the latent class labels), under the assumption that they are missing at random.
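
A hypothetical pair of calls, assuming dat is a freq.table containing some missing entries:

fit.impute <- gibbs.lcm(dat, H = 2)                    # NAs imputed at each iteration
fit.ignore <- gibbs.lcm(dat, H = 2, na.ignore = TRUE)  # NAs ignored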

Value

An object of class lcm.gibbs.

[[1]] a numeric matrix containing the sampled parameter values at each saved iteration. Each row represents a single saved iteration and each column a parameter, so there are N/n.thin rows in total.
H the number of latent classes, as supplied.
J the number of items.
K a numeric vector containing the number of possible responses to each item.
dat the data, as supplied.
ll a list containing the value of the log-likelihood at each iteration (recorded only if lls = TRUE).
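
For example, assuming out is a fitted object as in the Examples below and that the components above are stored as named list elements:

draws <- out[[1]]   # matrix of saved draws, with N/n.thin rows
dim(draws)
out$H               # number of latent classes
out$K               # number of possible responses per item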

Warning

The Gibbs sampler performs poorly for multimodal posteriors (which are common in Latent Class Models), and will likely fail to explore the entire space.
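
One pragmatic check (a sketch only, using the documented start.theta argument) is to run several shorter chains from dispersed starting values and compare their log-likelihood traces; chains that settle around clearly different values suggest separate modes:

chains <- lapply(1:3, function(i)
  gibbs.lcm(question, 3, N = 5000, start.theta = prop.table(runif(3))))
sapply(chains, function(ch) mean(unlist(ch$ll)))  # assumes ll is accessible by name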

Author(s)

Robin Evans

See Also

summary.lcm.gibbs

Examples

data(question)

out <- gibbs.lcm(question, 3, N = 1e4, n.burn = 1e3)

summary(out)
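
## further checks (assuming the components described under 'Value' are
## accessible by name):
colMeans(out[[1]])                 # crude posterior means of the parameters
plot(unlist(out$ll), type = "l",
     xlab = "iteration", ylab = "log-likelihood")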

[Package lca version 0.2 Index]