MclustDA discriminant analysis
Discriminant analysis based on Gaussian finite mixture modeling.
MclustDA(data, class, G = NULL, modelNames = NULL,
         modelType = c("MclustDA", "EDDA"),
         prior = NULL, control = emControl(),
         initialization = NULL, warn = mclust.options("warn"),
         verbose = interactive(), ...)
data
A data frame or matrix giving the training data.
class
A vector giving the known class labels (either a numerical value or a character string) for the observations in the training data.
G
An integer vector specifying the numbers of mixture components (clusters) for which the BIC is to be calculated within each class. The default is G = 1:5.
modelNames
A vector of character strings indicating the models to be fitted by EM within each class (see the description in mclustModelNames).
modelType
A character string specifying whether the models given in modelNames should fit a different number of mixture components and covariance structures for each class ("MclustDA", the default) or should be constrained to a single Gaussian component per class with the same covariance structure among classes ("EDDA").
prior
The default assumes no prior, but this argument allows specification of a conjugate prior on the means and variances through the function priorControl.
control
A list of control parameters for EM. The defaults are set by the call emControl().
initialization
A list containing zero or more of the following components: hcPairs, a matrix of merge pairs for hierarchical clustering as produced by the function hc; subset, a logical or numeric vector specifying a subset of the data to be used in the initial hierarchical clustering phase.
warn
A logical value indicating whether or not certain warnings (usually related to singularity) should be issued when estimation fails. The default is controlled by mclust.options("warn").
verbose
A logical controlling if a text progress bar is displayed during the fitting procedure. By default it is TRUE if the R session is interactive, and FALSE otherwise.
...
Further arguments passed to or from other methods.
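As a minimal sketch of how these arguments fit together (the data and settings below are illustrative only, assuming the mclust package is attached; object names such as fit are not part of the function's interface):

library(mclust)
# Restrict the number of components per class, limit the candidate
# covariance models, and add a conjugate prior on the means and variances.
fit <- MclustDA(iris[, -5], class = iris$Species,
                G = 1:3,
                modelNames = c("EEE", "VVV"),
                prior = priorControl())
summary(fit)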
The "EDDA" method for discriminant analysis is described in Bensmail and Celeux (1996), while the "MclustDA" method is described in Fraley and Raftery (2002).
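For a quick illustration of the difference between the two model types, here is a minimal sketch on the iris data (the Examples section below explores this in more detail):

library(mclust)
# EDDA: a single Gaussian component per class with a common covariance structure
edda <- MclustDA(iris[, -5], iris$Species, modelType = "EDDA")
# MclustDA (default): number of components and covariance model chosen per class by BIC
mda <- MclustDA(iris[, -5], iris$Species)
summary(edda)
summary(mda)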
An object of class 'MclustDA' providing the optimal (according to BIC) mixture model.
The details of the output components are as follows:
call
The matched call.
data
The input data matrix.
class
The input class labels.
type
A character string specifying the model type fitted, either "MclustDA" or "EDDA".
models
A list with one component per class, each containing the mixture model fitted to the observations of that class.
n
The total number of observations in the data.
d
The dimension of the data.
bic
Optimal BIC value.
loglik
Log-likelihood for the selected model.
df
Number of estimated parameters.
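As an illustrative sketch (the fitted object and data here are only an example), the components listed above can be accessed directly from the returned object:

library(mclust)
fit <- MclustDA(iris[, -5], iris$Species)
fit$type            # model type used: "MclustDA" or "EDDA"
fit$bic             # optimal BIC value
fit$loglik          # log-likelihood of the selected model
fit$df              # number of estimated parameters
fit$n; fit$d        # number of observations and data dimension
length(fit$models)  # one fitted mixture model per class
fit$models[[1]]$G   # number of mixture components used for the first class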
Luca Scrucca
Scrucca L., Fop M., Murphy T. B. and Raftery A. E. (2016) mclust 5: clustering, classification and density estimation using Gaussian finite mixture models, The R Journal, 8/1, pp. 289-317.
Fraley C. and Raftery A. E. (2002) Model-based clustering, discriminant analysis and density estimation, Journal of the American Statistical Association, 97/458, pp. 611-631.
Fraley C., Raftery A. E., Murphy T. B. and Scrucca L. (2012) mclust Version 4 for R: Normal Mixture Modeling for Model-Based Clustering, Classification, and Density Estimation. Technical Report No. 597, Department of Statistics, University of Washington.
Bensmail H. and Celeux G. (1996) Regularized Gaussian discriminant analysis through eigenvalue decomposition, Journal of the American Statistical Association, 91, pp. 1743-1748.
odd <- seq(from = 1, to = nrow(iris), by = 2)
even <- odd + 1
X.train <- iris[odd, -5]
Class.train <- iris[odd, 5]
X.test <- iris[even, -5]
Class.test <- iris[even, 5]

# common EEE covariance structure (which is essentially equivalent to
# linear discriminant analysis)
irisMclustDA <- MclustDA(X.train, Class.train, modelType = "EDDA", modelNames = "EEE")
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

# common covariance structure selected by BIC
irisMclustDA <- MclustDA(X.train, Class.train, modelType = "EDDA")
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

# general covariance structure selected by BIC
irisMclustDA <- MclustDA(X.train, Class.train)
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

plot(irisMclustDA)
plot(irisMclustDA, dimens = 3:4)
plot(irisMclustDA, dimens = 4)

plot(irisMclustDA, what = "classification")
plot(irisMclustDA, what = "classification", newdata = X.test)
plot(irisMclustDA, what = "classification", dimens = 3:4)
plot(irisMclustDA, what = "classification", newdata = X.test, dimens = 3:4)
plot(irisMclustDA, what = "classification", dimens = 4)
plot(irisMclustDA, what = "classification", dimens = 4, newdata = X.test)

plot(irisMclustDA, what = "train&test", newdata = X.test)
plot(irisMclustDA, what = "train&test", newdata = X.test, dimens = 3:4)
plot(irisMclustDA, what = "train&test", newdata = X.test, dimens = 4)

plot(irisMclustDA, what = "error")
plot(irisMclustDA, what = "error", dimens = 3:4)
plot(irisMclustDA, what = "error", dimens = 4)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test, dimens = 3:4)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test, dimens = 4)

# simulated 1D data
n <- 250
set.seed(1)
triModal <- c(rnorm(n, -5), rnorm(n, 0), rnorm(n, 5))
triClass <- c(rep(1, n), rep(2, n), rep(3, n))
odd <- seq(from = 1, to = length(triModal), by = 2)
even <- odd + 1
triMclustDA <- MclustDA(triModal[odd], triClass[odd])
summary(triMclustDA, parameters = TRUE)
summary(triMclustDA, newdata = triModal[even], newclass = triClass[even])
plot(triMclustDA, what = "scatterplot")
plot(triMclustDA, what = "classification")
plot(triMclustDA, what = "classification", newdata = triModal[even])
plot(triMclustDA, what = "train&test", newdata = triModal[even])
plot(triMclustDA, what = "error")
plot(triMclustDA, what = "error", newdata = triModal[even], newclass = triClass[even])

# simulated 2D cross data
data(cross)
odd <- seq(from = 1, to = nrow(cross), by = 2)
even <- odd + 1
crossMclustDA <- MclustDA(cross[odd, -1], cross[odd, 1])
summary(crossMclustDA, parameters = TRUE)
summary(crossMclustDA, newdata = cross[even, -1], newclass = cross[even, 1])
plot(crossMclustDA, what = "scatterplot")
plot(crossMclustDA, what = "classification")
plot(crossMclustDA, what = "classification", newdata = cross[even, -1])
plot(crossMclustDA, what = "train&test", newdata = cross[even, -1])
plot(crossMclustDA, what = "error")
plot(crossMclustDA, what = "error", newdata = cross[even, -1], newclass = cross[even, 1])