Functional DD-Classifier
Trains the functional DD-classifier
ddalphaf.train(dataf, labels, subset,
               adc.args = list(instance = "avr",
                               numFcn = -1,
                               numDer = -1),
               classifier.type = c("ddalpha", "maxdepth", "knnaff", "lda", "qda"),
               cv.complete = FALSE,
               maxNumIntervals = min(25, ceiling(length(dataf[[1]]$args)/2)),
               seed = 0,
               ...)
dataf |
list containing lists (functions) of two vectors of equal length, named "args" and "vals": the arguments sorted in ascending order and the corresponding values, respectively |
labels |
list of output labels of the functional observations |
subset |
an optional vector specifying a subset of observations to be used in training the classifier. |
adc.args |
represents a function sample as a multidimensional one (dimension = numFcn + numDer), either averaging (instance = "avr") or evaluating (instance = "val") the function values and their derivatives on intervals. Set numFcn = -1 and numDer = -1 (the defaults) to choose both dimensions by cross-validation. Set adc.args to a list of such argument lists to cross-validate over those variants only (see the examples below) |
classifier.type |
the classifier that is applied in the transformed space. The default value is "ddalpha". |
cv.complete |
TRUE: apply complete cross-validation; FALSE: restrict the cross-validation range using the Vapnik-Chervonenkis bound |
maxNumIntervals |
maximal number of intervals for cross-validation ( max(numFcn + numDer) = maxNumIntervals ) |
seed |
the random seed. The default value is seed = 0 |
... |
additional parameters, passed to the classifier selected with the parameter classifier.type |
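The contract on the dataf argument above can be made concrete with a small illustration. The following Python sketch (a hypothetical helper, not part of the ddalpha package) represents each functional observation as a dict with "args" and "vals" keys and checks the two stated requirements: equal length and ascending arguments.

```python
# Illustrative sketch only: validate the dataf contract described above.
# Each observation must have equal-length "args"/"vals" vectors, with
# "args" sorted in ascending order. (check_dataf is a hypothetical name.)

def check_dataf(dataf):
    """Return True if every observation satisfies the dataf contract."""
    for f in dataf:
        if len(f["args"]) != len(f["vals"]):
            raise ValueError("args and vals must have equal length")
        if any(a > b for a, b in zip(f["args"], f["args"][1:])):
            raise ValueError("args must be sorted in ascending order")
    return True

# a tiny two-observation sample: f(x) = x^2 and f(x) = 2x sampled on {0, 1, 2}
sample = [
    {"args": [0, 1, 2], "vals": [0, 1, 4]},
    {"args": [0, 1, 2], "vals": [0, 2, 4]},
]
```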
The functional DD-classifier is a fast nonparametric procedure for classifying functional data. It consists of a two-step transformation of the original data plus a classifier operating on a low-dimensional hypercube. The functional data are first mapped into a finite-dimensional location-slope space and then transformed by a multivariate depth function into the DD-plot, which is a subset of the unit hypercube. This transformation yields a new notion of depth for functional data. Three alternative depth functions are employed for this, as well as two rules for the final classification. The resulting classifier is cross-validated over a small range of parameters only, which is restricted by a Vapnik-Chervonenkis bound. The entire methodology does not involve smoothing techniques, is completely nonparametric, and can achieve Bayes optimality under standard distributional settings. It is robust and efficiently computable.
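The first step above, mapping each function into the finite-dimensional location-slope space, can be sketched as follows. This is a simplified Python illustration under assumed semantics (with instance = "avr", the function values and a finite-difference first derivative are averaged over numFcn and numDer equal subintervals); avr_transform is a hypothetical name, and the package's actual discretization may differ in detail.

```python
# Illustrative sketch (not the package's implementation): the "avr" instance
# maps one sampled function to a (num_fcn + num_der)-dimensional vector by
# averaging its values over num_fcn equal intervals and its finite-difference
# derivative over num_der equal intervals.

def avr_transform(args, vals, num_fcn, num_der):
    """Location-slope discretization of one function (hypothetical helper)."""
    n = len(args)
    # finite-difference derivative, attached to interval midpoints
    dargs = [(args[i] + args[i + 1]) / 2 for i in range(n - 1)]
    dvals = [(vals[i + 1] - vals[i]) / (args[i + 1] - args[i])
             for i in range(n - 1)]

    def interval_means(xs, ys, k):
        # average ys over k equal-width intervals covering [xs[0], xs[-1]];
        # assumes each interval contains at least one sample point, and
        # boundary points are shared between adjacent intervals in this sketch
        lo, hi = xs[0], xs[-1]
        width = (hi - lo) / k
        means = []
        for j in range(k):
            a, b = lo + j * width, lo + (j + 1) * width
            pts = [y for x, y in zip(xs, ys) if a <= x <= b]
            means.append(sum(pts) / len(pts))
        return means

    # location part (averaged values) followed by slope part (averaged slopes)
    return interval_means(args, vals, num_fcn) + interval_means(dargs, dvals, num_der)
```

For the linear function f(x) = 2x sampled on [0, 1], the location average is 1 and the slope average is 2, so the function is represented by the point (1, 2) in a two-dimensional location-slope space.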
Trained functional DD-classifier
Mosler, K. and Mozharovskyi, P. (2017). Fast DD-classification of functional data. Statistical Papers 58 1055–1089.
Mozharovskyi, P. (2015). Contributions to Depth-based Classification and Computation of the Tukey Depth. Verlag Dr. Kovac (Hamburg).
ddalphaf.classify
for classification using the functional DDα-classifier,
compclassf.train
to train the functional componentwise classifier,
dataf.*
for functional data sets included in the package.
## Not run: 
## load the Growth dataset
dataf = dataf.growth()
learn = c(head(dataf$dataf, 49), tail(dataf$dataf, 34))
labels = c(head(dataf$labels, 49), tail(dataf$labels, 34))
test = tail(head(dataf$dataf, 59), 10)   # elements 50:59. 5 girls, 5 boys

# cross-validate over all variants up to dimension 3
c1 = ddalphaf.train(learn, labels, classifier.type = "ddalpha",
                    maxNumIntervals = 3)

classified1 = ddalphaf.classify(c1, test)
print(unlist(classified1))
print(c1$adc.args)

# cross-validate over these two variants
c2 = ddalphaf.train(learn, labels, classifier.type = "ddalpha",
                    adc.args = list(
                      list(instance = "avr", numFcn = 1, numDer = 2),
                      list(instance = "avr", numFcn = 0, numDer = 2)))

classified2 = ddalphaf.classify(c2, test)
print(unlist(classified2))
print(c2$adc.args)
## End(Not run)