
depth.space.potential

Calculate Potential Space


Description

Calculates the representation of the training classes in potential space.

Usage

depth.space.potential(data, cardinalities, pretransform = "NMom", 
            kernel = "GKernel", kernel.bandwidth = NULL, mah.parMcd = 0.75)

Arguments

data

Matrix containing the training sample, where each row is a d-dimensional object and the objects of each class are kept together, so that the matrix can be thought of as consisting of class-wise blocks of objects.

cardinalities

Numerical vector of the cardinalities of the classes in data; each entry corresponds to one class.

pretransform

The method of data scaling; NULL to use the original data.

The data may be scaled jointly or separately:

1Mom or 1MCD for joint scaling of the classes,

NMom or NMCD for separate scaling of the classes.

The mean and covariance may be estimated with traditional moments or with the robust Minimum Covariance Determinant (MCD) method:

1Mom or NMom for scaling using traditional data moments,

1MCD or NMCD for scaling using robust MCD estimates (see the sketch below).
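
A sketch of the scaling idea in plain R (an illustration only, not the package's internal code; the helper function whiten and the toy classes are hypothetical):

# Illustration of joint vs. separate scaling with moment or MCD estimates
library(robustbase)   # covMcd() for robust location/scatter

class1 <- matrix(rnorm(100), ncol = 2)            # toy class 1 (50 x 2)
class2 <- matrix(rnorm(100, mean = 1), ncol = 2)  # toy class 2 (50 x 2)

whiten <- function(x, robust = FALSE, alpha = 0.75) {
  if (robust) {                                   # "*MCD": robust estimates
    est <- covMcd(x, alpha = alpha)
    mu <- est$center; S <- est$cov
  } else {                                        # "*Mom": traditional moments
    mu <- colMeans(x); S <- cov(x)
  }
  sweep(x, 2, mu) %*% solve(chol(S))              # center and decorrelate
}

scaled_jointly    <- whiten(rbind(class1, class2))          # "1Mom"-style
scaled_separately <- rbind(whiten(class1), whiten(class2))  # "NMom"-style
# robust = TRUE corresponds to the "1MCD"/"NMCD" variants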

kernel

"EDKernel" for the kernel of type 1/(1+kernel.bandwidth*EuclidianDistance2(x, y)),

"GKernel" [default and recommended] for the simple Gaussian kernel,

"EKernel" exponential kernel: exp(-kernel.bandwidth*EuclidianDistance(x, y)),

"VarGKernel" variable Gaussian kernel, where kernel.bandwidth is proportional to the depth.zonoid of a point.

kernel.bandwidth

The bandwidth parameter of the kernel. If NULL, Scott's rule of thumb is used. May be a single value for all classes, or a vector of values, one for each class (see the note below).
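
A usage note on the two forms, as a self-contained sketch with arbitrary bandwidth values and a hypothetical toy sample:

library(ddalpha)
# Toy two-class sample: 30 bivariate points per class
data <- rbind(matrix(rnorm(60), ncol = 2),
              matrix(rnorm(60, mean = 1), ncol = 2))
# One bandwidth shared by both classes
ds1 <- depth.space.potential(data, c(30, 30), kernel.bandwidth = 0.5)
# One bandwidth per class, in the same order as cardinalities
ds2 <- depth.space.potential(data, c(30, 30), kernel.bandwidth = c(0.5, 1))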

mah.parMcd

The value of the argument alpha for the function covMcd; used when pretransform = "1MCD" or "NMCD".

Details

The potential representation is calculated in the same way as in depth.potential; see the References below for more information and details.
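
A schematic sketch of the idea (illustration only, not the package's internal code): the potential of a point with respect to a class is taken here as an average of Gaussian kernel values between the point and that class's training points; the pretransform step and the exact normalization used by depth.potential are omitted, and the Gaussian form is assumed.

# Potential of a point w.r.t. each class as an average of kernel values
g_kernel <- function(x, y, a) exp(-a * sum((x - y)^2))   # assumed Gaussian form

potential <- function(x, class_points, a = 1)
  mean(apply(class_points, 1, function(y) g_kernel(x, y, a)))

class1 <- matrix(rnorm(100), ncol = 2)            # toy class 1
class2 <- matrix(rnorm(100, mean = 1), ncol = 2)  # toy class 2

x <- c(0, 0)
c(potential(x, class1), potential(x, class2))     # one coordinate per class,
                                                  # as in the returned matrix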

Value

Matrix of objects; each object (row) is represented via its potentials (columns) w.r.t. each of the classes of the training sample. The order of the classes in the columns corresponds to their order in the argument cardinalities.

References

Aizerman, M.A., Braverman, E.M., and Rozonoer, L.I. (1970). The Method of Potential Functions in the Theory of Machine Learning. Nauka (Moscow).

Pokotylo, O. and Mosler, K. (2015). Classification with the pot-pot plot. Mimeo.

See Also

ddalpha.train and ddalpha.classify for application, depth.potential for calculation of the potential.

Examples

library(ddalpha)
library(MASS)   # mvrnorm() used to generate the sample

# Generate a bivariate normal location-shift classification task
# containing 100 training objects (50 per class)
class1 <- mvrnorm(50, c(0,0), 
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
class2 <- mvrnorm(50, c(1,1), 
                  matrix(c(1,1,1,4), nrow = 2, ncol = 2, byrow = TRUE))
data <- rbind(class1, class2)
plot(data, col = c(rep(1,50), rep(2,50)))
# potential with rule-of-thumb bandwidth
ds <- depth.space.potential(data, c(50, 50))
# draw.ddplot(depth.space = ds, cardinalities = c(50, 50))

# potential with bandwidth = 0.5 and joint scaling
ds <- depth.space.potential(data, c(50, 50), kernel.bandwidth = 0.5,
                            pretransform = "1Mom")
# draw.ddplot(depth.space = ds, cardinalities = c(50, 50))

# potential with bandwidth = 0.5 and separate scaling
ds <- depth.space.potential(data, c(50, 50), kernel.bandwidth = 0.5, 
                            pretransform = "NMom") # or without pretransform
# draw.ddplot(depth.space = ds, cardinalities = c(50, 50))

data <- getdata("hemophilia")
cardinalities <- c(sum(data$gr == "normal"), sum(data$gr == "carrier"))
ds <- depth.space.potential(data[, 1:2], cardinalities)
# draw.ddplot(depth.space = ds, cardinalities = cardinalities)

ddalpha

Depth-Based Classification and Calculation of Data Depth

Version: 1.3.11
License: GPL-2
Authors: Oleksii Pokotylo [aut, cre], Pavlo Mozharovskyi [aut], Rainer Dyckerhoff [aut], Stanislav Nagy [aut]
Initial release: 2020-01-09
