Construction of Specific Learners for CVST
These methods construct a CVST.learner object suitable for the CVST method. These objects provide the common interface needed by the CV and fastCV methods. We provide kernel logistic regression, kernel ridge regression, support vector machines and support vector regression as fully functional implementation templates.
constructLearner(learn, predict)
constructKlogRegLearner()
constructKRRLearner()
constructSVMLearner()
constructSVRLearner()
learn: The learning method, which takes a data set and a list of parameters and returns the learned model.
predict: The prediction method, which takes a model and new data and returns the corresponding predictions.
The nu-SVM and nu-SVR are built on top of the corresponding implementations of the kernlab package (see reference). The parameter list for these implementations must contain an entry named kernel, giving the name of the kernel that should be used, an entry named nu specifying the nu parameter, and, for the nu-SVR, an entry named C giving the C parameter.
The KRR and KLR implementations also expect a kernel entry plus any further parameters needed to construct the kernel. Both methods expect a lambda parameter, and KLR additionally requires tol and maxiter entries in the parameter list.
Note that the lambda of KRR/KLR and the C parameter of SVR are scaled by the data set size to allow for comparable results in the fast CV loop.
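To keep results comparable under this scaling, the parameter lists in the examples below specify lambda and C relative to the data set size via getN(). A minimal sketch of that convention, using the package's noisySinc data generator:

```r
library(CVST)

ns = noisySinc(100)
# Specify lambda relative to the training set size: since the fast CV loop
# internally rescales lambda (KRR/KLR) and C (SVR) by the data set size,
# this keeps the effective regularization comparable across the growing
# subsets that fastCV evaluates.
p = list(kernel = "rbfdot", sigma = 100, lambda = .1 / getN(ns))
p$lambda
```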
Tammo Krueger <tammokrueger@googlemail.com>
Alexandros Karatzoglou, Alexandros Smola, Kurt Hornik, Achim Zeileis. kernlab - An S4 Package for Kernel Methods in R. Journal of Statistical Software, 11(9), Nov 2004. URL: http://www.jstatsoft.org/v11/i09.
Volker Roth. Probabilistic discriminative kernel classifiers for multi-class problems. In Proceedings of the 23rd DAGM-Symposium on Pattern Recognition, pages 246–253, 2001.
# SVM
ns = noisySine(100)
svm = constructSVMLearner()
p = list(kernel="rbfdot", sigma=100, nu=.1)
m = svm$learn(ns, p)
nsTest = noisySine(1000)
pred = svm$predict(m, nsTest)
sum(pred != nsTest$y) / getN(nsTest)

# Kernel logistic regression
klr = constructKlogRegLearner()
p = list(kernel="rbfdot", sigma=100, lambda=.1/getN(ns), tol=10e-6, maxiter=100)
m = klr$learn(ns, p)
pred = klr$predict(m, nsTest)
sum(pred != nsTest$y) / getN(nsTest)

# SVR
ns = noisySinc(100)
svr = constructSVRLearner()
p = list(kernel="rbfdot", sigma=100, nu=.1, C=1*getN(ns))
m = svr$learn(ns, p)
nsTest = noisySinc(1000)
pred = svr$predict(m, nsTest)
sum((pred - nsTest$y)^2) / getN(nsTest)

# Kernel ridge regression
krr = constructKRRLearner()
p = list(kernel="rbfdot", sigma=100, lambda=.1/getN(ns))
m = krr$learn(ns, p)
pred = krr$predict(m, nsTest)
sum((pred - nsTest$y)^2) / getN(nsTest)
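Beyond the provided templates, constructLearner can wrap user-supplied methods directly. A minimal sketch with a hypothetical constant-mean regressor (not part of the package) that follows the learn/predict interface described above:

```r
library(CVST)

# Hypothetical constant-mean baseline learner: learn() receives a CVST
# data object and a parameter list and returns the model; predict()
# receives the model and new data and returns one prediction per point.
meanLearner = constructLearner(
  learn = function(data, params) {
    list(mu = mean(data$y))          # the "model" is just the training mean
  },
  predict = function(model, newData) {
    rep(model$mu, getN(newData))     # predict the stored mean everywhere
  }
)

ns = noisySinc(100)
m = meanLearner$learn(ns, list())
pred = meanLearner$predict(m, ns)
sum((pred - ns$y)^2) / getN(ns)      # mean squared error of the baseline
```

Any learner built this way can be handed to CV or fastCV in place of the bundled templates.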