Assess performance of a 'glmnet' object using test data

Description

Given a test set, produce summary performance measures for the glmnet model(s).
Usage

assess.glmnet(
  object,
  newx = NULL,
  newy,
  weights = NULL,
  family = c("gaussian", "binomial", "poisson", "multinomial", "cox", "mgaussian"),
  ...
)

confusion.glmnet(object, newx = NULL, newy, family = c("binomial", "multinomial"), ...)

roc.glmnet(object, newx = NULL, newy, ...)
Arguments

object: Fitted 'glmnet' or 'cv.glmnet' object, or a matrix (or single
    vector) of predictions. For 'roc.glmnet' the model must be binomial,
    and for 'confusion.glmnet' binomial or multinomial.

newx: If predictions are to be made, these are the 'x' values. Required
    whenever 'object' is a fitted model rather than a set of predictions.

newy: Required argument for all functions; the new response values.

weights: Observation weights for the test observations.

family: The family of the model, in case predictions are passed in as
    'object'.

...: Additional arguments to 'predict' when 'object' is a fitted model and
    the predictions must be made; these allow, for example, 'offset' and
    other prediction parameters such as values of 'gamma' for 'relaxed'
    fits.
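To make the calling conventions concrete, here is a minimal sketch of the two
ways 'object' can be supplied, assuming (as in the Examples below) that
data(QuickStartExample) places 'x' and 'y' directly in the workspace:

library(glmnet)
data(QuickStartExample)   # assumed to provide x and y directly

fit = glmnet(x, y)

## 1. Pass a fitted model plus test data: predictions are made internally,
##    so 'newx' (and any prediction arguments via '...') are needed.
assess.glmnet(fit, newx = x, newy = y)

## 2. Pass predictions directly: no 'newx' is needed, but 'family' tells
##    assess.glmnet how to score them.
p = predict(fit, newx = x, s = 0.1)
assess.glmnet(p, newy = y, family = "gaussian")

## Test-set observation weights can be supplied via 'weights'
## (uniform weights here, purely for illustration).
assess.glmnet(fit, newx = x, newy = y, weights = rep(1, length(y)))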
Details

assess.glmnet produces all the different performance measures provided by
cv.glmnet for each of the families. A single vector or a matrix of
predictions can be provided, or fitted model objects or CV objects. When the
predictions are still to be made, the '...' arguments allow, for example,
'offsets' and other prediction parameters such as values for 'gamma' for
'relaxed' fits.

roc.glmnet produces, for a single vector of predictions, a two-column matrix
with columns TPR and FPR (true positive rate and false positive rate). This
object can be plotted to produce an ROC curve. If more than one set of
predictions is called for, a list of such matrices is produced.

confusion.glmnet produces a confusion matrix tabulating the classification
results. Again, either a single table or a list of tables is returned, each
with a print method.
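As a concrete illustration of the above (a sketch, again assuming the
BinomialExample data load 'x' and 'y' directly, as in the Examples below):

library(glmnet)
data(BinomialExample)

fit = glmnet(x, y, family = "binomial")

## roc.glmnet returns a list of two-column (TPR, FPR) matrices, one per
## lambda in the fit; plotting one element draws an ROC curve.
rocs = roc.glmnet(fit, newx = x, newy = y)
plot(rocs[[10]], type = "l")

## confusion.glmnet at a single value of lambda tabulates predicted versus
## true classes (a single table rather than a list; see Value).
confusion.glmnet(fit, newx = x, newy = y, s = 0.05)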
Value

assess.glmnet produces a list of vectors of measures. roc.glmnet produces a
list of 'roc' two-column matrices, and confusion.glmnet a list of tables. If
a single set of predictions is provided, or predictions are made from a CV
object, the latter two drop the list structure and return a single matrix or
table.
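One way to inspect this structure (a sketch, assuming the same
QuickStartExample layout as in the Examples):

library(glmnet)
data(QuickStartExample)

fit = glmnet(x, y)
out = assess.glmnet(fit, newx = x, newy = y)

names(out)       # the gaussian measures, e.g. "deviance", "mse", "mae"
length(out$mse)  # one value per lambda at which predictions were made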
Author(s)

Trevor Hastie and Rob Tibshirani

Maintainer: Trevor Hastie <hastie@stanford.edu>
See Also

cv.glmnet, glmnet.measures, and vignette("relax", package = "glmnet").
Examples

## Gaussian: assess a fitted model on held-out data, then score a matrix of
## predictions directly, and finally use prevalidated fits from cv.glmnet
data(QuickStartExample)
set.seed(11)
train = sample(seq(length(y)), 70, replace = FALSE)
fit1 = glmnet(x[train, ], y[train])
assess.glmnet(fit1, newx = x[-train, ], newy = y[-train])
preds = predict(fit1, newx = x[-train, ], s = c(1, 0.25))
assess.glmnet(preds, newy = y[-train], family = "gaussian")
fit1c = cv.glmnet(x, y, keep = TRUE)
fit1a = assess.glmnet(fit1c$fit.preval, newy = y, family = "gaussian")
plot(fit1c$lambda, fit1a$mae, log = "x",
     xlab = "Log Lambda", ylab = "Mean Absolute Error")
abline(v = fit1c$lambda.min, lty = 2, col = "red")

## Binomial: performance measures and ROC curves
data(BinomialExample)
fit2 = glmnet(x[train, ], y[train], family = "binomial")
assess.glmnet(fit2, newx = x[-train, ], newy = y[-train], s = 0.1)
plot(roc.glmnet(fit2, newx = x[-train, ], newy = y[-train])[[10]])
fit2c = cv.glmnet(x, y, family = "binomial", keep = TRUE)
idmin = match(fit2c$lambda.min, fit2c$lambda)
plot(roc.glmnet(fit2c$fit.preval, newy = y)[[idmin]])

## Multinomial: confusion matrices
data(MultinomialExample)
set.seed(103)
train = sample(seq(length(y)), 100, replace = FALSE)
fit3 = glmnet(x[train, ], y[train], family = "multinomial")
confusion.glmnet(fit3, newx = x[-train, ], newy = y[-train], s = 0.01)
fit3c = cv.glmnet(x, y, family = "multinomial", type.measure = "class", keep = TRUE)
idmin = match(fit3c$lambda.min, fit3c$lambda)
confusion.glmnet(fit3c$fit.preval, newy = y, family = "multinomial")[[idmin]]
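## The '...' pass-through described in Details can be sketched with a relaxed
## fit; this assumes the same QuickStartExample layout as above and simply
## forwards 'gamma' to the predict method (illustrative only).
data(QuickStartExample)
fitr = glmnet(x, y, relax = TRUE)
assess.glmnet(fitr, newx = x, newy = y, gamma = 0.5)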