Boosted trees via xgboost
xgb_train() is a wrapper for xgboost tree-based models where all of the model arguments are in the main function.
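Because the arguments are supplied directly rather than through a params list as in xgboost::xgb.train(), a call reads like an ordinary R function. A minimal sketch, assuming xgb_train() comes from the parsnip package and using mtcars purely for illustration:

library(parsnip)  # assumed home of xgb_train()

# Model arguments are passed directly in the main call; mtcars and the
# specific values below are illustrative only.
fit <- xgb_train(
  x = as.matrix(mtcars[, -1]),  # predictors as a numeric matrix
  y = mtcars$mpg,               # numeric outcome
  max_depth = 3,
  eta = 0.1,
  nrounds = 50
)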
Usage

xgb_train(
  x,
  y,
  max_depth = 6,
  nrounds = 15,
  eta = 0.3,
  colsample_bytree = 1,
  min_child_weight = 1,
  gamma = 0,
  subsample = 1,
  validation = 0,
  early_stop = NULL,
  objective = NULL,
  ...
)
Arguments

x
  A data frame or matrix of predictors.

y
  A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth
  An integer for the maximum depth of the tree.

nrounds
  An integer for the number of boosting iterations.

eta
  A numeric value between zero and one to control the learning rate.

colsample_bytree
  Subsampling proportion of columns.

min_child_weight
  A numeric value for the minimum sum of instance weights needed in a child to continue to split.

gamma
  A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.

subsample
  Subsampling proportion of rows.
validation
  A positive number. If on [0, 1), the value is the random proportion of data in x and y that are used for performance assessment and potential early stopping.

early_stop
  An integer or NULL. If not NULL, it is the number of training iterations without improvement before stopping. If validation is used, performance is based on the validation set; otherwise, the training set is used. See the sketch after this argument list.

objective
  A single string (or NULL) that defines the loss function that xgboost uses to create trees.

...
  Other options to pass to xgboost::xgb.train().
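To make the interplay between validation and early_stop concrete, a hedged sketch reusing the illustrative mtcars setup from above:

# Hold out a random 20% of rows for performance assessment and stop once
# 10 consecutive iterations fail to improve on that holdout.
fit <- xgb_train(
  x = as.matrix(mtcars[, -1]),
  y = mtcars$mpg,
  nrounds = 500,     # upper bound on boosting iterations
  validation = 0.2,  # proportion of data held out for assessment
  early_stop = 10    # iterations without improvement before stopping
)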
Value

A fitted xgboost object.
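Since the return value is an xgboost booster, predictions come from xgboost's predict() method. Continuing the sketches above (the newdata shown here is the training matrix, purely for illustration):

# predict() dispatches to xgboost's method for fitted boosters; newdata
# must have the same columns as the training predictors.
preds <- predict(fit, newdata = as.matrix(mtcars[, -1]))
head(preds)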