
xgb.shap.data

Prepare data for SHAP plots


Description

Prepares data for SHAP plots, to be used in xgb.plot.shap, xgb.plot.shap.summary, etc. This is an internal utility function.

Usage

xgb.shap.data(
  data,
  shap_contrib = NULL,
  features = NULL,
  top_n = 1,
  model = NULL,
  trees = NULL,
  target_class = NULL,
  approxcontrib = FALSE,
  subsample = NULL,
  max_observations = 1e+05
)

Arguments

data

The input data, as a matrix or dgCMatrix.

shap_contrib

A matrix of SHAP contributions computed earlier for the above data. When NULL, it is computed internally from model and data.

features

A vector of either column indices or feature names to plot. When NULL, feature importance is calculated and the top_n highest-ranked features are taken.

top_n

When features is NULL, the top_n most important features in the model are taken; top_n must be in [1, 100].

model

An xgb.Booster model. It must be provided when either shap_contrib or features is missing.

trees

Passed to xgb.importance when features = NULL.

target_class

Only relevant for multiclass models. When set to a 0-based class index, only SHAP contributions for that class are used. If not set, SHAP importances are averaged over all classes.

approxcontrib

Passed to predict.xgb.Booster when shap_contrib = NULL.

subsample

A random fraction of data points to use for plotting. When NULL, it is set so that up to max_observations data points are used.

max_observations

The maximum number of data points to use when subsample is NULL (default 1e+05, i.e. 100K).

Value

A list containing:

'data': a matrix containing the sampled observations and their feature values.

'shap_contrib': a matrix containing the SHAP contribution values for these observations.
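Because xgb.shap.data() is an internal utility, it is not exported from the package namespace and must be reached via the ::: operator. A minimal sketch of its use on the bundled agaricus data (assuming the xgboost package is installed; the internal signature may change between versions):

```r
library(xgboost)

# Train a small model on the bundled mushroom dataset
data(agaricus.train, package = "xgboost")
bst <- xgboost(
  data = agaricus.train$data, label = agaricus.train$label,
  nrounds = 5, objective = "binary:logistic", verbose = 0
)

# Call the internal helper directly: with features = NULL,
# it selects the top_n most important features
shap_data <- xgboost:::xgb.shap.data(
  data = agaricus.train$data,
  model = bst,
  top_n = 3
)

str(shap_data$data)          # sampled observations, selected features only
str(shap_data$shap_contrib)  # matching SHAP contribution values
```

In normal use this preparation happens implicitly inside xgb.plot.shap() and xgb.plot.shap.summary(), which accept the same data/model/top_n arguments.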


xgboost

Extreme Gradient Boosting

v1.4.1.1
Apache License (== 2.0) | file LICENSE
Authors
Tianqi Chen [aut], Tong He [aut, cre], Michael Benesty [aut], Vadim Khotilovich [aut], Yuan Tang [aut] (<https://orcid.org/0000-0001-5243-233X>), Hyunsu Cho [aut], Kailong Chen [aut], Rory Mitchell [aut], Ignacio Cano [aut], Tianyi Zhou [aut], Mu Li [aut], Junyuan Xie [aut], Min Lin [aut], Yifeng Geng [aut], Yutian Li [aut], XGBoost contributors [cph] (base XGBoost implementation)
Initial release
2021-04-22
