Apply (Smoothed) Rectified Linear Transformation
step_relu() creates a specification of a recipe step that will apply the rectified linear or softplus transformations to numeric data. The transformed data is added as new columns to the data matrix.
step_relu(
  recipe,
  ...,
  role = "predictor",
  trained = FALSE,
  shift = 0,
  reverse = FALSE,
  smooth = FALSE,
  prefix = "right_relu_",
  columns = NULL,
  skip = FALSE,
  id = rand_id("relu")
)

## S3 method for class 'step_relu'
tidy(x, ...)
recipe: A recipe object. The step will be added to the sequence of operations for this recipe.

...: One or more selector functions to choose which variables are affected by the step. See selections() for more details.

role: Defaults to "predictor".

trained: A logical to indicate if the quantities for preprocessing have been estimated.

shift: A numeric value dictating a translation to apply to the data.

reverse: A logical to indicate if the left hinge should be used as opposed to the right hinge (see the sketch after this list).

smooth: A logical indicating if the softplus function, a smooth approximation to the rectified linear transformation, should be used.

prefix: A prefix for generated column names; defaults to "right_relu_" for the right hinge transformation and "left_relu_" for reversed/left hinge transformations.

columns: A character string of variable names that will be populated (eventually) by the terms argument.

skip: A logical. Should the step be skipped when the recipe is baked by bake()?

id: A character string that is unique to this step to identify it.

x: A step_relu object.
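As a hedged illustration of how the shift, reverse, and smooth arguments combine, a minimal sketch is shown below; it is not taken from the package documentation and uses the built-in mtcars data purely for demonstration:

library(recipes)

# Left hinge (reverse = TRUE) with a softplus smoothing (smooth = TRUE),
# shifted by 200 units of displacement.
rec_smooth <- recipe(mpg ~ disp + hp, data = mtcars) %>%
  step_relu(disp, shift = 200, reverse = TRUE, smooth = TRUE)

prepped <- prep(rec_smooth, training = mtcars)
bake(prepped, new_data = mtcars)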
The rectified linear transformation is calculated as

  max(0, x - c)

and is also known as the ReLU or right hinge function, where c is the value of the shift argument. If reverse is true, then the transformation is reflected about the y-axis:

  max(0, c - x)

Setting the smooth option to true will instead calculate a smooth approximation to ReLU, the softplus function:

  ln(1 + e^(x - c))

The reverse argument may also be applied to this transformation.
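As an illustration of the formulas above (this is not package code), the three variants can be computed directly on a numeric vector; c_shift below plays the role of the shift argument:

x <- c(-2, -1, 0, 1, 2)
c_shift <- 0.5
right_relu <- pmax(0, x - c_shift)        # rectified linear / right hinge
left_relu  <- pmax(0, c_shift - x)        # reverse = TRUE: left hinge
softplus   <- log(1 + exp(x - c_shift))   # smooth = TRUE: softplus approximation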
An updated version of recipe with the new step added to the sequence of existing steps (if any).
The rectified linear transformation is used in Multivariate Adaptive Regression Splines (MARS) as a basis function to fit piecewise linear functions to data, in a strategy similar to that employed in tree-based models. The transformation is a popular choice as an activation function in many neural networks, which could then be seen as a stacked generalization of MARS when making use of ReLU activations. The hinge function also appears in the loss function of Support Vector Machines, where it penalizes residuals only if they are within a certain margin of the decision boundary.
library(recipes)
library(modeldata)
data(biomass)

biomass_tr <- biomass[biomass$dataset == "Training", ]
biomass_te <- biomass[biomass$dataset == "Testing", ]

rec <- recipe(
  HHV ~ carbon + hydrogen + oxygen + nitrogen + sulfur,
  data = biomass_tr
)

transformed_te <- rec %>%
  step_relu(carbon, shift = 40) %>%
  prep(biomass_tr) %>%
  bake(biomass_te)

transformed_te
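The tidy() method listed in the usage section can be used to inspect the step; a minimal sketch, assuming the objects created in the example above:

rec_prepped <- rec %>%
  step_relu(carbon, shift = 40) %>%
  prep(biomass_tr)

# tidy() on a prepped recipe; `number = 1` selects the first (and only) step.
tidy(rec_prepped, number = 1)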