General Interface for Single Layer Neural Network

mlp(), for multilayer perceptron, generates a specification of a model before fitting and allows the model to be created using different packages in R or via keras. The main arguments for the model are:
hidden_units: The number of units in the hidden layer (default: 5).

penalty: The amount of L2 regularization (a.k.a. weight decay; default: 0).

dropout: The proportion of parameters randomly dropped out of the model (keras only; default: 0).

epochs: The number of training iterations (default: 20).

activation: The type of function that connects the hidden layer and the input variables (keras only; default: softmax).
If parameters need to be modified, this function can be used in lieu of recreating the object from scratch.
mlp(
  mode = "unknown",
  hidden_units = NULL,
  penalty = NULL,
  dropout = NULL,
  epochs = NULL,
  activation = NULL
)

## S3 method for class 'mlp'
update(
  object,
  parameters = NULL,
  hidden_units = NULL,
  penalty = NULL,
  dropout = NULL,
  epochs = NULL,
  activation = NULL,
  fresh = FALSE,
  ...
)
mode: A single character string for the type of model. Possible values for this model are "unknown", "regression", or "classification".

hidden_units: An integer for the number of units in the hidden layer.

penalty: A non-negative numeric value for the amount of weight decay.

dropout: A number between 0 (inclusive) and 1 denoting the proportion of model parameters randomly set to zero during model training.

epochs: An integer for the number of training iterations.

activation: A single character string denoting the type of relationship between the original predictors and the hidden unit layer. The activation function between the hidden and output layers is automatically set to either "linear" or "softmax" depending on the type of outcome. Possible values are: "linear", "softmax", "relu", and "elu".

object: A multilayer perceptron model specification.

parameters: A 1-row tibble or named list with main parameters to update. If the individual arguments are used, these will supersede the values in parameters.

fresh: A logical for whether the arguments should be modified in-place or replaced wholesale.

...: Not used for update().
These arguments are converted to their specific names at the time that the model is fit. Other options and arguments can be set using set_engine(). If left to their defaults here (see above), the values are taken from the underlying model functions. One exception is hidden_units when nnet::nnet() is used; that function's size argument has no default, so a value of 5 units will be used. Also, unless otherwise specified, the linout argument to nnet::nnet() will be set to TRUE when a regression model is created.
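As a sketch of how this works in practice, engine-specific options can be supplied through set_engine() alongside the main arguments (assuming parsnip and nnet are installed; MaxNWts is an argument of nnet::nnet(), not a main parsnip argument):

```r
library(parsnip)

# Main arguments go in mlp(); engine-specific arguments such as
# nnet's MaxNWts go in set_engine(). Printing the translated
# specification shows how the names are converted at fit time.
spec <- mlp(hidden_units = 8, epochs = 100) %>%
  set_engine("nnet", MaxNWts = 5000) %>%
  set_mode("regression")

translate(spec)
```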
If parameters need to be modified, update()
can be used
in lieu of recreating the object from scratch.
The model can be fit with the fit() function, using the following engines:
R: "nnet"
(the default)
keras: "keras"
Engines may have pre-set default arguments when executing the model fit call. For this type of model, the templates of the fit calls are below:
mlp() %>%
  set_engine("keras") %>%
  set_mode("regression") %>%
  translate()

## Single Layer Neural Network Specification (regression)
## 
## Computational engine: keras 
## 
## Model fit template:
## parsnip::keras_mlp(x = missing_arg(), y = missing_arg())
mlp() %>%
  set_engine("keras") %>%
  set_mode("classification") %>%
  translate()

## Single Layer Neural Network Specification (classification)
## 
## Computational engine: keras 
## 
## Model fit template:
## parsnip::keras_mlp(x = missing_arg(), y = missing_arg())
An error is thrown if both penalty and dropout are specified for keras models.
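For instance, a sketch of a specification that would trigger this error, and a valid alternative (assuming parsnip and the keras engine are available):

```r
library(parsnip)

# penalty and dropout are mutually exclusive for the keras engine;
# per the note above, supplying both results in an error rather
# than a fitted model.
bad_spec <- mlp(penalty = 0.01, dropout = 0.2) %>%
  set_engine("keras") %>%
  set_mode("classification")

# Use only one of the two regularization methods instead:
ok_spec <- mlp(dropout = 0.2) %>%
  set_engine("keras") %>%
  set_mode("classification")
```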
mlp() %>%
  set_engine("nnet") %>%
  set_mode("regression") %>%
  translate()

## Single Layer Neural Network Specification (regression)
## 
## Main Arguments:
##   hidden_units = 5
## 
## Computational engine: nnet 
## 
## Model fit template:
## nnet::nnet(formula = missing_arg(), data = missing_arg(), weights = missing_arg(), 
##     size = 5, trace = FALSE, linout = TRUE)
mlp() %>%
  set_engine("nnet") %>%
  set_mode("classification") %>%
  translate()

## Single Layer Neural Network Specification (classification)
## 
## Main Arguments:
##   hidden_units = 5
## 
## Computational engine: nnet 
## 
## Model fit template:
## nnet::nnet(formula = missing_arg(), data = missing_arg(), weights = missing_arg(), 
##     size = 5, trace = FALSE, linout = FALSE)
The standardized parameter names in parsnip can be mapped to their original names in each engine that has main parameters. Each engine typically has a different default value (shown in parentheses) for each parameter.
parsnip      | keras                | nnet        |
hidden_units | hidden_units (5)     | size        |
penalty      | penalty (0)          | decay (0)   |
dropout      | dropout (0)          | NA          |
epochs       | epochs (20)          | maxit (100) |
activation   | activation (softmax) | NA          |
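This mapping can be observed in the translated fit call (a sketch, assuming parsnip is installed):

```r
library(parsnip)

# The standardized parsnip names penalty and epochs are renamed to
# nnet's decay and maxit in the printed model fit template.
mlp(penalty = 0.1, epochs = 50) %>%
  set_engine("nnet") %>%
  set_mode("classification") %>%
  translate()
```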
show_engines("mlp")

mlp(mode = "classification", penalty = 0.01)

# Parameters can be represented by a placeholder:
mlp(mode = "regression", hidden_units = varying())

model <- mlp(hidden_units = 10, dropout = 0.30)
model

update(model, hidden_units = 2)

update(model, hidden_units = 2, fresh = TRUE)