RMSProp optimizer
optimizer_rmsprop(
  lr = 0.001,
  rho = 0.9,
  epsilon = NULL,
  decay = 0,
  clipnorm = NULL,
  clipvalue = NULL
)
lr        | float >= 0. Learning rate.
rho       | float >= 0. Decay factor.
epsilon   | float >= 0. Fuzz factor. If NULL, defaults to k_epsilon().
decay     | float >= 0. Learning rate decay over each update.
clipnorm  | Gradients will be clipped when their L2 norm exceeds this value.
clipvalue | Gradients will be clipped when their absolute value exceeds this value.
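To make the roles of rho and epsilon concrete, here is a minimal plain-R sketch of the standard RMSProp update for a single parameter vector. This is illustrative only, not the package's implementation (which runs in the backend):

    # One RMSProp step. acc is the running average of squared gradients.
    rmsprop_step <- function(param, grad, acc,
                             lr = 0.001, rho = 0.9, epsilon = 1e-7) {
      acc <- rho * acc + (1 - rho) * grad^2               # decayed squared-gradient average
      param <- param - lr * grad / (sqrt(acc) + epsilon)  # update scaled by RMS of gradients
      list(param = param, acc = acc)
    }

Larger rho values give a longer memory of past gradients; epsilon only guards against division by zero.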
It is recommended to leave the parameters of this optimizer at their default values (except the learning rate, which can be freely tuned).
This optimizer is usually a good choice for recurrent neural networks.
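A minimal usage sketch, assuming the keras R package is installed; the model architecture and layer sizes below are placeholders:

    library(keras)

    # Toy model; only the optimizer line is the point of the example.
    model <- keras_model_sequential() %>%
      layer_dense(units = 32, activation = "relu", input_shape = c(10)) %>%
      layer_dense(units = 1, activation = "sigmoid")

    model %>% compile(
      optimizer = optimizer_rmsprop(lr = 0.001),  # defaults for everything but lr
      loss = "binary_crossentropy",
      metrics = "accuracy"
    )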
Other optimizers:
optimizer_adadelta(), optimizer_adagrad(), optimizer_adamax(), optimizer_adam(), optimizer_nadam(), optimizer_sgd()