public class RmsProp extends Optimizer
RMSProp Optimizer.
Two versions of RMSProp are implemented.
If `centered = False`, the algorithm described in http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf by Tieleman and Hinton, 2012 is used.
If `centered = True`, the algorithm described in http://arxiv.org/pdf/1308.0850v5.pdf (38)-(45) by Alex Graves, 2013 is used instead.
The default is `centered = False`.
If `centered = False`, RMSProp updates the weights using:

\( var = rho * var + (1 - rho) * grad^2 \)

\( weight -= learning_rate * grad / (sqrt(var) + epsilon) \)
If `centered = True`, RMSProp updates the weights using:

\( mean = rho * mean + (1 - rho) * grad \)

\( var = rho * var + (1 - rho) * grad^2 \)

\( mom = momentum * mom - lr * grad / sqrt(var - mean^2 + epsilon) \)

\( weight += mom \)
Here grad represents the gradient; mean and var are the first- and second-order moment estimates (the running mean and uncentered variance); and mom is the momentum buffer.
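As a concrete illustration, both update rules can be sketched on plain `double` arrays. This is a simplified sketch of the formulas above, not DJL's actual implementation; the class name `RmsPropSketch` and all parameter names are illustrative.

```java
/** Illustrative sketch of the two RMSProp update rules (not DJL's implementation). */
public class RmsPropSketch {

    /** Non-centered RMSProp (Tieleman and Hinton, 2012). */
    public static void updateNonCentered(double[] weight, double[] grad, double[] var,
                                         double rho, double lr, double epsilon) {
        for (int i = 0; i < weight.length; i++) {
            // var = rho * var + (1 - rho) * grad^2
            var[i] = rho * var[i] + (1 - rho) * grad[i] * grad[i];
            // weight -= lr * grad / (sqrt(var) + epsilon)
            weight[i] -= lr * grad[i] / (Math.sqrt(var[i]) + epsilon);
        }
    }

    /** Centered RMSProp (Graves, 2013), which also tracks a running mean and momentum. */
    public static void updateCentered(double[] weight, double[] grad, double[] mean,
                                      double[] var, double[] mom,
                                      double rho, double lr, double momentum, double epsilon) {
        for (int i = 0; i < weight.length; i++) {
            // mean = rho * mean + (1 - rho) * grad
            mean[i] = rho * mean[i] + (1 - rho) * grad[i];
            // var = rho * var + (1 - rho) * grad^2
            var[i] = rho * var[i] + (1 - rho) * grad[i] * grad[i];
            // mom = momentum * mom - lr * grad / sqrt(var - mean^2 + epsilon)
            mom[i] = momentum * mom[i]
                    - lr * grad[i] / Math.sqrt(var[i] - mean[i] * mean[i] + epsilon);
            // weight += mom
            weight[i] += mom[i];
        }
    }
}
```

Centering subtracts the squared running mean from the second moment, so the denominator estimates the variance of the gradient rather than its raw magnitude.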
| Modifier and Type | Class and Description |
|---|---|
| `static class` | `RmsProp.Builder`: the Builder to construct an `RmsProp` object. |
Nested classes/interfaces inherited from class Optimizer: `Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>`

Fields inherited from class Optimizer: `clipGrad`, `rescaleGrad`

| Modifier | Constructor and Description |
|---|---|
| `protected` | `RmsProp(RmsProp.Builder builder)`: creates a new instance of the RMSProp optimizer. |
| Modifier and Type | Method and Description |
|---|---|
| `static RmsProp.Builder` | `builder()`: creates a builder to build an `RmsProp`. |
| `void` | `update(java.lang.String parameterId, NDArray weight, NDArray grad)`: updates the parameters according to the gradients. |
Methods inherited from class Optimizer: `adadelta`, `adagrad`, `adam`, `getWeightDecay`, `nag`, `rmsprop`, `sgd`, `updateCount`, `withDefaultState`

protected RmsProp(RmsProp.Builder builder)

Creates a new instance of the RMSProp optimizer.

Parameters: `builder` - the builder used to create a new instance of the RMSProp optimizer

public void update(java.lang.String parameterId, NDArray weight, NDArray grad)

Updates the parameters according to the gradients.

public static RmsProp.Builder builder()

Creates a builder to build an `RmsProp`.