public class ActivationELU extends BaseActivationFunction

ELU (Exponential Linear Unit) activation function: f(x) = x for x >= 0; f(x) = alpha * (exp(x) - 1) for x < 0.
| Modifier and Type | Field and Description |
|---|---|
| `static double` | `DEFAULT_ALPHA` |
| Constructor and Description |
|---|
| `ActivationELU()` |
| `ActivationELU(double alpha)` |
| Modifier and Type | Method and Description |
|---|---|
| `org.nd4j.linalg.primitives.Pair<INDArray,INDArray>` | `backprop(INDArray in, INDArray epsilon)` Backpropagate the errors through the activation function, given input z and epsilon dL/da. Returns 2 INDArrays: (a) the gradient dL/dz, calculated from dL/da, and (b) the parameter gradients dL/dw, where w is the weights in the activation function. |
| `INDArray` | `getActivation(INDArray in, boolean training)` Apply the activation function to the input array (usually known as 'preOut' or 'z'). Implementations must overwrite "in", transform it in place, and return "in". Can support separate behaviour during testing. |
| `String` | `toString()` |
Methods inherited from class BaseActivationFunction: getGradientViewArray, getParametersViewArray, numParams, setGradientViewArray, setParametersViewArray

public static final double DEFAULT_ALPHA
public ActivationELU()
public ActivationELU(double alpha)
public INDArray getActivation(INDArray in, boolean training)
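As a point of reference, the element-wise math that getActivation applies can be sketched in plain Java. This is an illustration only, not the library's implementation: the real class transforms an nd4j INDArray in place, and the class and method names below (EluForwardSketch, eluForward) are hypothetical.

```java
// Minimal sketch of the ELU forward pass, assuming the standard
// ELU definition: f(z) = z for z >= 0, f(z) = alpha*(e^z - 1) for z < 0.
public class EluForwardSketch {
    static double eluForward(double z, double alpha) {
        // Positive inputs pass through unchanged; negative inputs
        // saturate smoothly toward -alpha.
        return z >= 0 ? z : alpha * (Math.exp(z) - 1.0);
    }

    public static void main(String[] args) {
        double alpha = 1.0; // a common default for the no-arg constructor
        System.out.println(eluForward(2.0, alpha));  // positive input: unchanged
        System.out.println(eluForward(-1.0, alpha)); // alpha*(e^-1 - 1), about -0.632
    }
}
```

Note how the negative branch is bounded below by -alpha, which is what distinguishes ELU from ReLU's hard zero cutoff.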
Specified by: getActivation in interface IActivation

public org.nd4j.linalg.primitives.Pair<INDArray,INDArray> backprop(INDArray in, INDArray epsilon)
Specified by: backprop in interface IActivation

Parameters:
in - Input, before applying the activation function (z, or 'preOut')
epsilon - Gradient to be backpropagated: dL/da, where L is the loss function

Copyright © 2018. All rights reserved.
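The chain rule this method computes can be sketched element-wise in plain Java. This is a sketch under standard ELU assumptions, not the library's code: the real method operates on INDArrays, and since ELU has no trainable parameters, the parameter-gradient half of the returned Pair carries nothing of interest. All names here (EluBackpropSketch, eluGrad, eluBackprop) are hypothetical.

```java
// Sketch of ELU backprop: dL/dz = dL/da * f'(z), where
// f'(z) = 1 for z >= 0 and f'(z) = alpha * e^z for z < 0.
public class EluBackpropSketch {
    static double eluGrad(double z, double alpha) {
        return z >= 0 ? 1.0 : alpha * Math.exp(z);
    }

    static double[] eluBackprop(double[] z, double[] epsilon, double alpha) {
        double[] dLdz = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            // Chain rule: multiply incoming gradient by the local derivative.
            dLdz[i] = epsilon[i] * eluGrad(z[i], alpha);
        }
        return dLdz;
    }

    public static void main(String[] args) {
        double[] z = {1.5, -0.5};   // pre-activations ('preOut')
        double[] eps = {0.2, 0.2};  // incoming gradient dL/da
        double[] g = eluBackprop(z, eps, 1.0);
        System.out.println(g[0] + " " + g[1]);
    }
}
```

A useful sanity check: for z < 0, f'(z) = alpha * e^z equals f(z) + alpha, so the backward pass can be computed cheaply from the cached forward output.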