| Class | Description |
|---|---|
| Abs | Abs elementwise function |
| ACos | Arccosine elementwise function |
| ACosh | ACosh (inverse hyperbolic cosine) elementwise function |
| And | Boolean AND pairwise transform |
| ASin | Arcsin elementwise function |
| ASinh | Arcsinh (inverse hyperbolic sine) elementwise function |
| Assign | Assign op: x = y, with broadcast as required |
| ATan | Arc tangent elementwise function |
| ATan2 | Two-argument arc tangent (atan2) elementwise function |
| ATanh | ATanh (inverse hyperbolic tangent) elementwise function |
| BaseDynamicTransformOp | |
| BatchToSpace | N-dimensional batch-to-space operation |
| BinaryMinimalRelativeError | |
| BinaryRelativeError | |
| Ceil | Ceiling elementwise function |
| Constant | |
| Cos | Cosine elementwise function |
| Cosh | Hyperbolic cosine elementwise function |
| Cube | Cube (x^3) elementwise function |
| Dilation2D | Dilation2D op wrapper |
| DynamicPartition | Transforms a given input tensor into numPartitions partitions, as indicated by the indices in "partitions" |
| DynamicStitch | Inverse of DynamicPartition: interleaves the values from the input partitions into a single tensor, as indicated by the indices |
| ELU | ELU: Exponential Linear Unit (alpha=1.0). Introduced in: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015), http://arxiv.org/abs/1511.07289 |
| Erf | Gaussian error function (erf) elementwise function |
| Erfc | Complementary Gaussian error function (erfc), i.e. 1 - erf(x) |
| Exp | Element-wise exponential function |
| Expm1 | Element-wise exponential function minus 1, i.e. exp(x) - 1 |
| Fill | Fill an array of the given "shape" with the provided "value" |
| Floor | Floor elementwise function |
| HardSigmoid | HardSigmoid elementwise function |
| HardTanh | Hard tanh elementwise function |
| Histogram | |
| Identity | Identity function |
| InvertPermutation | Inverse of an index permutation |
| IsFinite | IsFinite elementwise function |
| IsInf | IsInf elementwise function |
| IsMax | Boolean mask marking the position of the maximum value, e.g. [1, 2, 3, 1] -> [0, 0, 1, 0] |
| IsNaN | IsNaN elementwise function |
| LeakyReLU | Leaky rectified linear unit |
| LegacyDropOut | DropOut implementation as Op. Note: this is the legacy DropOut implementation; consider using the op with the same opName from randomOps |
| LegacyDropOutInverted | Inverted DropOut implementation as Op. Note: this is the legacy DropOutInverted implementation; consider using the op with the same opName from randomOps |
| Log | Log elementwise function |
| Log1p | Log1p (log(1 + x)) elementwise function |
| LogSigmoid | LogSigmoid elementwise function |
| LogSigmoidDerivative | LogSigmoid derivative |
| LogSoftMax | Log(softmax(x)) |
| LogX | Logarithm with an arbitrary base |
| MatchConditionTransform | Elementwise boolean transform: 1 where the given condition matches, 0 otherwise |
| MatrixDiagPart | |
| MatrixSetDiag | |
| MaxOut | Max out activation: http://arxiv.org/pdf/1302.4389.pdf |
| Negative | Negative function |
| Not | Boolean NOT transform |
| OldAtan2Op | atan2 operation |
| OldIdentity | Identity function |
| OldReverse | OldReverse op |
| OldSoftMax | Softmax function: row_maxes = rowmaxes(input); diff = exp(input - row_maxes); output = diff / diff.rowSums(), where row_maxes is a row vector (the max for each row). Outputs a probability distribution. |
| OneMinus | 1 - input |
| Or | Boolean OR pairwise transform |
| Pow | Pow function |
| PowDerivative | Pow derivative: z = n * x^(n-1) |
| RationalTanh | Rational tanh approximation elementwise function, as described at https://github.com/deeplearning4j/libnd4j/issues/351 |
| Reciprocal | Reciprocal (1 / x) elementwise function |
| RectifedLinear | Rectified linear unit |
| RectifiedTanh | RectifiedTanh: essentially max(0, tanh(x)) |
| RelativeError | |
| Relu6 | Rectified linear unit 6, i.e. min(max(0, x), 6) |
| ReluLayer | Composed op: relu(mmul(X, W) + b) |
| ReplaceNans | Element-wise "replace NaN" implementation as Op |
| Reverse | |
| ReverseSequence | Reverses each slice along a sequence dimension, up to the given per-example sequence lengths |
| Rint | Rint (round to nearest integer) elementwise function |
| Round | Rounding function |
| RSqrt | RSqrt (1 / sqrt(x)) elementwise function |
| SELU | SELU activation function |
| Set | Set |
| SetRange | Set range to a particular set of values |
| Sigmoid | Sigmoid function |
| SigmoidDerivative | Sigmoid derivative |
| Sign | Signum function |
| Sin | Sine elementwise function |
| Sinh | Hyperbolic sine elementwise function |
| SoftMax | Softmax function: row_maxes = rowmaxes(input); diff = exp(input - row_maxes); output = diff / diff.rowSums(), where row_maxes is a row vector (the max for each row). Outputs a probability distribution. |
| SoftMaxDerivative | Softmax derivative |
| SoftPlus | |
| SoftSign | Softsign element-wise activation function |
| SpaceToBatch | N-dimensional space-to-batch operation |
| Sqrt | Sqrt function |
| Stabilize | Stabilization function; forces values to be within a range |
| Step | Unit step function |
| Swish | Swish function |
| SwishDerivative | Swish derivative |
| Tan | Tangent elementwise function |
| TanDerivative | Tan derivative elementwise function |
| Tanh | Tanh elementwise function |
| TanhDerivative | Tanh derivative |
| TimesOneMinus | If x is the input, the output is x * (1 - x) |
| Xor | Boolean XOR pairwise transform |
| XwPlusB | Composed op: mmul(X, W) + b |
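The SoftMax and OldSoftMax entries describe the max-subtraction trick (subtract each row's maximum before exponentiating, then normalize by the row sum). A minimal plain-Java sketch of that math, independent of the ND4J op classes listed above, illustrates why it matters: without the subtraction, `exp` overflows for large inputs.

```java
// Numerically stable row-wise softmax, mirroring the pseudocode in the
// SoftMax table entry: diff = exp(input - row_maxes); output = diff / rowSums.
public class StableSoftmax {
    static double[][] softmax(double[][] input) {
        double[][] out = new double[input.length][];
        for (int r = 0; r < input.length; r++) {
            // row_maxes: the maximum of this row
            double rowMax = Double.NEGATIVE_INFINITY;
            for (double v : input[r]) rowMax = Math.max(rowMax, v);
            // diff = exp(input - row_maxes); accumulate the row sum
            double[] diff = new double[input[r].length];
            double rowSum = 0.0;
            for (int c = 0; c < diff.length; c++) {
                diff[c] = Math.exp(input[r][c] - rowMax);
                rowSum += diff[c];
            }
            // normalize so each row is a probability distribution
            for (int c = 0; c < diff.length; c++) diff[c] /= rowSum;
            out[r] = diff;
        }
        return out;
    }

    public static void main(String[] args) {
        // The second row would overflow to NaN without the max subtraction.
        double[][] p = softmax(new double[][]{{1.0, 2.0, 3.0},
                                              {1000.0, 1000.0, 1000.0}});
        for (double[] row : p) {
            double s = 0;
            for (double v : row) s += v;
            System.out.printf("row sum = %.6f%n", s);
        }
    }
}
```

Each output row sums to 1; the ND4J SoftMax op applies the same stabilization internally.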
Copyright © 2018. All rights reserved.