| Package | Description |
|---|---|
| `org.tensorflow.op` | |
| `org.tensorflow.op.core` | |
| Modifier and Type | Method and Description |
|---|---|
| `<T> ApplyAdagradDA<T>` | `Ops.applyAdagradDA(Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<Long> globalStep, ApplyAdagradDA.Options... options)` Adds an ApplyAdagradDA operation to the graph. |
| Modifier and Type | Method and Description |
|---|---|
| `static <T> ApplyAdagradDA<T>` | `ApplyAdagradDA.create(Scope scope, Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<Long> globalStep, ApplyAdagradDA.Options... options)` Factory method to create a class wrapping a new ApplyAdagradDA operation. |
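The two entries above expose the same operation: `Ops.applyAdagradDA` is a convenience wrapper, while `ApplyAdagradDA.create` builds the op against an explicit `Scope`. As a rough illustration of what the op computes, the sketch below implements the Adagrad dual-averaging update for a single scalar weight in plain Java, with `l1 = l2 = 0` for simplicity. This is a hypothetical stand-alone illustration of the commonly described AdagradDA scheme, not the TensorFlow kernel, whose exact treatment of `l1`, `l2`, and `globalStep` is given by the raw op's documentation.

```java
// Plain-Java sketch of the AdagradDA (dual-averaging) update for one
// scalar weight. Illustrative only: the real op works on tensors and
// applies l1/l2 regularization scaled by globalStep.
public final class AdagradDaSketch {
    double var;         // the variable being optimized
    double gradAccum;   // running sum of gradients
    double gradSqAccum; // running sum of squared gradients

    // One training step, assuming l1 = l2 = 0.
    void apply(double grad, double lr) {
        gradAccum += grad;
        gradSqAccum += grad * grad;
        // Dual averaging: the variable is recomputed from the
        // accumulators rather than nudged from its previous value.
        var = -lr * gradAccum / Math.sqrt(gradSqAccum);
    }

    public static void main(String[] args) {
        AdagradDaSketch s = new AdagradDaSketch();
        s.apply(2.0, 0.1);
        // After one step: var = -0.1 * 2 / sqrt(4) = -0.1
        System.out.println(s.var);
    }
}
```

Note that because the variable is derived from the accumulators each step, repeated identical gradients keep shrinking the effective step size as `gradSqAccum` grows.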
Copyright © 2015–2019. All rights reserved.