@Operator public final class ResourceApplyAdagradDA extends PrimitiveOp
| Modifier and Type | Class and Description |
|---|---|
| static class | ResourceApplyAdagradDA.Options: Optional attributes for the ResourceApplyAdagradDA operation |

| Modifier and Type | Method and Description |
|---|---|
| static <T> ResourceApplyAdagradDA | create(Scope scope, Operand<?> var, Operand<?> gradientAccumulator, Operand<?> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<Long> globalStep, ResourceApplyAdagradDA.Options... options): Factory method to create a class wrapping a new ResourceApplyAdagradDA operation added to the graph. |
| static ResourceApplyAdagradDA.Options | useLocking(Boolean useLocking) |
Methods inherited: equals, hashCode, toString

public static <T> ResourceApplyAdagradDA create(Scope scope, Operand<?> var, Operand<?> gradientAccumulator, Operand<?> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<Long> globalStep, ResourceApplyAdagradDA.Options... options)

Factory method to create a class wrapping a new ResourceApplyAdagradDA operation added to the graph.

Parameters:
- scope - current graph scope
- var - Should be from a Variable().
- gradientAccumulator - Should be from a Variable().
- gradientSquaredAccumulator - Should be from a Variable().
- grad - The gradient.
- lr - Scaling factor. Must be a scalar.
- l1 - L1 regularization. Must be a scalar.
- l2 - L2 regularization. Must be a scalar.
- globalStep - Training step number. Must be a scalar.
- options - carries optional attribute values
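To show how the factory method fits together, here is a minimal sketch, not taken from this page: the import paths assume the TensorFlow 1.x Java package layout, and every operand (the three resource handles, the gradient, and the scalar hyperparameters) is assumed to be built elsewhere in the same graph.

```java
// A minimal sketch, not from the docs: package paths assume the TensorFlow 1.x
// Java layout, and all operands are presumed to be built elsewhere in the graph.
import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.ResourceApplyAdagradDA;

final class AdagradDAWiring {
  // Attaches a ResourceApplyAdagradDA node to the graph that `scope` refers to.
  static <T> ResourceApplyAdagradDA attach(
      Scope scope,
      Operand<?> var,                        // resource handle, from a Variable()
      Operand<?> gradientAccumulator,        // resource handle, from a Variable()
      Operand<?> gradientSquaredAccumulator, // resource handle, from a Variable()
      Operand<T> grad,                       // the gradient
      Operand<T> lr,                         // scalar learning rate
      Operand<T> l1,                         // scalar L1 regularization
      Operand<T> l2,                         // scalar L2 regularization
      Operand<Long> globalStep) {            // scalar training step number
    return ResourceApplyAdagradDA.create(
        scope, var, gradientAccumulator, gradientSquaredAccumulator,
        grad, lr, l1, l2, globalStep);
  }
}
```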
public static ResourceApplyAdagradDA.Options useLocking(Boolean useLocking)

Parameters:
- useLocking - If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
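When lock-protected updates are wanted, the Options value returned by useLocking can be passed through the varargs tail of create. A sketch reusing the hypothetical operands from the example above:

```java
// Sketch: request lock-protected updates to var and both accumulators.
ResourceApplyAdagradDA op = ResourceApplyAdagradDA.create(
    scope, var, gradientAccumulator, gradientSquaredAccumulator,
    grad, lr, l1, l2, globalStep,
    ResourceApplyAdagradDA.useLocking(true));
```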