Deprecated: use `Bidirectional` instead. With the `Bidirectional` layer wrapper you can make any recurrent layer bidirectional, in particular `GravesLSTM`. Note that this layer adds the output of both directions, which translates into "ADD" mode in `Bidirectional`.

Usage: `.layer(new Bidirectional(Bidirectional.Mode.ADD, new GravesLSTM.Builder()....build()))`

`@Deprecated public class GravesBidirectionalLSTM extends BaseRecurrentLayer`
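To make the replacement concrete, here is a minimal sketch of a layer configuration using the `Bidirectional` wrapper in `ADD` mode, as the deprecation note recommends. The layer sizes (`nIn`, `nOut`) are illustrative assumptions, not values from this documentation.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;

public class BidirectionalSketch {
    public static void main(String[] args) {
        // Sketch: instead of the deprecated GravesBidirectionalLSTM, wrap a
        // GravesLSTM in Bidirectional with Mode.ADD, which sums the outputs
        // of the forward and backward passes (the same combination this
        // deprecated layer performed).
        NeuralNetConfiguration.ListBuilder conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new Bidirectional(Bidirectional.Mode.ADD,
                        new GravesLSTM.Builder()
                                .nIn(100)   // illustrative input size
                                .nOut(200)  // illustrative layer size
                                .build()));
    }
}
```

Because `Mode.ADD` sums rather than concatenates the two directions, the wrapped layer's `nOut` is also the output size of the bidirectional layer.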
| Modifier and Type | Class and Description |
|---|---|
| `static class` | `GravesBidirectionalLSTM.Builder` Deprecated. |
| Modifier and Type | Field and Description |
|---|---|
| `protected boolean` | `helperAllowFallback` Deprecated. |
Fields inherited from class BaseRecurrentLayer: `rnnDataFormat`, `weightInitFnRecurrent`

Fields inherited from class FeedForwardLayer: `nIn`, `nOut`, `timeDistributedFormat`

Fields inherited from class BaseLayer: `activationFn`, `biasInit`, `biasUpdater`, `gainInit`, `gradientNormalization`, `gradientNormalizationThreshold`, `iUpdater`, `regularization`, `regularizationBias`, `weightInitFn`, `weightNoise`

Fields inherited from class Layer: `constraints`, `iDropout`, `layerName`

| Modifier and Type | Method and Description |
|---|---|
| `LayerMemoryReport` | `getMemoryReport(InputType inputType)` Deprecated. This is a report of the estimated memory consumption for the given layer. |
| `protected void` | `initializeConstraints(Layer.Builder<?> builder)` Deprecated. Initialize the weight constraints. |
| `ParamInitializer` | `initializer()` Deprecated. |
| `Layer` | `instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)` Deprecated. |
Methods inherited from class BaseRecurrentLayer: `getOutputType`, `getPreProcessorForInputType`, `setNIn`

Methods inherited from class FeedForwardLayer: `isPretrainParam`

Methods inherited from class BaseLayer: `clone`, `getGradientNormalization`, `getRegularizationByParam`, `getUpdaterByParam`, `resetLayerDefaultConfig`

Methods inherited from class Layer: `setDataType`

Methods inherited from class java.lang.Object: `equals`, `finalize`, `getClass`, `hashCode`, `notify`, `notifyAll`, `toString`, `wait`, `wait`, `wait`

Methods inherited from interface TrainingConfig: `getGradientNormalizationThreshold`, `getLayerName`

`protected void initializeConstraints(Layer.Builder<?> builder)`
Overrides: `initializeConstraints` in class `Layer`

`public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)`
Specified by: `instantiate` in class `Layer`

`public ParamInitializer initializer()`
Specified by: `initializer` in class `Layer`

`public LayerMemoryReport getMemoryReport(InputType inputType)`
Specified by: `getMemoryReport` in class `Layer`
Parameters: `inputType` - Input type to the layer. Memory consumption is often a function of the input type.

Copyright © 2020. All rights reserved.