@Generated(value="jsii-pacmak/1.67.0 (build 2c027f5)", date="2022-09-19T20:26:40.730Z") @Stability(value=Stable) public interface CfnInferenceSchedulerProps extends software.amazon.jsii.JsiiSerializable
Example:

```java
// The code below shows an example of how to instantiate this type.
// The values are placeholders you should change.
import software.amazon.awscdk.services.lookoutequipment.*;

Object dataInputConfiguration;
Object dataOutputConfiguration;

CfnInferenceSchedulerProps cfnInferenceSchedulerProps = CfnInferenceSchedulerProps.builder()
        .dataInputConfiguration(dataInputConfiguration)
        .dataOutputConfiguration(dataOutputConfiguration)
        .dataUploadFrequency("dataUploadFrequency")
        .modelName("modelName")
        .roleArn("roleArn")
        // the properties below are optional
        .dataDelayOffsetInMinutes(123)
        .inferenceSchedulerName("inferenceSchedulerName")
        .serverSideKmsKeyId("serverSideKmsKeyId")
        .tags(List.of(CfnTag.builder()
                .key("key")
                .value("value")
                .build()))
        .build();
```
| Modifier and Type | Interface and Description |
|---|---|
| static class | `CfnInferenceSchedulerProps.Builder`: A builder for `CfnInferenceSchedulerProps`. |
| static class | `CfnInferenceSchedulerProps.Jsii$Proxy`: An implementation for `CfnInferenceSchedulerProps`. |
| Modifier and Type | Method and Description |
|---|---|
| static `CfnInferenceSchedulerProps.Builder` | `builder()` |
| default `Number` | `getDataDelayOffsetInMinutes()`: A period of time (in minutes) by which inference on the data is delayed after the data starts. |
| `Object` | `getDataInputConfiguration()`: Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location. |
| `Object` | `getDataOutputConfiguration()`: Specifies configuration information for the output results for the inference scheduler, including the Amazon S3 location for the output. |
| `String` | `getDataUploadFrequency()`: How often data is uploaded to the source S3 bucket for the input data. |
| default `String` | `getInferenceSchedulerName()`: The name of the inference scheduler. |
| `String` | `getModelName()`: The name of the ML model used for the inference scheduler. |
| `String` | `getRoleArn()`: The Amazon Resource Name (ARN) of a role with permission to access the data source being used for the inference. |
| default `String` | `getServerSideKmsKeyId()`: Provides the identifier of the AWS KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment. |
| default `List<CfnTag>` | `getTags()`: Any tags associated with the inference scheduler. |
@Stability(value=Stable) @NotNull Object getDataInputConfiguration()
@Stability(value=Stable) @NotNull Object getDataOutputConfiguration()
@Stability(value=Stable) @NotNull String getDataUploadFrequency()
This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.
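As a sketch of how a caller might sanity-check this property before synthesis: the allowed values below (ISO 8601 durations such as `PT5M`) are an assumption drawn from the Lookout for Equipment documentation, not from this interface, so verify them against the current CloudFormation reference.

```java
import java.util.Set;

public class UploadFrequencyCheck {
    // Assumed allowed values for DataUploadFrequency (ISO 8601 durations);
    // verify against the current AWS documentation before relying on this.
    private static final Set<String> ALLOWED =
            Set.of("PT5M", "PT10M", "PT15M", "PT30M", "PT1H");

    public static boolean isValidDataUploadFrequency(String value) {
        return value != null && ALLOWED.contains(value);
    }

    public static void main(String[] args) {
        System.out.println(isValidDataUploadFrequency("PT5M"));  // an assumed-valid value
        System.out.println(isValidDataUploadFrequency("PT7M"));  // not in the assumed set
    }
}
```

Checking the value client-side fails fast at synth time rather than at stack deployment.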
@Stability(value=Stable) @NotNull String getModelName()
@Stability(value=Stable) @NotNull String getRoleArn()
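Since `roleArn` must be an IAM role ARN, a loose shape check can catch obvious mistakes early. The pattern below is a hypothetical illustration of the usual `arn:aws:iam::<account>:role/<name>` form; the service performs the authoritative validation.

```java
import java.util.regex.Pattern;

public class RoleArnCheck {
    // Loose shape check for an IAM role ARN such as
    // "arn:aws:iam::123456789012:role/MyLookoutRole".
    // Illustrative only; AWS performs its own validation.
    private static final Pattern ROLE_ARN = Pattern.compile(
            "arn:aws[a-zA-Z-]*:iam::\\d{12}:role/[\\w+=,.@/-]+");

    public static boolean looksLikeRoleArn(String arn) {
        return arn != null && ROLE_ARN.matcher(arn).matches();
    }

    public static void main(String[] args) {
        System.out.println(looksLikeRoleArn("arn:aws:iam::123456789012:role/MyLookoutRole"));
        System.out.println(looksLikeRoleArn("arn:aws:s3:::my-bucket"));
    }
}
```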
@Stability(value=Stable) @Nullable default Number getDataDelayOffsetInMinutes()
For instance, if a delay offset of five minutes is selected, inference does not begin until the first data measurement after the five-minute mark: the scheduler wakes up at the configured frequency, then waits the additional five minutes before checking the customer's S3 bucket. This lets customers keep uploading data at the same frequency without stopping and restarting the scheduler when new data arrives.
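The interplay of wake-up time and delay offset described above can be sketched as simple time arithmetic. The helper below is hypothetical, for illustration only; the real scheduling happens server-side.

```java
import java.time.Duration;
import java.time.LocalTime;

public class SchedulerTiming {
    /**
     * When the scheduler actually checks the S3 bucket: the scheduled
     * wake-up time plus the configured delay offset. Hypothetical helper
     * mirroring the behavior described in the docs above.
     */
    public static LocalTime effectiveCheckTime(LocalTime scheduledWakeUp,
                                               long dataDelayOffsetInMinutes) {
        return scheduledWakeUp.plus(Duration.ofMinutes(dataDelayOffsetInMinutes));
    }

    public static void main(String[] args) {
        // With a 5-minute delay offset, the run scheduled for 10:00
        // checks the bucket at 10:05.
        System.out.println(effectiveCheckTime(LocalTime.of(10, 0), 5)); // 10:05
    }
}
```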
@Stability(value=Stable) @Nullable default String getInferenceSchedulerName()
@Stability(value=Stable) @Nullable default String getServerSideKmsKeyId()
@Stability(value=Stable) @Nullable default List<CfnTag> getTags()
For more information, see Tag.
@Stability(value=Stable) static CfnInferenceSchedulerProps.Builder builder()
Returns: a `CfnInferenceSchedulerProps.Builder` of `CfnInferenceSchedulerProps`.