Class ContainerDefinition
- java.lang.Object
- software.amazon.awssdk.services.sagemaker.model.ContainerDefinition
- All Implemented Interfaces:
Serializable, SdkPojo, ToCopyableBuilder<ContainerDefinition.Builder,ContainerDefinition>
@Generated("software.amazon.awssdk:codegen")
public final class ContainerDefinition
extends Object
implements SdkPojo, Serializable, ToCopyableBuilder<ContainerDefinition.Builder,ContainerDefinition>
Describes the container, as part of a model definition.
- See Also:
- Serialized Form
-
-
Nested Class Summary
Nested Classes
static interface ContainerDefinition.Builder
-
Method Summary
All Methods | Static Methods | Instance Methods | Concrete Methods

- static ContainerDefinition.Builder builder()
- String containerHostname()
  This parameter is ignored for models that contain only a PrimaryContainer.
- Map<String,String> environment()
  The environment variables to set in the Docker container.
- boolean equals(Object obj)
- boolean equalsBySdkFields(Object obj)
- <T> Optional<T> getValueForField(String fieldName, Class<T> clazz)
- boolean hasEnvironment()
  For responses, this returns true if the service returned a value for the Environment property.
- int hashCode()
- String image()
  The path where inference code is stored.
- ImageConfig imageConfig()
  Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).
- String inferenceSpecificationName()
  The inference specification name in the model package version.
- ContainerMode mode()
  Whether the container hosts a single model or multiple models.
- String modeAsString()
  Whether the container hosts a single model or multiple models.
- ModelDataSource modelDataSource()
  Specifies the location of ML model data to deploy.
- String modelDataUrl()
  The S3 path where the model artifacts, which result from model training, are stored.
- String modelPackageName()
  The name or Amazon Resource Name (ARN) of the model package to use to create the model.
- MultiModelConfig multiModelConfig()
  Specifies additional configuration for multi-model endpoints.
- List<SdkField<?>> sdkFields()
- static Class<? extends ContainerDefinition.Builder> serializableBuilderClass()
- ContainerDefinition.Builder toBuilder()
- String toString()
  Returns a string representation of this object.
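Instances of this class are immutable and are constructed through the nested builder. A minimal sketch of the builder()/toBuilder() round trip; the image URI, bucket, and environment variable below are placeholder values, not real resources:

```java
import java.util.Map;

import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

public class ContainerDefinitionBuilderExample {
    public static void main(String[] args) {
        // Build an immutable ContainerDefinition; the image URI and S3 path
        // are hypothetical placeholders.
        ContainerDefinition container = ContainerDefinition.builder()
                .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
                .modelDataUrl("s3://my-bucket/model.tar.gz")
                .environment(Map.of("SAGEMAKER_PROGRAM", "inference.py"))
                .build();

        // toBuilder() copies the object back into a mutable builder, so a
        // modified copy can be derived without mutating the original.
        ContainerDefinition copy = container.toBuilder()
                .containerHostname("container-1")
                .build();

        System.out.println(copy.containerHostname());               // container-1
        System.out.println(copy.image().equals(container.image())); // true
    }
}
```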
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder
copy
-
-
-
-
Method Detail
-
containerHostname
public final String containerHostname()
This parameter is ignored for models that contain only a PrimaryContainer.
When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.
- Returns:
- The container hostname. Ignored for models that contain only a PrimaryContainer; when the ContainerDefinition is part of an inference pipeline, the value uniquely identifies the container for logging and metrics.
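The all-or-nothing rule above means that once one container in a pipeline names a ContainerHostName, every container must name one. A sketch of a two-stage pipeline; the image URIs and hostnames are invented for illustration:

```java
import java.util.List;

import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

public class PipelineHostnameExample {
    public static void main(String[] args) {
        // Because the first container sets a ContainerHostName, the second
        // must set one too. Image URIs are hypothetical placeholders.
        List<ContainerDefinition> pipeline = List.of(
                ContainerDefinition.builder()
                        .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/preprocess:latest")
                        .containerHostname("preprocess")
                        .build(),
                ContainerDefinition.builder()
                        .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/predict:latest")
                        .containerHostname("predict")
                        .build());

        // Each container now carries a unique name for logging and metrics.
        pipeline.forEach(c -> System.out.println(c.containerHostname()));
    }
}
```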
-
image
public final String image()
The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.
The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.
- Returns:
- The path where inference code is stored, in registry/repository[:tag] or registry/repository[@digest] format.
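Both supported path formats are plain strings passed to the builder's image(String) setter. A sketch; the account ID, repository, tag, and digest below are fabricated for illustration:

```java
import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

public class ImagePathFormats {
    public static void main(String[] args) {
        // registry/repository[:tag] form (placeholder account and repo).
        String byTag = "123456789012.dkr.ecr.us-west-2.amazonaws.com/my-algo:1.0";

        // registry/repository[@digest] form pins an exact image build
        // (the digest value here is fabricated for illustration).
        String byDigest = "123456789012.dkr.ecr.us-west-2.amazonaws.com/my-algo"
                + "@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";

        ContainerDefinition tagged = ContainerDefinition.builder().image(byTag).build();
        ContainerDefinition pinned = ContainerDefinition.builder().image(byDigest).build();

        System.out.println(tagged.image().endsWith(":1.0"));     // true
        System.out.println(pinned.image().contains("@sha256:")); // true
    }
}
```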
-
imageConfig
public final ImageConfig imageConfig()
Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
The model artifacts in an Amazon S3 bucket and the Docker image for inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.
- Returns:
- Whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).
-
mode
public final ContainerMode mode()
Whether the container hosts a single model or multiple models.
If the service returns an enum value that is not available in the current SDK version, mode will return ContainerMode.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from modeAsString().
- Returns:
- Whether the container hosts a single model or multiple models.
- See Also:
ContainerMode
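A defensive sketch of the fallback described above; "SomeFutureMode" is an invented value standing in for an enum constant newer than the installed SDK:

```java
import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;
import software.amazon.awssdk.services.sagemaker.model.ContainerMode;

public class ModeFallbackExample {
    public static void main(String[] args) {
        // Known value: maps onto the typed enum constant.
        ContainerDefinition single = ContainerDefinition.builder()
                .mode(ContainerMode.SINGLE_MODEL)
                .build();
        System.out.println(single.mode());         // SINGLE_MODEL

        // Value the current SDK version does not know: the typed accessor
        // degrades to UNKNOWN_TO_SDK_VERSION, but the raw string survives
        // and is recoverable through modeAsString().
        ContainerDefinition future = ContainerDefinition.builder()
                .mode("SomeFutureMode")
                .build();
        System.out.println(future.mode());         // UNKNOWN_TO_SDK_VERSION
        System.out.println(future.modeAsString()); // SomeFutureMode
    }
}
```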
-
modeAsString
public final String modeAsString()
Whether the container hosts a single model or multiple models.
If the service returns an enum value that is not available in the current SDK version,
modewill returnContainerMode.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available frommodeAsString().- Returns:
- Whether the container hosts a single model or multiple models.
- See Also:
ContainerMode
-
modelDataUrl
public final String modelDataUrl()
The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.
The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating.
If you provide a value for this parameter, SageMaker uses Amazon Web Services Security Token Service to download model artifacts from the S3 path you provide. Amazon Web Services STS is activated in your Amazon Web Services account by default. If you previously deactivated Amazon Web Services STS for a region, you need to reactivate Amazon Web Services STS for that region. For more information, see Activating and Deactivating Amazon Web Services STS in an Amazon Web Services Region in the Amazon Web Services Identity and Access Management User Guide.
If you use a built-in algorithm to create a model, SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.
- Returns:
- The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix).
-
modelDataSource
public final ModelDataSource modelDataSource()
Specifies the location of ML model data to deploy.
Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.
- Returns:
- The location of the ML model data to deploy.
-
hasEnvironment
public final boolean hasEnvironment()
For responses, this returns true if the service returned a value for the Environment property. This DOES NOT check that the value is non-empty (for which, you should check the isEmpty() method on the property). This is useful because the SDK will never return a null collection or map, but you may need to differentiate between the service returning nothing (or null) and the service returning an empty collection or map. For requests, this returns true if a value for the property was specified in the request builder, and false if a value was not specified.
-
environment
public final Map<String,String> environment()
The environment variables to set in the Docker container.
The maximum length of each key and value in the Environment map is 1024 bytes. The maximum length of all keys and values in the map, combined, is 32 KB. If you pass multiple containers to a CreateModel request, then the maximum length of all of their maps, combined, is also 32 KB.
Attempts to modify the collection returned by this method will result in an UnsupportedOperationException.
This method will never return null. If you would like to know whether the service returned this field (so that you can differentiate between null and empty), you can use the hasEnvironment() method.
- Returns:
- The environment variables to set in the Docker container.
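The never-null contract and the distinction hasEnvironment() draws between "not specified" and "specified as empty" can be sketched as:

```java
import java.util.Map;

import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

public class EnvironmentPresenceExample {
    public static void main(String[] args) {
        // Unset: environment() is an empty map, never null, and
        // hasEnvironment() reports that no value was specified.
        ContainerDefinition unset = ContainerDefinition.builder().build();
        System.out.println(unset.environment().isEmpty()); // true
        System.out.println(unset.hasEnvironment());        // false

        // Explicitly empty: the map is still empty, but hasEnvironment()
        // now reports that a value was specified in the builder.
        ContainerDefinition empty = ContainerDefinition.builder()
                .environment(Map.of())
                .build();
        System.out.println(empty.environment().isEmpty()); // true
        System.out.println(empty.hasEnvironment());        // true

        // Note: the returned map is unmodifiable; attempts to mutate it
        // throw UnsupportedOperationException.
    }
}
```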
-
modelPackageName
public final String modelPackageName()
The name or Amazon Resource Name (ARN) of the model package to use to create the model.
- Returns:
- The name or Amazon Resource Name (ARN) of the model package to use to create the model.
-
inferenceSpecificationName
public final String inferenceSpecificationName()
The inference specification name in the model package version.
- Returns:
- The inference specification name in the model package version.
-
multiModelConfig
public final MultiModelConfig multiModelConfig()
Specifies additional configuration for multi-model endpoints.
- Returns:
- Specifies additional configuration for multi-model endpoints.
-
toBuilder
public ContainerDefinition.Builder toBuilder()
- Specified by:
toBuilder in interface ToCopyableBuilder<ContainerDefinition.Builder,ContainerDefinition>
-
builder
public static ContainerDefinition.Builder builder()
-
serializableBuilderClass
public static Class<? extends ContainerDefinition.Builder> serializableBuilderClass()
-
equalsBySdkFields
public final boolean equalsBySdkFields(Object obj)
- Specified by:
equalsBySdkFields in interface SdkPojo
-
toString
public final String toString()
Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.
-
-