Merged

Changes from 1 commit
doc/frameworks/tensorflow/deploying_tensorflow_serving.rst (4 changes: 2 additions & 2 deletions)

@@ -64,7 +64,7 @@ If you already have existing model artifacts in S3, you can skip training and deploy

from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='MyFrameworkVersion')
Collaborator commented:

    Does framework_version expect an int? Can we use a dummy like 0.0.0 or x.x.x?

Contributor (author) replied:

    Updated to x.x.x.


predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge')
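
Once deployed, the returned ``predictor`` can be invoked directly. A minimal usage sketch, assuming the SavedModel packaged in ``model.tar.gz`` accepts a batch of 28x28x1 inputs (the input shape here is a hypothetical placeholder, not from this PR):

    import numpy as np

    # Hypothetical input; the shape must match the serving signature of the
    # SavedModel inside model.tar.gz.
    input_data = np.random.rand(1, 28, 28, 1).tolist()

    # predict() sends the data to the TensorFlow Serving endpoint and returns
    # the parsed JSON response as a dict.
    result = predictor.predict(input_data)
    print(result['predictions'])

    # Delete the endpoint when finished to avoid ongoing charges.
    predictor.delete_endpoint()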

@@ -74,7 +74,7 @@ Python-based TensorFlow serving on SageMaker has support for `Elastic Inference

from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='MyFrameworkVersion')

predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge', accelerator_type='ml.eia1.medium')

doc/frameworks/tensorflow/using_tf.rst (13 changes: 8 additions & 5 deletions)

@@ -468,7 +468,7 @@ If you already have existing model artifacts in S3, you can skip training and deploy

from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='MyFrameworkVersion')

predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge')

@@ -478,7 +478,7 @@ Python-based TensorFlow serving on SageMaker has support for `Elastic Inference

from sagemaker.tensorflow import TensorFlowModel

-    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole')
+    model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz', role='MySageMakerRole', framework_version='MyFrameworkVersion')

predictor = model.deploy(initial_instance_count=1, instance_type='ml.c5.xlarge', accelerator_type='ml.eia1.medium')

@@ -767,7 +767,8 @@ This customized Python code must be named ``inference.py`` and is specified through

 model = TensorFlowModel(entry_point='inference.py',
                         model_data='s3://mybucket/model.tar.gz',
-                        role='MySageMakerRole')
+                        role='MySageMakerRole',
+                        framework_version='MyFrameworkVersion')

In the example above, ``inference.py`` is assumed to be a file inside ``model.tar.gz``. If you want to use a local file instead, you must add the ``source_dir`` argument. See the documentation on `TensorFlowModel <https://sagemaker.readthedocs.io/en/stable/frameworks/tensorflow/sagemaker.tensorflow.html#sagemaker.tensorflow.model.TensorFlowModel>`_.
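
For the local-file case, a minimal sketch, assuming a hypothetical local directory ``code/`` that contains ``inference.py`` (all other values are the placeholders used in the examples above):

    from sagemaker.tensorflow import TensorFlowModel

    model = TensorFlowModel(entry_point='inference.py',
                            source_dir='code',  # hypothetical local directory holding inference.py
                            model_data='s3://mybucket/model.tar.gz',
                            role='MySageMakerRole',
                            framework_version='MyFrameworkVersion')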

@@ -923,7 +924,8 @@ processing. There are 2 ways to do this:
 model = TensorFlowModel(entry_point='inference.py',
                         dependencies=['requirements.txt'],
                         model_data='s3://mybucket/model.tar.gz',
-                        role='MySageMakerRole')
+                        role='MySageMakerRole',
+                        framework_version='MyFrameworkVersion')


2. If you are working in a network-isolation situation or if you don't want to install dependencies at runtime, you can put pre-downloaded dependencies under a local folder named ``lib`` and pass it via ``dependencies``:
@@ -941,7 +943,8 @@ processing. There are 2 ways to do this:
 model = TensorFlowModel(entry_point='inference.py',
                         dependencies=['/path/to/folder/named/lib'],
                         model_data='s3://mybucket/model.tar.gz',
-                        role='MySageMakerRole')
+                        role='MySageMakerRole',
+                        framework_version='MyFrameworkVersion')

For more information, see: https://github.com/aws/sagemaker-tensorflow-serving-container#prepost-processing
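
As context for that link, a minimal sketch of the pre/post-processing hooks the serving container looks for in ``inference.py`` (the JSON handling shown is illustrative, not mandated):

    def input_handler(data, context):
        # Pre-process the request body before it is sent to the TensorFlow
        # Serving REST API; here JSON is simply passed through unchanged.
        if context.request_content_type == 'application/json':
            return data.read().decode('utf-8')
        raise ValueError('Unsupported content type: {}'.format(context.request_content_type))

    def output_handler(data, context):
        # Post-process the TensorFlow Serving response before returning it to
        # the client; surface serving errors as exceptions.
        if data.status_code != 200:
            raise ValueError(data.content.decode('utf-8'))
        return data.content, context.accept_header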
