Correct link for serving in ai/model-serving-tensorflow/README.md
Co-authored-by: Janet Kuo <chiachenk@google.com>
This commit is contained in:
parent 01cf1c41cf
commit ef286fb0ad
ai/model-serving-tensorflow/README.md
@@ -2,7 +2,7 @@
 ## 🎯 Purpose / What You'll Learn
 
-This example demonstrates how to deploy a TensorFlow model for inference using [TensorFlow Serving](https://www.tensorflow.org/tfx/serving) on Kubernetes. You’ll learn how to:
+This example demonstrates how to deploy a TensorFlow model for inference using [TensorFlow Serving](https://www.tensorflow.org/serving) on Kubernetes. You’ll learn how to:
 
 - Set up TensorFlow Serving with a pre-trained model
 - Use a PersistentVolume to mount your model directory
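For context, the deployment the README describes (TensorFlow Serving with a model directory mounted from a PersistentVolume) might be sketched roughly as below. This is a minimal illustrative manifest, not the example's actual files; the names `tf-serving`, `model-pvc`, `my_model`, and the mount path are all assumptions.

```yaml
# Minimal sketch: serve a pre-trained model with TensorFlow Serving on Kubernetes.
# All resource names and paths here are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tf-serving
spec:
  replicas: 1
  selector:
    matchLabels:
      app: tf-serving
  template:
    metadata:
      labels:
        app: tf-serving
    spec:
      containers:
        - name: tf-serving
          image: tensorflow/serving
          args:
            - "--model_name=my_model"
            - "--model_base_path=/models/my_model"
          ports:
            - containerPort: 8501   # REST API port
          volumeMounts:
            - name: model-volume
              mountPath: /models/my_model   # model directory from the PV
      volumes:
        - name: model-volume
          persistentVolumeClaim:
            claimName: model-pvc   # assumed PVC holding the exported model
```

With a manifest like this applied, predictions could be requested against the pod's REST endpoint (e.g. `POST /v1/models/my_model:predict` on port 8501).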