From ef286fb0adb594d6601d75fdffcd5c23b4aa20b1 Mon Sep 17 00:00:00 2001
From: Jayesh Mahajan
Date: Wed, 28 May 2025 14:11:44 -0400
Subject: [PATCH] Correct link for serving in ai/model-serving-tensorflow/README.md

Co-authored-by: Janet Kuo
---
 ai/model-serving-tensorflow/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ai/model-serving-tensorflow/README.md b/ai/model-serving-tensorflow/README.md
index a0ed3de0..23a599db 100644
--- a/ai/model-serving-tensorflow/README.md
+++ b/ai/model-serving-tensorflow/README.md
@@ -2,7 +2,7 @@
 
 ## 🎯 Purpose / What You'll Learn
 
-This example demonstrates how to deploy a TensorFlow model for inference using [TensorFlow Serving](https://www.tensorflow.org/tfx/serving) on Kubernetes. You’ll learn how to:
+This example demonstrates how to deploy a TensorFlow model for inference using [TensorFlow Serving](https://www.tensorflow.org/serving) on Kubernetes. You’ll learn how to:
 
 - Set up TensorFlow Serving with a pre-trained model
 - Use a PersistentVolume to mount your model directory
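
For readers seeing only this patch and not the rest of the example, the hunk above edits the README intro for a TensorFlow Serving deployment that mounts its model from a PersistentVolume. Below is a minimal sketch of what such a Deployment can look like; the resource names (`tf-serving`, `model-pvc`) and the model name (`half_plus_two`) are illustrative assumptions rather than values taken from the patched repository, while the image, ports, and `/models/$MODEL_NAME` path follow the official `tensorflow/serving` container conventions.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tf-serving                          # hypothetical name, not from the patched repo
spec:
  replicas: 1
  selector:
    matchLabels:
      app: tf-serving
  template:
    metadata:
      labels:
        app: tf-serving
    spec:
      containers:
        - name: tensorflow-serving
          image: tensorflow/serving:latest  # official TensorFlow Serving image
          env:
            - name: MODEL_NAME              # the image serves /models/$MODEL_NAME by default
              value: half_plus_two          # illustrative model name
          ports:
            - containerPort: 8500           # gRPC endpoint
            - containerPort: 8501           # REST endpoint
          volumeMounts:
            - name: model-storage
              mountPath: /models/half_plus_two   # SavedModel version directories live under here
      volumes:
        - name: model-storage
          persistentVolumeClaim:
            claimName: model-pvc            # hypothetical PVC bound to the PersistentVolume
```

Once the Pod is ready, the model answers REST predict requests on port 8501 at `/v1/models/<MODEL_NAME>:predict`.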