(deployment-kserve)=

# KServe

vLLM can be deployed with [KServe](https://github.com/kserve/kserve) on Kubernetes for highly scalable distributed model serving.

Please see [this guide](https://kserve.github.io/website/latest/modelserving/v1beta1/llm/huggingface/) for more details on using vLLM with KServe.
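
As a rough orientation, the sketch below shows what a KServe `InferenceService` using the Hugging Face serving runtime (which uses vLLM as its default backend for supported models) can look like. It assumes a cluster with KServe and GPU nodes already set up; the service name, model ID, and resource values are illustrative only, so adapt them to your environment and refer to the KServe guide above for the authoritative spec.

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  # Illustrative name; choose your own.
  name: huggingface-llama3
spec:
  predictor:
    model:
      modelFormat:
        # The Hugging Face runtime serves supported models via vLLM by default.
        name: huggingface
      args:
        - --model_name=llama3
        - --model_id=meta-llama/Meta-Llama-3-8B-Instruct
      resources:
        # Example resource sizing; tune for your model and hardware.
        requests:
          cpu: "6"
          memory: 24Gi
          nvidia.com/gpu: "1"
        limits:
          cpu: "6"
          memory: 24Gi
          nvidia.com/gpu: "1"
```

You would apply a manifest like this with `kubectl apply -f <file>.yaml` and then query the service through the endpoint KServe exposes for it.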