vllm/docs/deployment/integrations/kserve.md


---
title: KServe
---
[](){ #deployment-kserve }

vLLM can be deployed with [KServe](https://github.com/kserve/kserve) on Kubernetes for highly scalable distributed model serving.

Please see the [KServe documentation](https://kserve.github.io/website/) for more details on using vLLM with KServe.
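
As a rough sketch of what such a deployment can look like, the snippet below uses the KServe Python SDK to create an `InferenceService` backed by KServe's Hugging Face serving runtime, which can run vLLM as its engine. The service name, namespace, model ID, and resource limits are illustrative placeholders rather than values prescribed by vLLM or KServe.

```python
# Illustrative sketch: create a KServe InferenceService via the kserve Python SDK.
# Assumes a KServe-enabled cluster reachable through the local kubeconfig; the
# name, namespace, model ID, and resource limits below are placeholders.
from kubernetes import client
from kserve import (
    KServeClient,
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1ModelFormat,
    V1beta1ModelSpec,
    V1beta1PredictorSpec,
)

isvc = V1beta1InferenceService(
    api_version="serving.kserve.io/v1beta1",
    kind="InferenceService",
    metadata=client.V1ObjectMeta(name="vllm-demo", namespace="default"),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            model=V1beta1ModelSpec(
                # The "huggingface" model format selects KServe's Hugging Face
                # serving runtime, which can use vLLM as its backend engine.
                model_format=V1beta1ModelFormat(name="huggingface"),
                args=[
                    "--model_name=demo-llm",
                    "--model_id=facebook/opt-125m",  # placeholder model
                ],
                resources=client.V1ResourceRequirements(
                    limits={"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
                ),
            )
        )
    ),
)

# Submit the InferenceService; KServe provisions an autoscaled deployment
# that exposes the model over its standard inference endpoints.
KServeClient().create(isvc)
```

The same resource can also be written as a plain `InferenceService` YAML manifest and applied with `kubectl`; the KServe documentation covers the supported fields and runtime arguments in detail.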