From d76e1989e3a5602e354ef2be01d6eb03312c95eb Mon Sep 17 00:00:00 2001
From: mgoin
Date: Fri, 10 Jan 2025 15:53:56 -0500
Subject: [PATCH] Update

Signed-off-by: mgoin
---
 ...025-vision.md => 2025-01-09-vllm-2024-wrapped-2025-vision.md} | 1 +
 1 file changed, 1 insertion(+)
 rename _posts/{2025-01-13-vllm-2024-wrapped-2025-vision.md => 2025-01-09-vllm-2024-wrapped-2025-vision.md} (99%)

diff --git a/_posts/2025-01-13-vllm-2024-wrapped-2025-vision.md b/_posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
similarity index 99%
rename from _posts/2025-01-13-vllm-2024-wrapped-2025-vision.md
rename to _posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
index 945978b..7365584 100644
--- a/_posts/2025-01-13-vllm-2024-wrapped-2025-vision.md
+++ b/_posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
@@ -2,6 +2,7 @@
 layout: post
 title: "vLLM 2024 Retrospective and 2025 Vision"
 author: "vLLM Team"
+image: /assets/figures/vllm-2024-wrapped-2025-roadmap/model-architecture-serving-usage.png
 ---

 The vLLM community has achieved remarkable growth in 2024, evolving from a specialized inference engine to becoming the de facto serving solution for the open-source AI ecosystem. Our growth metrics demonstrate significant progress: