diff --git a/_posts/2025-01-13-vllm-2024-wrapped-2025-vision.md b/_posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
similarity index 99%
rename from _posts/2025-01-13-vllm-2024-wrapped-2025-vision.md
rename to _posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
index 945978b..7365584 100644
--- a/_posts/2025-01-13-vllm-2024-wrapped-2025-vision.md
+++ b/_posts/2025-01-09-vllm-2024-wrapped-2025-vision.md
@@ -2,6 +2,7 @@
 layout: post
 title: "vLLM 2024 Retrospective and 2025 Vision"
 author: "vLLM Team"
+image: /assets/figures/vllm-2024-wrapped-2025-roadmap/model-architecture-serving-usage.png
 ---
 
 The vLLM community has achieved remarkable growth in 2024, evolving from a specialized inference engine to becoming the de facto serving solution for the open-source AI ecosystem. Our growth metrics demonstrate significant progress: