vllm-project.github.io/assets/figures
add minimax-m1 doc (#59)
* update minimax-m1.md

Signed-off-by: qingjun <qingjun@minimaxi.com>

* update

Signed-off-by: qingjun <qingjun@minimaxi.com>

* change title

Signed-off-by: qingjun <qingjun@minimaxi.com>

* Update _posts/2025-06-26-minimax-m1.md

Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>

---------

Signed-off-by: qingjun <qingjun@minimaxi.com>
Co-authored-by: youkaichao <youkaichao@gmail.com>
Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
2025-07-01 10:16:33 +08:00
aibrix Add aibrix release blog post (#35) 2025-02-21 14:02:48 -08:00
distributed-inference Add distributed inference blog post (#27) 2025-02-19 17:29:26 +00:00
lfai update figure 2024-07-25 15:14:57 -07:00
llama-stack Move image to assets 2025-01-25 11:15:15 -05:00
llama4 llama4 post (#47) 2025-04-05 23:44:57 -07:00
llama31 Add Llama 3.1 blogpost (new files) 2024-07-25 13:35:26 -07:00
minimax-m1 add minimax-m1 doc (#59) 2025-07-01 10:16:33 +08:00
notes-vllm-vs-deepspeed Use new template for the website 2023-11-14 12:12:47 -08:00
openrlhf-vllm Add OpenRLHF blog (#54) 2025-04-24 15:13:04 +08:00
perf-v060 Add v0.6.0 perf blog and also modify readme on how to publish a blogpost 2024-09-04 23:57:30 -07:00
ptpc add ptpc-fp8 amd blogpost (#45) 2025-03-20 17:22:42 +00:00
spec-decode minor 2024-10-22 11:39:37 -07:00
stack Add files via upload 2025-01-24 11:27:27 -06:00
struct-decode-intro fix: correct dates for posts 2025-01-10 23:29:44 -05:00
transformers-backend [Add] Blog post on transformers backend integration with vLLM (#50) 2025-04-16 12:31:00 +01:00
v1 Update qwen2vl 2025-01-26 22:48:26 -08:00
vllm-2024-wrapped-2025-roadmap vLLM 2024 Retrospective and 2025 Vision Blog 2025-01-10 15:49:36 -05:00
vllm-serving-amd add 2024-10-23-vllm-serving-amd blog post 2024-10-23 10:29:32 +00:00
annimation0.gif first commit 2023-06-21 23:36:19 +08:00
annimation1.gif first commit 2023-06-21 23:36:19 +08:00
annimation2.gif first commit 2023-06-21 23:36:19 +08:00
annimation3.gif first commit 2023-06-21 23:36:19 +08:00
lmsys_traffic.png first commit 2023-06-21 23:36:19 +08:00
perf_a10g_n1_dark.png first commit 2023-06-21 23:36:19 +08:00
perf_a10g_n1_light.png first commit 2023-06-21 23:36:19 +08:00
perf_a10g_n3_dark.png first commit 2023-06-21 23:36:19 +08:00
perf_a10g_n3_light.png first commit 2023-06-21 23:36:19 +08:00
perf_a100_n1_dark.png first commit 2023-06-21 23:36:19 +08:00
perf_a100_n1_light.png first commit 2023-06-21 23:36:19 +08:00
perf_a100_n3_dark.png first commit 2023-06-21 23:36:19 +08:00
perf_a100_n3_light.png first commit 2023-06-21 23:36:19 +08:00