vLLM / vllm (mirror of https://github.com/vllm-project/vllm.git)
6,557 Commits · 73 Branches · 67 Tags · 518 MiB
At commit 55f1a468d9
Commit Graph (1 commit)

Author:  Tao He
SHA1:    60f7624334
Message: Implements dual-chunk-flash-attn backend for dual chunk attention with sparse attention support (#11844)
Date:    2025-05-12 19:52:47 -07:00
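
For context on the feature this commit adds: vLLM selects its attention backend from the VLLM_ATTENTION_BACKEND environment variable, and the dual-chunk-flash-attn backend is what this commit wires in. Below is a minimal sketch of selecting it at inference time; the backend value DUAL_CHUNK_FLASH_ATTN, the model name, and the expectation that the model's config enables dual chunk attention are assumptions drawn from the long-context Qwen setup, not confirmed by this page.

```python
# Minimal sketch: steering vLLM onto the dual-chunk-flash-attn backend.
# Assumptions: the backend is registered as "DUAL_CHUNK_FLASH_ATTN" and the
# chosen model's config enables dual chunk attention (e.g. a long-context
# Qwen build); both are illustrative, not taken from this page.
import os

# vLLM consults this variable when choosing an attention backend, so it
# must be set before the engine is constructed.
os.environ["VLLM_ATTENTION_BACKEND"] = "DUAL_CHUNK_FLASH_ATTN"

from vllm import LLM, SamplingParams

llm = LLM(model="Qwen/Qwen2.5-7B-Instruct-1M")  # assumed long-context model
params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Summarize this very long document: ..."], params)
print(outputs[0].outputs[0].text)
```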