vllm/csrc/attention

Latest commit: e8c3bd2cd1, "[Bugfix] Fix some narrowing conversion warnings (#20141)"
Signed-off-by: Tyler Michael Smith <tyler@neuralmagic.com>, 2025-06-27 09:01:28 -07:00
Name                      Last commit                                                                                                        Date
mla/                      [Bugfix] Fix some narrowing conversion warnings (#20141)                                                           2025-06-27 09:01:28 -07:00
attention_dtypes.h        Enable scaled FP8 (e4m3fn) KV cache on ROCm (AMD GPU) (#3290)                                                      2024-04-03 14:15:55 -07:00
attention_generic.cuh     [CI/Build] Enforce style for C++ and CUDA code with `clang-format` (#4722)                                          2024-05-22 07:18:41 +00:00
attention_kernels.cuh     fix: typos (#18151)                                                                                                2025-05-15 02:16:15 -07:00
attention_utils.cuh       [AMD][CI/Build] Disambiguation of the function call for ROCm 6.2 headers compatibility (#7477)                      2024-08-21 16:47:36 -07:00
dtype_bfloat16.cuh        [CI/Build] Suppress divide-by-zero and missing return statement warnings (#7001)                                    2024-08-05 16:00:01 -04:00
dtype_float16.cuh         [CI/Build] Enforce style for C++ and CUDA code with `clang-format` (#4722)                                          2024-05-22 07:18:41 +00:00
dtype_float32.cuh         [CI/Build] Enforce style for C++ and CUDA code with `clang-format` (#4722)                                          2024-05-22 07:18:41 +00:00
dtype_fp8.cuh             [CI/Build] Enforce style for C++ and CUDA code with `clang-format` (#4722)                                          2024-05-22 07:18:41 +00:00
merge_attn_states.cu      [BugFix] FA2 MLA Accuracy Issue (#18807)                                                                           2025-05-28 08:59:39 +00:00
paged_attention_v1.cu     [MISC] Remove unused variableds in C++ (#19609)                                                                    2025-06-15 20:05:28 -07:00
paged_attention_v2.cu     [MISC] Remove unused variableds in C++ (#19609)                                                                    2025-06-15 20:05:28 -07:00
vertical_slash_index.cu   Implements dual-chunk-flash-attn backend for dual chunk attention with sparse attention support (#11844)            2025-05-12 19:52:47 -07:00