vllm/tests
Latest commit b51c1cc9d2 by SangBin Cho: [2/N] Chunked prefill data update (#3538), 2024-03-28 10:06:01 -07:00
Name                          Last commit                                                                   Date
async_engine/                 [CI] Try introducing isort. (#3495)                                           2024-03-25 07:59:47 -07:00
basic_correctness/            [1/n][Chunked Prefill] Refactor input query shapes (#3236)                    2024-03-20 14:46:05 -07:00
core/                         [2/N] Chunked prefill data update (#3538)                                     2024-03-28 10:06:01 -07:00
distributed/                  [Core] remove cupy dependency (#3625)                                         2024-03-27 00:33:26 -07:00
engine/                       Asynchronous tokenization (#2879)                                             2024-03-15 23:37:01 +00:00
entrypoints/                  [Bugfix] API stream returning two stops (#3450)                               2024-03-25 10:14:34 -07:00
kernels/                      feat(benchmarks): Add Prefix Caching Benchmark to Serving Benchmark (#3277)   2024-03-27 13:39:26 -07:00
lora/                         [Kernel] support non-zero cuda devices in punica kernels (#3636)              2024-03-27 00:37:42 +00:00
metrics/                      Re-enable the 80 char line width limit (#3305)                                2024-03-10 19:49:14 -07:00
models/                       [Feature] Add vision language model support. (#3042)                          2024-03-25 14:16:30 -07:00
prefix_caching/               [Core][Bugfix]Refactor block manager for better testability (#3492)           2024-03-27 23:59:28 -07:00
prompts/                      [BugFix] Fix input positions for long context with sliding window (#2088)     2023-12-13 12:28:13 -08:00
samplers/                     [Misc] Include matched stop string/token in responses (#2976)                 2024-03-25 17:31:32 -07:00
spec_decode/                  [Feature] Add vision language model support. (#3042)                          2024-03-25 14:16:30 -07:00
tokenization/                 [CI] Try introducing isort. (#3495)                                           2024-03-25 07:59:47 -07:00
worker/                       [2/N] Chunked prefill data update (#3538)                                     2024-03-28 10:06:01 -07:00
__init__.py                   [Small] Formatter only checks lints in changed files (#1528)                  2023-10-31 15:39:38 -07:00
conftest.py                   [2/N] Chunked prefill data update (#3538)                                     2024-03-28 10:06:01 -07:00
test_cache_block_hashing.py   [CI] Try introducing isort. (#3495)                                           2024-03-25 07:59:47 -07:00
test_config.py                Fix assertion failure in Qwen 1.5 with prefix caching enabled (#3373)         2024-03-14 13:56:57 -07:00
test_logits_processor.py      Migrate `logits` computation and gather to `model_runner` (#3233)             2024-03-20 23:25:01 +00:00
test_regression.py            [BugFix] Fix GC bug for `LLM` class (#2882)                                   2024-02-14 22:17:44 -08:00
test_sampling_params.py       [Bugfix] fix crash if max_tokens=None (#2570)                                 2024-01-23 22:38:55 -08:00
test_sequence.py              [2/N] Chunked prefill data update (#3538)                                     2024-03-28 10:06:01 -07:00