Minor
Signed-off-by: WoosukKwon <woosuk.kwon@berkeley.edu>
parent d67eaaa0d9
commit a8f7abcc58
@@ -118,7 +118,7 @@ Finally, please note that you can continue using V0 and maintain backward compat
 To use vLLM V1:
 1. Install the latest version of vLLM with `pip install vllm --upgrade`.
 2. **Set the environment variable `export VLLM_USE_V1=1`.**
-3. Use vLLM’s [Python interface](https://github.com/vllm-project/vllm/blob/main/examples/offline_inference/basic.py) or OpenAI-compatible server (`vllm serve <model-name>`). You don’t need any change to the existing API.
+3. Use vLLM’s [Python API](https://github.com/vllm-project/vllm/blob/main/examples/offline_inference/basic.py) or OpenAI-compatible server (`vllm serve <model-name>`). You don’t need any change to the existing API.
 
 Please try it out and share your feedback!
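
For context, a minimal sketch of the Python API path referenced in step 3 of the changed list, assuming vLLM is installed and the V1 engine is enabled as in step 2; the model name is illustrative, not prescribed by the commit:

```python
import os

# Step 2 from the diff: enable the V1 engine before vLLM is imported.
os.environ["VLLM_USE_V1"] = "1"

from vllm import LLM, SamplingParams

# Model name is illustrative; any model vLLM supports works here.
llm = LLM(model="facebook/opt-125m")
params = SamplingParams(temperature=0.8, max_tokens=64)

# generate() takes a list of prompts and returns one RequestOutput per prompt.
outputs = llm.generate(["Hello, my name is"], params)
for out in outputs:
    print(out.outputs[0].text)
```

The OpenAI-compatible server alternative from the same step needs no code at all: `vllm serve <model-name>` with the environment variable set.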