Commit
docs: add warning content in vllm page (#1014)
Co-authored-by: Um Changyong <[email protected]>
e7217 and Um Changyong authored Nov 29, 2024
1 parent 07f9752 commit d9edcde
Showing 1 changed file with 7 additions and 0 deletions.
7 changes: 7 additions & 0 deletions docs/source/nodes/generator/vllm.md
@@ -64,3 +64,10 @@ Plus, you can use it over v0.2.16, so you must upgrade to the latest version.
We are developing multi-GPU compatibility for AutoRAG now.
So, please wait for full compatibility with multi-GPU environments.
```
```{warning}
When using the vllm module, errors may occur depending on your PyTorch configuration. In such cases, please follow the instructions below:
1. Define the vllm module to operate in single mode.
2. Set the `skip_validation` parameter to `True` when calling the `start_trial` function of the evaluator.
```
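The second step above can be sketched as follows. This is a minimal, hedged example; the `Evaluator` constructor arguments and the exact `start_trial` signature may differ across AutoRAG versions, and the file paths (`qa.parquet`, `corpus.parquet`, `config.yaml`) are placeholders for your own project files.

```python
# Hedged sketch: passing skip_validation=True to start_trial, as the
# warning above suggests, to bypass the pre-trial validation step that
# can conflict with some PyTorch/vllm configurations.
from autorag.evaluator import Evaluator

# Placeholder paths -- replace with your own QA dataset, corpus, and
# trial config (the YAML that defines the vllm generator module).
evaluator = Evaluator(
    qa_data_path="qa.parquet",
    corpus_data_path="corpus.parquet",
)

# skip_validation=True skips the validation run before the trial starts.
evaluator.start_trial("config.yaml", skip_validation=True)
```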
