
vLLM usage #6464

Closed
Daybreak-Zheng opened this issue Dec 27, 2024 · 1 comment
Labels
solved This problem has been already solved

Comments

@Daybreak-Zheng

System Info
llamafactory version: 0.9.2.dev0
Platform: Linux-x86_64
Python version: 3.10.16
PyTorch version: 2.3.0+cu121 (GPU)
Transformers version: 4.46.1
Datasets version: 3.1.0
Accelerate version: 1.0.1
GPU type: NVIDIA GeForce RTX 4090
vLLM version: 0.5.0.post1

I get an error when running vLLM; I'm looking forward to your reply.

[Screenshot of the error: Clipboard_Screenshot_1735308869]

@github-actions github-actions bot added the pending This problem is yet to be addressed label Dec 27, 2024
@MrLYG

MrLYG commented Dec 29, 2024

Try installing vllm==0.6.3.post1.
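The suggested fix is a version bump. A minimal sketch of how that pin could be applied and verified (the version string comes from the comment above; the commands are standard pip/Python usage, not from the thread):

```shell
# Replace the problematic vLLM 0.5.0.post1 with the version suggested above
pip install "vllm==0.6.3.post1"

# Confirm the installed version before re-running LLaMA-Factory
python -c "import vllm; print(vllm.__version__)"
```

Note that vLLM releases are tightly coupled to specific PyTorch/CUDA builds, so pinning an exact version in a fresh environment is generally safer than upgrading in place.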

@hiyouga hiyouga added solved This problem has been already solved and removed pending This problem is yet to be addressed labels Dec 30, 2024
@hiyouga hiyouga closed this as completed Dec 30, 2024

3 participants