[Bug]: Mistral's Pixtral error for vllm>=0.6.5 on 4 T4's #11865
Comments
v0.6.6 has this too
same here
same here: vllm==0.6.6.post1
Note that my OS is Wolfi
@youkaichao do you think this has anything to do with torch compile?
yes, we moved away from
@jgen1 it seems Triton is broken because it does not find a C compiler.
@youkaichao But with the exact same setup, vllm 0.6.4 works -- it does detect my C compiler.
I think most environments do have a C compiler. |
I will try that @youkaichao, but wasn't a C compiler also required in 0.6.4? |
@jgen1 I think this is related. Maybe you can try torch>=2.5.1 |
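For anyone checking whether their environment already meets the suggested torch floor, a small version-comparison helper might look like the sketch below (the naive numeric comparison deliberately ignores pre-release and local suffixes such as `+cu121`, which is good enough for a sanity check):

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Naive dotted-version comparison, e.g. meets_minimum("2.5.1", "2.5.0").

    Non-numeric characters in each component (rc tags, local builds
    like "+cu121") are stripped, so this is only a rough check.
    """
    def key(v: str) -> list:
        parts = []
        for piece in v.split("."):
            digits = "".join(ch for ch in piece if ch.isdigit())
            parts.append(int(digits) if digits else 0)
        return parts
    return key(installed) >= key(minimum)

# Usage (assuming torch is installed):
#   import torch
#   print(meets_minimum(torch.__version__, "2.5.1"))
```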
@jgen1 previously we used a different code path. Please let me know if installing a C compiler and specifying it via an env var fixes this.
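Since Triton JIT-compiles kernels on the host, one way to confirm that a C compiler is actually visible to the process is the sketch below; it approximates Triton's behavior (honor the `CC` env var first, then fall back to common compiler names on PATH), though the exact lookup order in Triton may differ between versions:

```python
import os
import shutil

def find_host_c_compiler():
    """Return a usable C compiler path, honoring CC first, else None."""
    cc = os.environ.get("CC")
    if cc:
        resolved = shutil.which(cc)
        if resolved:
            return resolved
    # Fallback names are an assumption; Triton's own list may vary.
    for name in ("cc", "gcc", "clang"):
        resolved = shutil.which(name)
        if resolved:
            return resolved
    return None

compiler = find_host_c_compiler()
if compiler is None:
    print("No C compiler found; install gcc/clang and/or set CC before starting vLLM")
else:
    print(f"Triton should be able to use: {compiler}")
```

In a container image this typically means installing `gcc` (or, presumably, a build toolchain package on apk-based distros like Wolfi) and exporting `CC` before launching the server.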
Your current environment
The output of `python collect_env.py` (non-working versions)
The output of `python collect_env.py` (working versions)
Model Input Dumps
err_execute_model_input_20250108-202622.zip
🐛 Describe the bug
I have 4 T4's (on an AWS g4dn.12xl) and I am trying to run Mistral's Pixtral model. I have been able to run it without problems for the past couple of months, in Docker, with the following parameters:
It continues to work with vllm 0.6.4. But in my Docker image, when I install vllm 0.6.5, 0.6.6, or 0.6.6.post1 with the same parameters, I get the error in the attached file (too many characters to paste here).
(Note: I also have output captured with `TORCH_LOGS: "+dynamo"` and `TORCHDYNAMO_VERBOSE: 1`, if helpful.)
Again, it is important to note that the exact same configuration works with vllm 0.6.4.
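To reproduce the verbose dynamo logs mentioned above, the two environment variables can be set before vLLM starts; a minimal sketch (set them before `torch` is imported so the logging configuration is picked up):

```python
import os

# Enable extra torch.compile / dynamo diagnostics.
os.environ["TORCH_LOGS"] = "+dynamo"
os.environ["TORCHDYNAMO_VERBOSE"] = "1"
```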
Here is a small part of the error, full trace in the attached log:
output.log