All flan-t5 doesn't work for me #114
Comments
Please provide the steps to reproduce.
@byshiue I am seeing the same behavior. My reproduction steps are below.

Conversion:
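For reference, a minimal sketch of this kind of conversion step, assuming the FasterTransformer T5 converter script (huggingface_t5_ckpt_convert.py); the repository path, flag names, model name, and output directory are illustrative assumptions rather than the exact command:

```python
# Sketch only: invokes the FasterTransformer HF->FT T5 converter via subprocess.
# The script location, flags, model name, and output path are assumptions and
# may need adjusting for your checkout and Triton model repository layout.
import subprocess

subprocess.run(
    [
        "python3",
        "FasterTransformer/examples/pytorch/t5/utils/huggingface_t5_ckpt_convert.py",
        "-in_file", "google/flan-t5-xl",                         # HF model id or local dir (assumed)
        "-saved_dir", "triton-model-store/fastertransformer/1",  # weights dir consumed by the backend (assumed)
        "-inference_tensor_para_size", "1",                      # tensor parallelism at serving time
        "-weight_data_type", "fp32",
    ],
    check=True,
)
```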
This model is then uploaded and consumed in another container. The inferencing container uses fastertransformer_backend based on triton:22.09.
Inferencing sample:

Response:
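For concreteness, a minimal sketch of this kind of request through tritonclient, assuming the Triton HTTP endpoint on localhost:8000 and the tensor names input_ids / sequence_length / max_output_len / output_ids; those names depend on the config.pbtxt and are assumptions here, not necessarily what others use:

```python
# Sketch only: sends one request to a running fastertransformer_backend model.
# Endpoint, model name, tensor names, and dtypes are assumptions; check your
# own config.pbtxt before reusing this.
import numpy as np
import tritonclient.http as httpclient
from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl")  # model id is illustrative
prompt = "translate English to German: The house is wonderful."

ids = tokenizer(prompt, return_tensors="np").input_ids.astype(np.uint32)
lengths = np.array([[ids.shape[1]]], dtype=np.uint32)
max_output_len = np.array([[64]], dtype=np.uint32)

inputs = [
    httpclient.InferInput("input_ids", list(ids.shape), "UINT32"),
    httpclient.InferInput("sequence_length", list(lengths.shape), "UINT32"),
    httpclient.InferInput("max_output_len", list(max_output_len.shape), "UINT32"),
]
inputs[0].set_data_from_numpy(ids)
inputs[1].set_data_from_numpy(lengths)
inputs[2].set_data_from_numpy(max_output_len)

client = httpclient.InferenceServerClient(url="localhost:8000")
result = client.infer(
    "fastertransformer",
    inputs,
    outputs=[httpclient.InferRequestedOutput("output_ids")],
)

output_ids = result.as_numpy("output_ids")
print(tokenizer.decode(output_ids.reshape(-1), skip_special_tokens=True))
```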
Has anyone managed to find a workaround for this yet? I have got a similar issue as well for a