Feature request
Currently, neither the phi-4-mini model nor the phi-4-multimodal model can be loaded with TGI version 3.1.1. phi-4-mini requires transformers version 4.49.0, and phi-4-multimodal fails with the following error:

`Unsupported model type phi4mm`
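For reference, here is a minimal way to reproduce both failures outside TGI by asking transformers to resolve each model's config. This is only a sketch: the Hugging Face model IDs microsoft/Phi-4-mini-instruct and microsoft/Phi-4-multimodal-instruct are assumed, and `trust_remote_code=True` is passed because phi-4-multimodal ships custom modeling code.

```python
# Sketch: check whether the installed transformers version can resolve each
# model's config. Model IDs are assumed; adjust to the checkpoints you test.
import transformers
from transformers import AutoConfig

print("transformers", transformers.__version__)  # phi-4-mini needs >= 4.49.0

for model_id in (
    "microsoft/Phi-4-mini-instruct",
    "microsoft/Phi-4-multimodal-instruct",
):
    try:
        cfg = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
        print(model_id, "->", cfg.model_type)  # phi-4-multimodal reports "phi4mm"
    except Exception as err:
        print(model_id, "failed:", err)
```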
Is there any possibility of supporting these two models?
Also, is it possible to use the TensorRT-LLM backend?
Motivation
Resolve the issues preventing the phi-4-mini and phi-4-multimodal models from being deployed with TGI.
Your contribution
Upgrade transformers to version 4.49.0 and add support for the multimodal model.
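On the second point, the `Unsupported model type` error appears to come from TGI's model dispatch (the `get_model` routine in `server/text_generation_server/models/__init__.py`), which maps a checkpoint's `model_type` to a handler. Below is a simplified, purely illustrative sketch of that kind of mapping and where a new `phi4mm` entry would go; the handler names are hypothetical, not TGI's actual classes.

```python
# Illustrative only: a minimal model-type dispatch table. Handler names are
# hypothetical; TGI's real dispatch is considerably more involved.
SUPPORTED_MODEL_TYPES = {
    "phi3": "FlashPhi3Handler",    # hypothetical
    "llama": "FlashLlamaHandler",  # hypothetical
}

def get_model_handler(model_type: str) -> str:
    """Resolve a config's model_type to a handler name."""
    try:
        return SUPPORTED_MODEL_TYPES[model_type]
    except KeyError:
        # Mirrors the error reported above for phi-4-multimodal.
        raise ValueError(f"Unsupported model type {model_type}")

# Supporting phi-4-multimodal would mean registering a handler for "phi4mm":
# SUPPORTED_MODEL_TYPES["phi4mm"] = "Phi4MultimodalHandler"  # hypothetical
```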