The model was trained with a maximum input length of 1024 tokens. You could try to fine-tune it for longer sequences, and there is a vast literature on the subject, but honestly, if you need a larger input length, Florence-2 is probably not the right model for you. Maybe look into Idefics2: https://huggingface.co/docs/transformers/main/en/model_doc/idefics2
— Andrés, 07/10/2024, Re: [andimarafioti/florence2-finetuning] How to change the max length? (Issue #10)
I found that when the data exceeds 1024 tokens, I have to truncate it; otherwise the forward pass throws an exception.
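One way to avoid that exception is to clamp tokenized inputs to the model's limit before calling `forward()`. This is a minimal sketch: `MAX_LENGTH` reflects the 1024-token limit mentioned above, and `truncate_ids` is a hypothetical helper, not part of this repository (the Hugging Face tokenizers can also do this for you via their `truncation`/`max_length` arguments).

```python
# Sketch: drop tokens past the model's maximum input length so the
# forward pass does not raise. MAX_LENGTH matches the 1024-token
# limit Florence-2 was trained with; truncate_ids is a hypothetical
# helper for illustration.

MAX_LENGTH = 1024

def truncate_ids(input_ids, max_length=MAX_LENGTH):
    """Return input_ids cut off at max_length tokens."""
    if len(input_ids) <= max_length:
        return input_ids
    return input_ids[:max_length]

# Usage: a 2000-token sequence is clamped to 1024 tokens.
ids = list(range(2000))
print(len(truncate_ids(ids)))  # 1024
```

Truncating silently loses the tail of the input, so for long documents you may prefer to split the text into chunks and run the model on each chunk instead.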