
How to change the max length? #10

Open
MonolithFoundation opened this issue Jul 9, 2024 · 2 comments

Comments

@MonolithFoundation

I found that when the input length exceeds 1024, I have to truncate it; otherwise the forward pass throws an exception.
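The workaround described above (cutting the input down to the model's 1024-token limit before the forward pass) can be sketched as follows. This is a minimal illustration, not code from this repository; the function name `truncate_ids` and the plain-list representation of token ids are assumptions for the example.

```python
# Assumption: the model was trained with a 1024-token maximum input length,
# so any longer sequence must be cut before calling forward().
MAX_LENGTH = 1024

def truncate_ids(token_ids, max_length=MAX_LENGTH):
    """Return the token id sequence cut down to the model's maximum input length."""
    if len(token_ids) <= max_length:
        return token_ids
    # Keep only the first max_length tokens; anything beyond would
    # trigger the exception described above.
    return token_ids[:max_length]
```

With a Hugging Face tokenizer, the same effect is usually achieved by passing `truncation=True, max_length=1024` to the tokenizer call instead of truncating by hand.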

@andimarafioti
Owner

The model was trained with a maximum input length of 1024 tokens. You could try to fine-tune it for longer sequences (there is a vast literature on the subject), but honestly, if you need a larger input length, Florence2 is probably not the right model for you. Maybe look into idefics2: https://huggingface.co/docs/transformers/main/en/model_doc/idefics2

@MonolithFoundation
Author

MonolithFoundation commented Jul 10, 2024 via email
