Hi there,
Thank you for your interest in our work! We sincerely apologize for our delayed response.
We tried to reproduce the issue as described but couldn't. Could you elaborate a bit more? What was the exact running command, and what was the full error message?
Thanks
Hi there,
No worries at all. If I train the model with the command `python main.py -rm train -c configs/refer_youtube_vos.yaml -ws 8 -bs 1 -ng 8` and set `enable_amp: false` with `freeze_text_encoder: true`, I get the error: `Inference tensors cannot be saved for backward. To work around you can make a clone to get a normal tensor and use it in autograd.`
I believe it has to do with the `inference_mode` setting, though oddly the error only occurs with automatic mixed precision turned off. To fix it, I had to clone the tensor as the error message suggests.
With automatic mixed precision off, the code breaks when passing `encoded_text` into `txt_proj`. I think the fix is just to change this line:
`MTTR/models/multimodal_transformer.py`, line 89 (commit 49d58c5):

```python
txt_memory = rearrange(encoded_text.last_hidden_state.clone(), 'b s c -> s b c')
```
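For context, here is a minimal, self-contained sketch of why the error appears and why `.clone()` fixes it (names like `encoded` and `proj` are illustrative stand-ins, not the MTTR code): a tensor produced under `torch.inference_mode()` (e.g. by a frozen text encoder) is an "inference tensor" and cannot be saved for backward by later autograd-tracked ops, while cloning it outside inference mode yields a normal tensor.

```python
import torch

# Simulate a frozen text encoder run under inference mode
# (as when freeze_text_encoder: true and the forward uses inference_mode).
with torch.inference_mode():
    encoded = torch.randn(2, 4)  # stand-in for encoded_text.last_hidden_state

proj = torch.nn.Linear(4, 8)  # stand-in for txt_proj, with trainable weights

# Passing the inference tensor through a layer that needs autograd fails,
# because Linear must save its input for the backward pass.
try:
    proj(encoded)
except RuntimeError as e:
    print("forward failed:", e)

# Workaround: clone outside inference mode to get a normal tensor.
out = proj(encoded.clone()).sum()
out.backward()
print("grad computed:", proj.weight.grad is not None)
```

This is why the `.clone()` in the proposed one-line change is enough: it detaches the tensor from inference mode before it enters the autograd graph.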