Thank you for providing the code for pretraining SwinViT.
I want to start pretraining on my own data and initialize the full pretraining SSLHead model (not only the SwinViT encoder) with the provided pretrained weights. However, there is a small discrepancy between the provided weights and the SSLHead model.
Describe the bug
The provided weights contain one layer (two keys) more than the SSLHead model.
To Reproduce
Load the pretrained weights into the SSLHead model:
model.load_state_dict(pretrained_dict, strict=False)
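The mismatch can be reproduced without the full SSLHead: a minimal sketch with stand-in modules (hypothetical `Pretrained`/`SSLHeadLike` classes, not the real SwinViT architecture) shows how `load_state_dict(..., strict=False)` reports checkpoint keys that have no counterpart in the target model instead of raising an error.

```python
import torch.nn as nn

# Hypothetical stand-ins: the real case uses the released checkpoint and SSLHead.
class Pretrained(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 4)
        self.norm = nn.LayerNorm(4)  # extra layer present only in the checkpoint

class SSLHeadLike(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 4)  # no trailing norm layer

pretrained_dict = Pretrained().state_dict()
model = SSLHeadLike()

# strict=False loads the overlapping keys and returns the mismatch report.
result = model.load_state_dict(pretrained_dict, strict=False)
print(result.unexpected_keys)  # ['norm.weight', 'norm.bias']
print(result.missing_keys)     # []
```

The `unexpected_keys` list here mirrors the two extra keys reported against the real checkpoint.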
Expected behavior
All keys of the pretrained dict should also be present in the SSLHead model, but two keys (norm.weight and norm.bias) have no counterpart in the SSLHead model.
Screenshots
[Screenshot: difference of keys between the pretrained weights and SSLHead — the pretrained dict has two unexpected keys, highlighted in a red box]
Why do the pretrained weights have an extra norm layer?
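Until the extra layer is explained, one hedged workaround is to filter the checkpoint-only keys out before loading, so that a strict load can verify that everything else matches exactly. This sketch assumes `pretrained_dict` is the loaded checkpoint state dict; the tiny `Target` module is a hypothetical stand-in for SSLHead.

```python
import torch.nn as nn

# Hypothetical stand-in target model (the real case is SSLHead).
class Target(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 4)

# Simulated checkpoint with the two extra keys reported in this issue.
pretrained_dict = Target().state_dict()
pretrained_dict["norm.weight"] = nn.LayerNorm(4).weight.data
pretrained_dict["norm.bias"] = nn.LayerNorm(4).bias.data

# Drop the checkpoint-only "norm.*" keys, then load strictly so any other
# mismatch would raise instead of being silently skipped.
filtered = {k: v for k, v in pretrained_dict.items() if not k.startswith("norm.")}
model = Target()
model.load_state_dict(filtered, strict=True)
print(sorted(filtered))  # ['encoder.bias', 'encoder.weight']
```

Strict loading of the filtered dict is deliberate: with strict=False a typo in a key name would also be skipped silently.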