Hi rwightman, thanks for the amazing repo! I'm trying to use timm for a lot of experiments, and I'd like to make sure: are all models without a suffix like `_in21k` trained only on ImageNet-1k? For now I'm using `vit_deit_base` and `repvgg`. Thank you!
@JosephPai it's a bit more complex than that... `vit_deit_base` is fine, and I'm pretty sure `repvgg` is okay (weights came from the repvgg official impl).

Models with `_in21k` or `_in22k` were trained on ImageNet-21k and still have the 21k classifier. There are a number of models that were pretrained on in21k and then fine-tuned on in1k, and they do not have 21k in the name; there is usually a comment somewhere in the model file, though.

There are some other models that were pretrained in semi-supervised fashion with larger datasets w/ in1k as the target: models like `*_ns` (Noisy Student pretrain), `ig_resnext_*` (Instagram hashtag pretrain), and `ssl_*` / `swsl_*` (semi-supervised / semi-weakly supervised pretrain).
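As a rough sketch of how one might act on these naming conventions, the snippet below filters model names by wildcard pattern. The pattern list and the sample model names are illustrative, not exhaustive; in practice you would feed in `timm.list_models(pretrained=True)`. And per the note above, some in21k-pretrained fine-tunes carry no suffix at all, so name-based filtering is only a heuristic:

```python
from fnmatch import fnmatch

# Name patterns for checkpoints trained with data beyond ImageNet-1k,
# following the conventions described above (illustrative, not exhaustive):
EXTRA_DATA_PATTERNS = [
    '*_in21k', '*_in22k',   # ImageNet-21k/22k-trained, 21k classifier
    '*_ns',                 # Noisy Student semi-supervised pretrain
    'ig_resnext*',          # Instagram hashtag pretrain
    'ssl_*', 'swsl_*',      # semi-/semi-weakly supervised pretrain
]

def uses_extra_data(name: str) -> bool:
    """Heuristic: does this model name signal pretraining beyond in1k?"""
    return any(fnmatch(name, pattern) for pattern in EXTRA_DATA_PATTERNS)

# Hypothetical sample; in practice use timm.list_models(pretrained=True).
models = [
    'vit_deit_base_patch16_224',
    'vit_base_patch16_224_in21k',
    'ig_resnext101_32x8d',
    'ssl_resnet50',
    'tf_efficientnet_b0_ns',
    'repvgg_b0',
]
in1k_only = [m for m in models if not uses_extra_data(m)]
```

With the sample list above, `in1k_only` keeps only `vit_deit_base_patch16_224` and `repvgg_b0`; remember this still cannot catch unsuffixed in21k fine-tunes, for which the model-file comments are the reference.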