Are all models without the suffix _in21k trained on ImageNet-1k? #583

Answered by rwightman
JosephPai asked this question in Q&A

@JosephPai it's a bit more complex than that... vit_deit_base is fine, and I'm pretty sure repvgg is okay (the weights came from the official RepVGG impl).

Models with _in21k or _in22k in the name were trained on ImageNet-21k and still have the 21k classifier head. There are a number of models that were pretrained on in21k and then fine-tuned on in1k, and those do not have 21k in the name; there is usually a comment somewhere in the model file pointing this out.
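A quick way to tell which label space a given set of weights targets is to check the default classifier size. Here is a minimal sketch (not part of the original answer; the model names are just examples) — note it only shows the head size, not what the backbone was pretrained on:

```python
import timm

# Minimal sketch: inspect the default classifier size for a couple of example
# models. An in21k/in22k head has ~21841-21843 outputs, an in1k head has 1000.
# This only reflects the classifier head, not what the backbone was pretrained
# on before fine-tuning. Names/behavior may vary slightly across timm versions.
for name in ["vit_base_patch16_224", "vit_base_patch16_224_in21k"]:
    model = timm.create_model(name, pretrained=False)  # no weight download needed
    print(f"{name}: {model.num_classes} classes")
```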

There are also some models that were pretrained in a semi- or weakly-supervised fashion on larger datasets, with in1k as the final target, for example:

  • EfficientNet NoisyStudent (_ns suffix)
  • Instagram hashtag pretrained ResNeXt (ig_resnext*)
  • SSL/SWSL ResNeXt (ssl_*, swsl_*)
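
For reference, a rough sketch (again not from the original answer) that lists pretrained weights matching these naming conventions via timm.list_models; the wildcard patterns are assumptions based on the suffixes above and may not match on newer timm releases where these names moved to pretrained weight tags:

```python
import timm

# Rough sketch: count pretrained weights matching the naming patterns above.
# The patterns are assumptions based on the described naming scheme; exact
# model names can change between timm releases (e.g. suffixes becoming tags).
for pattern in ("*_ns", "ig_resnext*", "ssl_*", "swsl_*"):
    names = timm.list_models(pattern, pretrained=True)
    print(f"{pattern}: {len(names)} pretrained models, e.g. {names[:2]}")
```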
