| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| timm/eva_large_patch14_196.in22k_ft_in22k_in1k | Image Classification | 0.3B | Jan 21 | 18.5k | 2 |
| timm/beitv2_large_patch16_224.in1k_ft_in22k_in1k | Image Classification | 0.3B | Jan 21 | 1.39k | 2 |
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 1.25k | 38 |
| timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384 | Image Classification | 0.2B | Jan 21 | 3.3k | 3 |
| timm/eva02_base_patch14_448.mim_in22k_ft_in22k_in1k | Image Classification | 0.1B | Jan 21 | 7.25k | 5 |
| timm/eva02_base_patch14_448.mim_in22k_ft_in1k | Image Classification | 0.1B | Jan 21 | 1.25k | 2 |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 1.61k | 2 |
| timm/eva_large_patch14_336.in22k_ft_in22k_in1k | Image Classification | 0.3B | Jan 21 | 2.46k | 1 |
| timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 114 | 1 |
| timm/vit_large_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 313 | |
| timm/convnext_xxlarge.clip_laion2b_soup_ft_in1k | Image Classification | 0.8B | Jan 21 | 3.5k | 2 |
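All of the checkpoints above are ImageNet-1k classifiers that can be created by name through `timm`. The sketch below is a minimal illustration, assuming `timm` and `torch` are installed and the Hub weights can be downloaded; the model name used is one entry from the table and any other listed name would work the same way.

```python
import timm
import torch

# Illustrative choice from the table above; any listed model name works.
model = timm.create_model(
    "eva02_base_patch14_448.mim_in22k_ft_in22k_in1k",
    pretrained=True,
)
model.eval()

# Recover the preprocessing the checkpoint expects (input size, mean/std, crop).
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

# Dummy forward pass at the model's native resolution; for real inference,
# pass a PIL image through `transform` and unsqueeze a batch dimension.
dummy = torch.randn(1, *data_config["input_size"])
with torch.no_grad():
    logits = model(dummy)  # (1, 1000) ImageNet-1k logits
print(logits.softmax(dim=-1).topk(5))
```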