timm/vit_base_patch16_clip_224.laion2b_ft_in1k — Image Classification, 0.1B params, updated Jan 21, 195 downloads, 1 like
timm/vit_base_patch16_clip_224.openai_ft_in1k — Image Classification, 0.1B params, updated Jan 21, 5.12k downloads, 1 like
timm/vit_base_patch16_clip_384.laion2b_ft_in1k — Image Classification, 0.1B params, updated Jan 21, 167 downloads, 5 likes
timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k — Image Classification, 0.1B params, updated Jan 21, 86 downloads
timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k — Image Classification, 0.1B params, updated Jan 21, 1.12k downloads, 4 likes
timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k — Image Classification, 0.3B params, updated Jan 21, 133 downloads, 1 like
timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k — Image Classification, 0.1B params, updated Jan 21, 2.24k downloads, 2 likes
timm/vit_base_patch16_clip_224.openai_ft_in12k_in1k — Image Classification, 0.1B params, updated Jan 21, 586 downloads
timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k — Image Classification, 0.1B params, updated Jan 21, 36 downloads, 1 like