
deit_tiny_patch16_224

Implementation of DeiT, proposed in [Training data-efficient image transformers & distillation through attention](https://arxiv.org/abs/2012.12877).

An attention-based distillation is proposed in which a new token, the `dist` (distillation) token, is added to the model. During training, the output of this token is supervised by a teacher network's predictions, while the class token is supervised by the ground-truth labels.
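
To make the idea concrete, here is a minimal sketch of how such a token can be prepended to the patch embeddings (an illustrative PyTorch module, not the library's actual code; the module name and the 192-dimensional width of the tiny variant are assumptions):

```python
import torch
import torch.nn as nn

class DistTokenEmbedding(nn.Module):
    """Illustrative sketch: prepend learnable [cls] and [dist] tokens
    to the patch embeddings before the transformer encoder."""

    def __init__(self, embed_dim: int = 192):  # 192 = width of the tiny variant
        super().__init__()
        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        self.dist_token = nn.Parameter(torch.zeros(1, 1, embed_dim))

    def forward(self, patches: torch.Tensor) -> torch.Tensor:
        # patches: (batch, num_patches, embed_dim)
        b = patches.shape[0]
        cls = self.cls_token.expand(b, -1, -1)
        dist = self.dist_token.expand(b, -1, -1)
        # sequence becomes [cls, dist, patch_1, ..., patch_N]
        return torch.cat([cls, dist, patches], dim=1)
```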


```python
DeiT.deit_tiny_patch16_224()
DeiT.deit_small_patch16_224()
DeiT.deit_base_patch16_224()
DeiT.deit_base_patch16_384()
```
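
A usage sketch, assuming the model is exposed through the glasses library (the import path below is an assumption based on that library's conventions):

```python
import torch
from glasses.models import DeiT  # import path assumed

model = DeiT.deit_tiny_patch16_224()
model.eval()

x = torch.randn(1, 3, 224, 224)  # one 224x224 RGB image
with torch.no_grad():
    logits = model(x)

print(logits.shape)  # expected: (1, 1000) for ImageNet classes
```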