DistilBERT release

Original DistilBERT models: checkpoints obtained with teacher-student learning (knowledge distillation) from the original BERT checkpoints.

distilbert/distilbert-base-cased: Fill-Mask • 0.1B params • Updated May 6, 2024 • 142k downloads • 51 likes
distilbert/distilbert-base-uncased: Fill-Mask • 0.1B params • Updated May 6, 2024 • 10.5M downloads • 733 likes
distilbert/distilbert-base-multilingual-cased: Fill-Mask • 0.1B params • Updated May 6, 2024 • 797k downloads • 203 likes
distilbert/distilbert-base-uncased-finetuned-sst-2-english: Text Classification • 0.1B params • Updated Dec 19, 2023 • 3.12M downloads • 800 likes
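The checkpoints above can be loaded by their model IDs through the transformers pipeline API. A minimal sketch, assuming transformers and a backend such as PyTorch are installed; the example texts are placeholders:

```python
from transformers import pipeline

# Fill-Mask with one of the base DistilBERT checkpoints.
fill_mask = pipeline("fill-mask", model="distilbert/distilbert-base-uncased")
print(fill_mask("The capital of France is [MASK]."))

# Text classification (sentiment) with the SST-2 fine-tuned checkpoint.
classifier = pipeline(
    "text-classification",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("DistilBERT is fast and light."))
```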