DistilBERT release
Original DistilBERT checkpoints, obtained with teacher-student learning (knowledge distillation) from the original BERT checkpoints.

distilbert/distilbert-base-cased • Fill-Mask • 65.8M params • Updated May 6, 2024 • 119k downloads • 53 likes
distilbert/distilbert-base-uncased • Fill-Mask • 67M params • Updated May 6, 2024 • 13.3M downloads • 779 likes
distilbert/distilbert-base-multilingual-cased • Fill-Mask • 0.1B params • Updated May 6, 2024 • 870k downloads • 220 likes
distilbert/distilbert-base-uncased-finetuned-sst-2-english • Text Classification • 67M params • Updated Dec 19, 2023 • 5.2M downloads • 837 likes
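As a minimal usage sketch (assuming the transformers library is installed), the checkpoints listed above can be loaded by their Hub IDs with the pipeline API, for example the fill-mask and SST-2 sentiment models:

```python
# Minimal sketch, assuming the `transformers` library is installed.
# Loads two of the checkpoints listed above via the pipeline API.
from transformers import pipeline

# Fill-mask with the base uncased checkpoint
fill_mask = pipeline("fill-mask", model="distilbert/distilbert-base-uncased")
print(fill_mask("Paris is the [MASK] of France.")[0]["token_str"])

# Sentiment analysis with the SST-2 fine-tuned checkpoint
classifier = pipeline(
    "text-classification",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("DistilBERT is small and fast."))
```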