Dataset: r-three/tokenizer_robustness_exhaustive_english
Tasks: Multiple Choice
Modalities: Tabular, Text
Formats: parquet
Size: 10K - 100K
Tags: multilingual, tokenization
Libraries: Datasets, pandas, Croissant (+ 1)
License: cc
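The dataset can be loaded with the Hugging Face Datasets library. A minimal sketch, assuming the subset (config) name matches the character_deletion folder shown in the file listing below and that the subset ships a single test split:

from datasets import load_dataset

# Load the character-deletion subset of the English tokenizer-robustness dataset.
# The config name and the "test" split are inferred from the file listing
# (test-00000-of-00001.parquet) and may need adjusting.
ds = load_dataset(
    "r-three/tokenizer_robustness_exhaustive_english",
    "tokenizer_robustness_exhaustive_english_character_deletion",
    split="test",
)
print(ds)      # number of rows and column names
print(ds[0])   # first example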
Files (branch: main): tokenizer_robustness_exhaustive_english / tokenizer_robustness_exhaustive_english_character_deletion (33.9 kB)
1 contributor · History: 6 commits
Latest commit: gsaltintas, "Uploading tokenizer_robustness_exhaustive_english_character_deletion subset" (5e60ec6, verified, 3 days ago)
test-00000-of-00001.parquet (33.9 kB) · "Uploading tokenizer_robustness_exhaustive_english_character_deletion subset" · 3 days ago
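Since the split is stored as a single parquet file, it can also be read directly with pandas over the hf:// filesystem. A sketch; the in-repo path is inferred from the file browser above, and the hf:// protocol requires the huggingface_hub package:

import pandas as pd

# Read the test split parquet file directly from the Hub.
# Requires: pip install pandas pyarrow huggingface_hub
df = pd.read_parquet(
    "hf://datasets/r-three/tokenizer_robustness_exhaustive_english/"
    "tokenizer_robustness_exhaustive_english_character_deletion/test-00000-of-00001.parquet"
)
print(df.shape)
print(df.head())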