# Model Card for Model ID

A DistilBERT tokenizer trained on the KazQAD dataset.
## Model Details

### Model Description

- **Model type:** DistilBERT tokenizer
- **Language(s) (NLP):** Kazakh
## Training Details

### Training Data

The tokenizer was trained on the KazQAD dataset: https://github.com/IS2AI/KazQAD/
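The card does not include a training script; the following is a minimal sketch of how a WordPiece tokenizer (the subword scheme DistilBERT uses) could be trained on text drawn from KazQAD with the Hugging Face `tokenizers` library. The corpus lines, `vocab_size`, and `min_frequency` values here are illustrative assumptions, not the actual training configuration.

```python
from tokenizers import BertWordPieceTokenizer

# Illustrative in-memory corpus; in practice this would be text
# extracted from the KazQAD question/passage files.
corpus = [
    "Қазақстан Республикасының астанасы қай қала?",
    "Астана Қазақстанның солтүстігінде орналасқан.",
    "Қазақ тілі түркі тілдер тобына жатады.",
]

# DistilBERT uses a BERT-style WordPiece vocabulary.
tokenizer = BertWordPieceTokenizer(lowercase=True)

# vocab_size and min_frequency are illustrative, not the card's settings.
tokenizer.train_from_iterator(corpus, vocab_size=200, min_frequency=1)

# Encode a sample Kazakh question with the freshly trained tokenizer.
encoding = tokenizer.encode("Қазақстан қай қала?")
print(encoding.tokens)
```

Once trained, the vocabulary can be written out with `tokenizer.save_model(...)` and then loaded into `transformers` for use alongside a DistilBERT model.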
## Environmental Impact

- **Hardware Type:** TPU v2
- **Hours used:** Less than one minute
- **Cloud Provider:** Google Colab