coronaBERT: DistilBERT-Based Model for COVID-19 Policy Classification
Overview
- This repository contains the code, training scripts, and documentation for a DistilBERT model fine-tuned to classify national COVID-19 policies. The model takes text descriptions of COVID-19 policies and assigns each one to a predefined category (Policy Type). The project includes data preprocessing, model training, evaluation, and quantization for efficient inference.
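To illustrate the training stage of that pipeline, the sketch below fine-tunes distilbert-base-uncased for sequence classification with the Transformers Trainer. The CSV file names, column names (description, policy_type), label count, and hyperparameters are illustrative assumptions, not values taken from the actual training scripts.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Hypothetical CSV export of the labeled policy data with "description" text
# and integer-encoded "policy_type" ids.
dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "val.csv"})

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["description"], truncation=True, padding="max_length", max_length=256)

dataset = dataset.map(tokenize, batched=True)
dataset = dataset.rename_column("policy_type", "labels")

# num_labels must match the number of Policy Type categories in the labeled data
# (20 is a placeholder here).
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=20
)

args = TrainingArguments(output_dir="coronaBERT", num_train_epochs=3,
                         per_device_train_batch_size=16)

trainer = Trainer(model=model, args=args,
                  train_dataset=dataset["train"], eval_dataset=dataset["validation"])
trainer.train()
```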
Features
- Fine-tuned on custom labeled data for sequence classification
- Supports dynamic quantization for reduced model size and faster inference
- End-to-end pipeline: data preparation, training, evaluation, and model export
- Ready to use with Hugging Face Transformers (see the example below)
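As a minimal usage sketch, the model can be loaded straight from the Hub with the Transformers text-classification pipeline. The example policy description is illustrative, and the label returned depends on the id2label mapping stored in the model's config.

```python
from transformers import pipeline

# Load the fine-tuned model and tokenizer directly from the Hugging Face Hub.
classifier = pipeline("text-classification", model="Joesh1/coronaBERT")

# Illustrative COVID-19 policy description (not taken from the dataset).
text = "All schools and universities will remain closed until further notice."

# The predicted label corresponds to a Policy Type category.
print(classifier(text))
```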
Model details
- Base model: distilbert-base-uncased
- Data: CoronaNet Research Project
- Task: Sequence classification (predicting Policy Type)
- Framework: PyTorch, Transformers
- Quantization: Optional INT8 dynamic quantization for deployment
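For the optional INT8 quantization, the sketch below uses PyTorch dynamic quantization, which converts only the linear layers (attention and feed-forward weights) to INT8; the save path is illustrative, and exact size/speed gains depend on the deployment hardware.

```python
import torch
from transformers import AutoModelForSequenceClassification

# Load the full-precision fine-tuned model.
model = AutoModelForSequenceClassification.from_pretrained("Joesh1/coronaBERT")

# Apply dynamic INT8 quantization to the nn.Linear modules.
quantized_model = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

# Save the quantized weights for deployment (path is illustrative).
torch.save(quantized_model.state_dict(), "coronaBERT_int8.pt")
```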
Citation
@misc{joesh1-coronaBERT,
author = {Joesh1},
title = {coronaBERT: DistilBERT-Based Model for COVID-19 Policy Classification},
howpublished = {\url{https://huggingface.co/Joesh1/coronaBERT}},
year = {2025},
note = {Accessed: [Insert Date Accessed, e.g., 2025-05-22]}
}