SetFit with TurkuNLP/bert-base-finnish-cased-v1

This is a SetFit model that can be used for Text Classification. More specifically, the model is meant for detecting checkable claims in sentences extracted from Finnish news articles. In this particular model, checkable claims comprise claims about Quantity, Prediction, Correlation/Causation, and Laws/Rules of operation, as well as Other Claims. Non-claims comprise statements of Personal experience and Non-checkable/Non-claim sentences.

This SetFit model uses TurkuNLP/bert-base-finnish-cased-v1 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.
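
Both components are exposed as attributes on the loaded model, so the architecture can be inspected directly. The following is a minimal sketch, assuming the SetFit API listed under Framework Versions and using the repository id from the inference example further down:

from setfit import SetFitModel

model = SetFitModel.from_pretrained("HY-Aalto-DIME/FinnClaim-detect-FinBERT-cased-CF3")
# The Sentence Transformer body that produces the embeddings
print(type(model.model_body))  # sentence_transformers.SentenceTransformer
# The scikit-learn LogisticRegression head fitted on those embeddings
print(type(model.model_head))  # sklearn.linear_model.LogisticRegression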

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
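
As a rough sketch of what this two-step procedure looks like with the SetFit trainer API: the tiny dataset below simply reuses sentences from the label examples further down in this card, so it is illustrative only and not the data this model was actually trained on.

from datasets import Dataset
from setfit import SetFitModel, Trainer

# Illustrative toy data reusing example sentences from this card
train_dataset = Dataset.from_dict({
    "text": [
        "Israelin asevoimat on vahvistanut iskeneensä Jabaliaan.",
        "Käytännössä ne ovat lohkoketjun pätkällä yksilöityjä videoklippejä NBA-pelaajien huippuhetkistä.",
        "Epävarmuutta epidemian kulussa",
        "– Olen kova koripallofani ja junnuna keräsin perinteisiä keräilykortteja.",
    ],
    "label": [1, 1, 0, 0],
})

# Start from the Finnish BERT checkpoint used as the embedding body
model = SetFitModel.from_pretrained("TurkuNLP/bert-base-finnish-cased-v1")

trainer = Trainer(model=model, train_dataset=train_dataset)

# Step 1: contrastive fine-tuning of the Sentence Transformer body
# Step 2: fitting the LogisticRegression head on the fine-tuned embeddings
trainer.train()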

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: TurkuNLP/bert-base-finnish-cased-v1
  • Classification head: a scikit-learn LogisticRegression instance
  • Number of Classes: 2
  • Language: Finnish
  • Model size: ~125M parameters (F32)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
1 (checkable claim)
  • 'Israelin asevoimat on vahvistanut iskeneensä Jabaliaan.' ("The Israeli armed forces have confirmed striking Jabalia.")
  • 'Zaporižžjan ydinvoimala menetti hetkeksi yhteyden sähköverkkoon – Säteilyturvakeskus kertoo, miksi ydinonnettomuudelta vältyttiin' ("The Zaporizhzhia nuclear power plant briefly lost its connection to the power grid – the Radiation and Nuclear Safety Authority explains why a nuclear accident was avoided")
  • 'Käytännössä ne ovat lohkoketjun pätkällä yksilöityjä videoklippejä NBA-pelaajien huippuhetkistä.' ("In practice, they are video clips of NBA players' highlight moments, each identified by a piece of blockchain.")
0 (non-claim)
  • 'Kiurun mukaan palveluseteliä voitaisiin käyttää jatkossakin.' ("According to Kiuru, the service voucher could also be used in the future.")
  • 'Epävarmuutta epidemian kulussa' ("Uncertainty in the course of the epidemic")
  • '– Olen kova koripallofani ja junnuna keräsin perinteisiä keräilykortteja.' ("– I'm a big basketball fan and as a junior I collected traditional trading cards.")

Evaluation

Metrics

Label  Macro-F1 (10-fold cross-validated)
all    0.74
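
The card does not include the evaluation script. The following is only a rough sketch of how a 10-fold cross-validated macro-F1 could be computed for a setup like this one, not the authors' actual protocol; the stand-in data below would have to be replaced with the full labelled claim-detection dataset, which is not distributed here.

import numpy as np
from datasets import Dataset
from setfit import SetFitModel, Trainer
from sklearn.metrics import f1_score
from sklearn.model_selection import StratifiedKFold

# Stand-in data; substitute the real labelled claim-detection dataset
texts = [f"Esimerkkilause numero {i}." for i in range(40)]
labels = [i % 2 for i in range(40)]

scores = []
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True, random_state=42).split(texts, labels):
    # Train a fresh model on each fold and evaluate on the held-out sentences
    model = SetFitModel.from_pretrained("TurkuNLP/bert-base-finnish-cased-v1")
    trainer = Trainer(
        model=model,
        train_dataset=Dataset.from_dict({
            "text": [texts[i] for i in train_idx],
            "label": [labels[i] for i in train_idx],
        }),
    )
    trainer.train()
    preds = np.asarray(model.predict([texts[i] for i in test_idx]))
    scores.append(f1_score([labels[i] for i in test_idx], preds, average="macro"))

print(f"Macro-F1 (10-fold): {np.mean(scores):.2f}")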

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HY-Aalto-DIME/FinnClaim-detect-FinBERT-cased-CF3")
# Run inference
preds = model("– Ei siihen laitoksen puolesta ole mitään estettä.")  # "There is no obstacle to that on the facility's part."
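
Continuing from the snippet above, several sentences can be classified in one call, and the LogisticRegression head also exposes class probabilities through predict_proba. The inputs below simply reuse label examples from this card; the snippet is an illustration rather than part of the original usage example.

# Batched inference: one predicted label per sentence
preds = model.predict([
    "Israelin asevoimat on vahvistanut iskeneensä Jabaliaan.",
    "– Olen kova koripallofani ja junnuna keräsin perinteisiä keräilykortteja.",
])

# Per-class probabilities from the LogisticRegression head
probs = model.predict_proba([
    "Israelin asevoimat on vahvistanut iskeneensä Jabaliaan.",
])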

Training Details

Training Set Metrics

Training set  Min  Median   Max
Word count    1    11.2633  33

Label  Training Sample Count
0      182
1      874

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (4, 4)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 6
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
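
For reference, these values map onto SetFit's TrainingArguments roughly as follows. This is a sketch assuming the SetFit 1.0.3 API listed under Framework Versions; distance_metric and margin are default values relevant only to triplet-style losses and are omitted here.

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),                 # (embedding phase, classifier phase)
    num_epochs=(4, 4),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=6,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

Such an args object would be passed to the Trainer together with the model and training dataset, as in the training sketch earlier in this card.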

Framework Versions

  • Python: 3.11.9
  • SetFit: 1.0.3
  • Sentence Transformers: 3.2.0
  • Transformers: 4.44.0
  • PyTorch: 2.4.0+cu124
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1
