Purpose: Mobilizing
This model was trained on Facebook posts by Norwegian cabinet ministers of the Solberg governments (2013–2021) and was used in Karlsen, Kolltveit and Solheim (2025). The posts were hand-coded for the different roles and purposes they serve. Below, we recreate Table 1 from the paper, showing the five roles and four purposes. The model included here identifies posts whose purpose is to Mobilize; the SetFit models that identify the other roles and purposes are available here. In the paper, we use one model for each purpose and each role, so each post can be ascribed more than one purpose or role.
Communicative purposes (columns) by communicative roles (rows):

| Communicative roles | Informing | Communication | Mobilizing | Branding |
|---|---|---|---|---|
| Ministry head | | | | |
| Cabinet member | | | | |
| Party politician | | | | |
| Individual politician | | | | |
| Private person | | | | |
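Because one binary model is trained per role and per purpose, a single post can receive several labels. A minimal sketch of that multi-label setup, with toy stand-in classifiers in place of the downloaded per-label SetFit models:

```python
# Sketch: combine several binary classifiers into a multi-label output.
# The lambdas below are toy stand-ins for the per-label SetFit models
# (in practice: SetFitModel.from_pretrained(...) for each role/purpose).
def assign_labels(post, classifiers):
    """Return every label whose binary classifier fires on the post."""
    return [label for label, clf in classifiers.items() if clf(post) == 1]

classifiers = {
    "Mobilizing": lambda text: 1 if "bli med" in text.lower() else 0,  # toy rule
    "Informing": lambda text: 1,  # toy rule that always fires
}

labels = assign_labels("Bli med på valgkamp!", classifiers)
print(labels)  # → ['Mobilizing', 'Informing']
```

With the real models, each `clf` call would be a `SetFitModel` prediction on the post text; the aggregation logic stays the same.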
This is a SetFit model that can be used for Text Classification of Norwegian social media posts. It uses NbAiLab/nb-sbert-base as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.
It has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
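The second step can be sketched with scikit-learn: the fine-tuned Sentence Transformer body maps each post to a fixed-length embedding, and a LogisticRegression head is fit on those vectors. Random vectors stand in here for the real nb-sbert-base embeddings (assumed 768-dimensional, as for BERT-base models):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for sentence embeddings produced by the fine-tuned
# NbAiLab/nb-sbert-base body.
X_train = rng.normal(size=(16, 768))    # 16 "labelled posts"
y_train = np.array([0, 1] * 8)          # 1 = Mobilizing, 0 = not

# The classification head: fit on embedding features.
head = LogisticRegression(max_iter=1000).fit(X_train, y_train)

X_new = rng.normal(size=(2, 768))       # embeddings of two new posts
preds = head.predict(X_new)             # binary label per post
print(preds.shape)  # (2,)
```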
Model Details
Model Description
- Model Type: SetFit
- Sentence Transformer body: NbAiLab/nb-sbert-base
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 75 tokens
- Number of Classes: 1
- Language: Norwegian (Bokmål)
Model Sources
- Repository: SetFit on GitHub
- Paper: Efficient Few-Shot Learning Without Prompts
- Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts
Uses
Direct Use for Inference
First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference:

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("oyvindbs/setfit-minister-mobilize-nb-sbert-base")

# Run inference
preds = model("I loved the spiderman movie!")
```
Training Details
Framework Versions
- Python: 3.10.4
- SetFit: 1.1.1
- Sentence Transformers: 3.4.1
- Transformers: 4.50.1
- PyTorch: 2.5.1+cu118
- Datasets: 2.19.0
- Tokenizers: 0.21.0
Citation
```bibtex
@article{KarlsenKolltveitSolheim,
  author  = {Karlsen, Rune and Kolltveit, Kristoffer and Solheim, Øyvind Bugge},
  title   = {Balancing Acts: The communicative roles of cabinet ministers on social media},
  journal = {Media and Communication},
  year    = {2025},
  volume  = {13},
  doi     = {10.17645/mac.10416}
}
```
Please also cite the SetFit paper:
```bibtex
@article{tunstall2022setfit,
  doi       = {10.48550/ARXIV.2209.11055},
  url       = {https://arxiv.org/abs/2209.11055},
  author    = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords  = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title     = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year      = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```