SetFit with avsolatorio/GIST-Embedding-v0

This is a SetFit model for text classification. It uses avsolatorio/GIST-Embedding-v0 as the Sentence Transformer embedding model and a SetFitHead instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
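
A minimal sketch of these two steps with the SetFit Trainer is shown below; the toy datasets, the label count of 2, and the hyperparameter values here are illustrative assumptions, not the settings used for this model (those are listed under Training Hyperparameters):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Toy data for illustration only; the real training set is not shipped with this card.
train_dataset = Dataset.from_dict({
    "text": [
        "heavy rain and snow flooded several areas",
        "a cold wave caused evacuations in the region",
        "the committee published its annual budget report",
        "local elections were held without incident",
    ],
    "label": [1, 1, 0, 0],
})
eval_dataset = Dataset.from_dict({
    "text": ["snow storms displaced thousands of families"],
    "label": [1],
})

# Load the embedding body and attach a differentiable SetFitHead.
model = SetFitModel.from_pretrained(
    "avsolatorio/GIST-Embedding-v0",
    use_differentiable_head=True,
    head_params={"out_features": 2},  # assumed number of labels
)

# Step 1 (contrastive fine-tuning of the body) and step 2 (training the
# classification head) are both driven by Trainer.train().
args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)
trainer.train()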

Model Details

Model Description

Model Sources

Evaluation

Metrics

Label   Accuracy
all     0.6
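
As a rough sketch, a figure of this kind can be reproduced with the Trainer from the training sketch above; SetFit's default metric is accuracy, so evaluate() returns a single accuracy value:

# Evaluate on a held-out split (eval_dataset is the assumed dataset from the sketch above).
metrics = trainer.evaluate(eval_dataset)
print(metrics)  # e.g. {'accuracy': 0.6}, the value reported for this model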

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("AlexBayer/GIST_SetFit_HIPs_v1")
# Run inference
preds = model("occupied palestinian territory cold wave dec 2013 cold wave event lasted unknown announced heavy rain fall snow storm hit west bank gaza 10 december 2013 still affecting palestinian population west bank palestine heavy rain snow generated flood several part palestine thousand family evacuated house extreme weather condition also caused several death including baby gaza reported dead family home inundated ifrc 16 dec 2013 useful link ocha opt winter storm online system palestinian red crescent society occupied palestinian territory")
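
The model also accepts a batch of texts; predict and predict_proba are part of the SetFitModel API (the example strings below are made up):

# Batch prediction on several hypothetical texts
texts = [
    "cold wave and snow storm hit the west bank",
    "quarterly financial results were announced today",
]
preds = model.predict(texts)        # predicted labels, one per text
probs = model.predict_proba(texts)  # per-class probabilities from the SetFitHead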

Training Details

Training Set Metrics

Training set   Min   Median     Max
Word count     34    319.4125   2470

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (1, 16)
  • max_steps: -1
  • sampling_strategy: undersampling
  • body_learning_rate: (3.318622110926711e-05, 3.5664318062183154e-05)
  • head_learning_rate: 0.025092743459786394
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: True
  • use_amp: True
  • warmup_proportion: 0.1
  • l2_weight: 0.05
  • max_length: 512
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
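
These entries correspond one-to-one to fields of setfit.TrainingArguments; a sketch of the equivalent configuration is below (values copied from the list above; loss and distance_metric match SetFit's defaults, CosineSimilarityLoss and cosine distance, and are therefore omitted):

from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 2),    # (embedding fine-tuning, classifier head)
    num_epochs=(1, 16),    # (embedding fine-tuning, classifier head)
    max_steps=-1,
    sampling_strategy="undersampling",
    body_learning_rate=(3.318622110926711e-05, 3.5664318062183154e-05),
    head_learning_rate=0.025092743459786394,
    margin=0.25,
    end_to_end=True,
    use_amp=True,
    warmup_proportion=0.1,
    l2_weight=0.05,
    max_length=512,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)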

Training Results

Epoch    Step   Training Loss   Validation Loss
0.1534   25     0.2384          -
0.3067   50     0.1621          -
0.4601   75     0.1389          -
0.6135   100    0.1214          -
0.7669   125    0.1115          -
0.9202   150    0.0927          -

Framework Versions

  • Python: 3.11.12
  • SetFit: 1.1.2
  • Sentence Transformers: 3.4.1
  • Transformers: 4.51.3
  • PyTorch: 2.6.0+cu124
  • Datasets: 3.5.1
  • Tokenizers: 0.21.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}