saattrupdan/nbailab-base-ner-scandi

Token Classification · Transformers · PyTorch · Safetensors · bert
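
The tags identify this as a BERT-based token-classification (named-entity recognition) checkpoint served through the Transformers library. As a minimal, hedged sketch of how such a checkpoint is typically used with the standard `pipeline` API (the call and the example sentence are assumptions, not taken from this page):

    from transformers import pipeline

    # Load the NER pipeline from the Hub checkpoint listed above.
    ner = pipeline(
        "token-classification",
        model="saattrupdan/nbailab-base-ner-scandi",
        aggregation_strategy="first",  # merge word pieces into whole entities
    )

    # Hypothetical Danish example input, purely for illustration.
    print(ner("Hans bor i København og arbejder hos Novo Nordisk."))
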
  • 2 contributors
  • History: 44 commits
  • Latest commit: Update README.md by saattrupdan (1a2f925, verified, 5 months ago)
  • .gitattributes (1.23 kB): Adding `safetensors` variant of this model (#2), almost 2 years ago
  • .gitignore (13 Bytes): add model, over 3 years ago
  • README.md (6.14 kB): Update README.md, 5 months ago
  • config.json (1.21 kB): add model, over 3 years ago
  • model.safetensors (709 MB, LFS): Adding `safetensors` variant of this model (#2), almost 2 years ago
  • pytorch_model.bin (709 MB, LFS): add model, over 3 years ago
  • special_tokens_map.json (112 Bytes): add model, over 3 years ago
  • tokenizer.json (1.96 MB): add model, over 3 years ago
  • tokenizer_config.json (373 Bytes): fix: Add model_max_length to tokeniser config, over 3 years ago
  • training_args.bin (2.67 kB, LFS): add model, over 3 years ago
    Detected Pickle imports (4): transformers.trainer_utils.SchedulerType, transformers.training_args.TrainingArguments, torch.device, transformers.trainer_utils.IntervalStrategy. See the loading note after this list.
  • vocab.txt (996 kB): add model, over 3 years ago
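
The pickle warning applies to training_args.bin, a pickled TrainingArguments object that is only relevant for reproducing training, not for inference. Because the repository also ships a model.safetensors variant alongside pytorch_model.bin, a minimal sketch of loading the model without deserialising any pickle files could look like the following (assuming a transformers version recent enough to accept the `use_safetensors` argument):

    from transformers import AutoModelForTokenClassification, AutoTokenizer

    model_id = "saattrupdan/nbailab-base-ner-scandi"

    # Tokenizer files are listed above: tokenizer.json, vocab.txt, tokenizer_config.json.
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    # Prefer the safetensors weights over pytorch_model.bin so no pickle is loaded.
    model = AutoModelForTokenClassification.from_pretrained(model_id, use_safetensors=True)
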