
Quantization made by Richard Erkhov.

Github

Discord

Request more models

gpt2-medium-italian-embeddings - GGUF

Original model description:

language: it
tags:
- adaption
- recycled
- gpt2-medium
pipeline_tag: text-generation

GPT-2 recycled for Italian (medium, adapted lexical embeddings)

Wietse de Vries • Malvina Nissim

Model description

This model is based on the medium OpenAI GPT-2 (gpt2-medium) model.

The Transformer layer weights in this model are identical to those of the original English model, but the lexical layer has been retrained for an Italian vocabulary.

For details, check out our paper on arXiv and the code on GitHub.

Related models

Dutch

Italian

How to use

from transformers import pipeline

pipe = pipeline("text-generation", model="GroNLP/gpt2-medium-italian-embeddings")
from transformers import AutoTokenizer, AutoModel, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("GroNLP/gpt2-medium-italian-embeddings")
model = AutoModel.from_pretrained("GroNLP/gpt2-medium-italian-embeddings")  # PyTorch
model = TFAutoModel.from_pretrained("GroNLP/gpt2-medium-italian-embeddings")  # TensorFlow
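As a minimal sketch of what generation with the pipeline looks like (the Italian prompt below is an arbitrary example, not from the original card; sampled output varies between runs):

```python
from transformers import pipeline

# Load the adapted Italian GPT-2 as a text-generation pipeline.
pipe = pipeline("text-generation", model="GroNLP/gpt2-medium-italian-embeddings")

# Arbitrary example prompt; do_sample=True makes the continuation stochastic.
result = pipe("La cucina italiana è", max_new_tokens=20, do_sample=True)
print(result[0]["generated_text"])
```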

BibTeX entry

@misc{devries2020good,
      title={As good as new. How to successfully recycle English GPT-2 to make models for other languages}, 
      author={Wietse de Vries and Malvina Nissim},
      year={2020},
      eprint={2012.05628},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
GGUF details

Model size: 365M params
Architecture: gpt2
Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit