# CLIP-Spanish
CLIP-Spanish is a CLIP-like model for the Spanish language. It combines BERTIN as the text encoder with the ViT-B/32 image encoder from CLIP. The model is implemented in Flax, including training scripts (see training.md).
This model was developed as part of the Flax/JAX Community Week, organised by HuggingFace, with TPU usage sponsored by Google.
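
As a rough illustration of the dual-encoder setup described above, the sketch below loads BERTIN as the Flax text tower and the ViT-B/32 vision tower from CLIP via transformers, and encodes a caption and an image. The checkpoint names and the use of raw pooled outputs are assumptions for illustration; the actual CLIP-Spanish checkpoint adds projection heads trained on WIT that map both towers into a shared embedding space before computing similarities.

```python
# Minimal sketch of the dual-encoder setup (checkpoint names are assumptions;
# the trained CLIP-Spanish model adds learned projection heads on both towers).
import jax.numpy as jnp
import requests
from PIL import Image
from transformers import (
    AutoTokenizer,
    CLIPFeatureExtractor,
    FlaxAutoModel,
    FlaxCLIPVisionModel,
)

# Text tower: BERTIN (Spanish RoBERTa), loaded with its Flax weights.
text_model = FlaxAutoModel.from_pretrained("bertin-project/bertin-roberta-base-spanish")
tokenizer = AutoTokenizer.from_pretrained("bertin-project/bertin-roberta-base-spanish")

# Image tower: ViT-B/32 vision encoder from OpenAI CLIP.
vision_model = FlaxCLIPVisionModel.from_pretrained("openai/clip-vit-base-patch32")
feature_extractor = CLIPFeatureExtractor.from_pretrained("openai/clip-vit-base-patch32")

# Encode a Spanish caption (first-token pooling, as a stand-in for the text head).
text_inputs = tokenizer(["un gato sentado en un sofá"], return_tensors="np", padding=True)
text_emb = text_model(**text_inputs).last_hidden_state[:, 0, :]

# Encode an image with the CLIP vision tower.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
pixel_values = feature_extractor(images=image, return_tensors="np").pixel_values
image_emb = vision_model(pixel_values=pixel_values).pooler_output

# In the trained model, both embeddings are first projected into a shared space
# and then compared with a cosine similarity; without those projection heads
# the similarity below is only schematic.
def cosine_similarity(a, b):
    a = a / jnp.linalg.norm(a, axis=-1, keepdims=True)
    b = b / jnp.linalg.norm(b, axis=-1, keepdims=True)
    return (a * b).sum(axis=-1)
```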
## Spanish WIT
We used a subset of 141,230 Spanish captions from the WIT (Wikipedia-based Image Text) dataset for training.
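
The sketch below shows how such a Spanish subset could be extracted from the public WIT TSV dumps. The shard file name and the column names (`language`, `image_url`, `caption_reference_description`) are assumptions based on the public WIT release, not the project's actual preprocessing script.

```python
# Hypothetical filtering of a WIT TSV shard down to Spanish caption/image pairs.
# File path and column names are assumptions based on the public WIT release.
import pandas as pd

wit = pd.read_csv("wit_v1.train.all-00000-of-00010.tsv", sep="\t")

# Keep only rows with a Spanish reference caption and an image URL.
spanish = wit[
    (wit["language"] == "es")
    & wit["caption_reference_description"].notna()
][["image_url", "caption_reference_description"]]

spanish.to_csv("wit_spanish_captions.tsv", sep="\t", index=False)
print(f"Kept {len(spanish)} Spanish caption/image pairs")
```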
## Team members
- Eduardo González Ponferrada (edugp)
- Manu Romero (mrm8488)
- María Grandury (mariagrandury)