ARC-Encoder models

This page hosts ARC8-Encoder_multi, one of three released versions of pretrained ARC-Encoders. The architectures and training methods are described in the paper ARC-Encoder: learning compressed text representations for large language models, available here. Code to reproduce the pretraining, further fine-tune the encoders, or evaluate them on downstream tasks is available in the ARC-Encoder repository.

Model Details

All the encoders released here are built on a Llama3.2-3B base backbone and trained on web-crawl data filtered using Dactory. The release consists of two ARC-Encoders, each trained specifically for a single decoder, and one trained for two decoders at the same time:

  • ARC8-Encoder_Llama, trained on 2.6B tokens specifically for Llama3.1-8B base, with a pooling factor of 8 (see the sketch after this list).
  • ARC8-Encoder_Mistral, trained on 2.6B tokens specifically for Mistral-7B base, with a pooling factor of 8.
  • ARC8-Encoder_multi, trained by sampling between the two decoders, with a pooling factor of 8.
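
A pooling factor (PF) of 8 means the encoder emits one compressed vector for every 8 input tokens. The following is a minimal sketch of that idea, assuming simple mean pooling over contiguous token groups; the actual ARC-Encoder pooling operation is defined in the paper and may differ from this illustration.

# Minimal sketch of a pooling factor, NOT the ARC-Encoder's actual pooling.
import torch

def pool(embeddings: torch.Tensor, pf: int = 8) -> torch.Tensor:
    """Compress a (seq_len, dim) sequence to (seq_len // pf, dim)."""
    seq_len, dim = embeddings.shape
    usable = (seq_len // pf) * pf           # drop the ragged tail for simplicity
    groups = embeddings[:usable].view(-1, pf, dim)
    return groups.mean(dim=1)               # one vector per group of pf tokens

x = torch.randn(128, 3072)                  # e.g. 128 token embeddings, Llama3.2-3B hidden size
print(pool(x).shape)                        # torch.Size([16, 3072]): 8x fewer vectors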

Uses

As described in the paper, the pretrained ARC-Encoders can be fine-tuned to perform various downstream tasks. You can also adapt an ARC-Encoder to a new pooling factor (PF) by fine-tuning it on the desired PF. For optimal results, we recommend fine-tuning toward a lower PF than the one used during pretraining. To reproduce the results presented in the paper, you can use our released fine-tuning dataset, ARC_finetuning.
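
As a starting point, the released artifacts can be fetched from the Hugging Face Hub. This is a minimal sketch: the model repository id kyutai/ARC8_Encoder_multi comes from this page, while the dataset id kyutai/ARC_finetuning is an assumption; refer to the ARC-Encoder repository for the supported loading and fine-tuning entry points.

# Hedged sketch: download the released encoder weights and fine-tuning data.
from huggingface_hub import snapshot_download
from datasets import load_dataset

encoder_path = snapshot_download(repo_id="kyutai/ARC8_Encoder_multi")  # model repo from this page
finetuning_data = load_dataset("kyutai/ARC_finetuning")                # assumed dataset id
print(encoder_path)
print(finetuning_data)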

Licensing

ARC-Encoders are licensed under the CC-BY 4.0 license.

Terms of use: As the released models are pretrained from the Llama3.2-3B backbone, ARC-Encoders are subject to the Llama Terms of Use found at Llama license.

Citations

If you use one of these models, please cite:

@techreport{pilchen2025arc_encoder,
 title={ARC-Encoder: learning compressed text representations for large language models},
 author={Pilchen, Hippolyte and Grave, Edouard and P{\'e}rez, Patrick},
 year={2025}
}