TESS 2 v0.1 Base

This is TESS 2, a simplex-based diffusion language model adapted from Mistral v0.1 7B and further trained on Dolma 1.7. For more details, please check out our paper, TESS 2: A Large-Scale Generalist Diffusion Language Model. This checkpoint is the variant based on Mistral v0.1.
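As a rough, hedged illustration of what "simplex-based" means here (a conceptual sketch only; the scaling constant and noise schedule below are placeholders, not the paper's exact formulation), tokens are mapped to near-one-hot logit vectors over the vocabulary and then corrupted with Gaussian noise in that continuous space:

import torch

def tokens_to_simplex_logits(token_ids, vocab_size, k=5.0):
    # Map discrete token ids (a LongTensor) to "almost one-hot" logit vectors
    # over the vocabulary: +k at the true token's index and -k everywhere else.
    logits = torch.full((*token_ids.shape, vocab_size), -k)
    logits.scatter_(-1, token_ids.unsqueeze(-1), k)
    return logits

def add_noise(simplex_logits, t, num_steps=1000):
    # Gaussian forward process on the continuous logits; the linear schedule
    # here is a stand-in, not the schedule used for TESS 2.
    alpha_bar = 1.0 - t / num_steps
    noise = torch.randn_like(simplex_logits)
    return (alpha_bar ** 0.5) * simplex_logits + ((1.0 - alpha_bar) ** 0.5) * noise

# Example: corrupt a short sequence at timestep 500 of 1000.
token_ids = torch.tensor([[1, 5, 2]])
x0 = tokens_to_simplex_logits(token_ids, vocab_size=10)
xt = add_noise(x0, t=500)

The model is then trained to recover the clean token distribution from such noised inputs.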

This is the diffusion-adapted base model, which has not yet undergone instruction tuning. We recommend further tuning it on your dataset of interest, or checking out the instruction-tuned version.

This model only works with our custom codebase, found here; please see that repository for details on how to run training.
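As a hedged sketch (assuming you have the huggingface_hub package installed), you can fetch this checkpoint locally before pointing the codebase's training or inference scripts at it:

from huggingface_hub import snapshot_download

# Download the full checkpoint (config, tokenizer, safetensors weights) to a local directory.
local_dir = snapshot_download(repo_id="hamishivi/tess2-v0.1-base")
print(local_dir)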

Citation

If you find this work useful, please cite it as follows.

@misc{taeivison2025tess2,
  title={{TESS 2: A Large-Scale Generalist Diffusion Language Model}},
  author={Jaesung Tae and Hamish Ivison and Sachin Kumar and Arman Cohan},
  year={2025},
  eprint={2502.13917},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2502.13917},
}