A DistilGPT2 English language model fine-tuned on mathematical proofs extracted from arXiv.org LaTeX sources, 1992 to 2020.

The proofs were lightly cleaned before training. In particular, they substitute placeholder tokens:

  • CITE for any citation
  • REF for any reference
  • MATH for any LaTeX mathematical formula
  • CASE: for any \item or labeled subcase
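
The substitutions above can be approximated with regular expressions. The exact cleaning pipeline is not published here, so the patterns below are illustrative assumptions (they handle only inline `$...$` math and the most common `\cite`/`\ref` forms), not the author's actual code:

```python
import re

def clean_proof(latex: str) -> str:
    """Sketch of the placeholder substitutions described above (assumed, not official)."""
    # \cite{...}, optionally with a bracketed note, becomes CITE
    latex = re.sub(r'\\cite\*?(?:\[[^\]]*\])?\{[^}]*\}', 'CITE', latex)
    # \ref{...} or \eqref{...} becomes REF
    latex = re.sub(r'\\(?:eq)?ref\{[^}]*\}', 'REF', latex)
    # inline $...$ formulas become MATH (display math would need more cases)
    latex = re.sub(r'\$[^$]+\$', 'MATH', latex)
    # \item markers become CASE:
    latex = re.sub(r'\\item\s*', 'CASE: ', latex)
    return latex

print(clean_proof(r"By \cite{fermat1637} and $x^n + y^n = z^n$, see \ref{thm:main}."))
# → By CITE and MATH, see REF.
```

This mirrors the token conventions a user should expect in both prompts and generated text: feeding the model prompts containing MATH, CITE, and REF placeholders matches its training distribution.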
Safetensors model size: 88.2M params (tensor types F32, U8).