ALBERT for Math AR
This model was further pre-trained on questions and answers from Mathematics StackExchange. It is based on ALBERT base v2 and uses the same tokenizer. In addition to the pre-training, the model was fine-tuned for Math Question Answer Retrieval: the sequence classification head outputs a relevance score when the question is passed as the first segment and the answer as the second segment. This relevance score can be used to rank candidate answers for retrieval.
Usage
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The model reuses the ALBERT base v2 tokenizer
tokenizer = AutoTokenizer.from_pretrained("albert-base-v2")
model = AutoModelForSequenceClassification.from_pretrained("AnReu/albert-for-math-ar-base-ft")
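To rank answers, each (question, answer) pair is encoded together and scored with the classification head. The snippet below is a minimal sketch of this; the example question and answers are illustrative, and the assumption that the second logit corresponds to the "relevant" class should be checked against the model's config.

import torch

question = "How do I integrate x * exp(x)?"  # illustrative example, not from the model card
candidate_answers = [
    "Use integration by parts with u = x and dv = exp(x) dx.",
    "Apply the quadratic formula to both sides.",
]

# Encode each pair: question as the first segment, answer as the second segment
inputs = tokenizer(
    [question] * len(candidate_answers),
    candidate_answers,
    padding=True,
    truncation=True,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: index 1 is the "relevant" class; take its probability as the relevance score
scores = torch.softmax(logits, dim=-1)[:, 1]

# Rank candidate answers by descending relevance score
for idx in scores.argsort(descending=True).tolist():
    print(f"{scores[idx].item():.3f}  {candidate_answers[idx]}")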
Reference
If you use this model, please consider citing our paper:
@inproceedings{reusch2021tu_dbs,
  title={TU\_DBS in the ARQMath Lab 2021, CLEF},
  author={Reusch, Anja and Thiele, Maik and Lehner, Wolfgang},
  year={2021},
  organization={CLEF}
}