Update README.md
README.md CHANGED
@@ -10,7 +10,7 @@ tags:
 
 # DisorRoBERTa
 
-DisorRoBERTa is a double-domain adaptation of a RoBERTa language model
+DisorRoBERTa is a double-domain adaptation of a RoBERTa language model (a variation of [DisorBERT](https://aclanthology.org/2023.acl-long.853/)). First, it is adapted to social media language, and then to the mental health domain. In both steps, it incorporates a lexical resource to guide the masking process of the language model, helping it pay more attention to words related to mental disorders.
 
 We follow the standard procedure for fine-tuning a masked language model in [Huggingface’s NLP Course](https://huggingface.co/learn/nlp-course/chapter7/3?fw=pt) 🤗.
 
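For reference, the "standard procedure" from the NLP Course that the README points to boils down to the sketch below. The checkpoint name, corpus file, and training hyperparameters are placeholders for illustration only, not the actual settings used to train DisorRoBERTa.

```python
# Minimal sketch of masked-language-model fine-tuning, following the
# Hugging Face NLP Course. Checkpoint, corpus, and hyperparameters are
# placeholders, not the real DisorRoBERTa training configuration.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "roberta-base"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Placeholder corpus; the social-media and mental-health adaptation corpora
# are not part of this sketch.
raw = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# Uniform 15% random masking, as in the course. DisorRoBERTa additionally
# biases the masking toward lexicon terms (see the next sketch).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="mlm-finetune",
    num_train_epochs=1,
    per_device_train_batch_size=16,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```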
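The lexicon-guided masking mentioned in the description can be approximated as in the following sketch: tokens belonging to a mental-disorder lexicon are masked with a higher probability than ordinary tokens. This is an illustration only, assuming a toy lexicon and arbitrary probabilities; it is not the authors' implementation or the actual lexical resource.

```python
# Illustrative sketch of lexicon-guided masking. The lexicon and the masking
# probabilities below are assumptions for demonstration purposes.
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # placeholder tokenizer

# Toy lexicon; multi-subword terms are approximated by collecting all their
# subword ids.
lexicon = {"anxiety", "depression", "insomnia"}
lexicon_ids = {
    tid
    for word in lexicon
    for tid in tokenizer(" " + word, add_special_tokens=False)["input_ids"]
}

def guided_mask(input_ids: torch.Tensor, base_p: float = 0.15, lex_p: float = 0.5):
    """Return (masked_input_ids, labels) with lexicon tokens masked more often."""
    labels = input_ids.clone()
    # Per-token masking probability: higher for lexicon tokens.
    probs = torch.full(input_ids.shape, base_p)
    for tid in lexicon_ids:
        probs[input_ids == tid] = lex_p
    # Never mask special tokens.
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(
            input_ids.tolist(), already_has_special_tokens=True
        ),
        dtype=torch.bool,
    )
    probs[special] = 0.0
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100  # only masked positions contribute to the MLM loss
    out = input_ids.clone()
    out[masked] = tokenizer.mask_token_id
    return out, labels

# Example: mask a single encoded sentence.
enc = tokenizer("I could not sleep because of my anxiety", return_tensors="pt")
masked_ids, labels = guided_mask(enc["input_ids"][0])
```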