Add link to GitHub repository
This PR improves the model card by adding an explicit link to the GitHub repository, making it easier for users to find the source code.
README.md CHANGED
@@ -1,7 +1,7 @@
 ---
-pipeline_tag: feature-extraction
 library_name: transformers
 license: apache-2.0
+pipeline_tag: feature-extraction
 ---
 
 # Overview
@@ -9,6 +9,7 @@ license: apache-2.0
 This repository contains an encoder model, part of the research presented in the paper *Should We Still Pretrain Encoders with Masked Language Modeling?* (Gisserot-Boukhlef et al.).
 
 * **Paper:** [Should We Still Pretrain Encoders with Masked Language Modeling?](https://huggingface.co/papers/2507.00994)
+* **Code:** [https://github.com/Nicolas-BZRD/EuroBERT](https://github.com/Nicolas-BZRD/EuroBERT)
 * **Blog post:** [Link](https://huggingface.co/blog/Nicolas-BZRD/encoders-should-not-be-only-pre-trained-with-mlm)
 * **Project page:** [https://hf.co/MLMvsCLM](https://hf.co/MLMvsCLM)
 
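For context, the card's front matter declares `library_name: transformers` and `pipeline_tag: feature-extraction`. Below is a minimal sketch of what that pairing implies for usage; the repo id is a placeholder, since this excerpt does not name the actual checkpoint, and some encoder releases may additionally require `trust_remote_code=True`.

```python
# Minimal feature-extraction sketch matching the card's front matter
# (library_name: transformers, pipeline_tag: feature-extraction).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "your-org/your-encoder-checkpoint"  # placeholder: substitute the actual model repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Should we still pretrain encoders with MLM?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden state into a single sentence embedding.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size)
```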