emanuelaboros committed · Commit 7c1adc2 · 1 Parent(s): 24af366

review readme

Files changed (1)
  1. README.md +7 -12
README.md CHANGED
@@ -129,19 +129,14 @@ This model was adapted for historical texts and fine-tuned on the [HIPE-2022 dat
 
 ### Model Description
 
- - **Developed by:** [Impresso team](https://impresso-project.ch/). [Impresso - Media Monitoring of the Past](https://impresso-project.ch) is an
- interdisciplinary research project that aims to develop and consolidate tools for
- processing and exploring large collections of media archives across modalities, time,
- languages and national borders. The first project (2017-2021) was funded by the Swiss
- National Science Foundation under grant
- No. [CRSII5_173719](http://p3.snf.ch/project-173719) and the second project (2023-2027)
- by the SNSF under grant No. [CRSII5_213585](https://data.snf.ch/grants/grant/213585)
- and the Luxembourg National Research Fund under grant No. 17498891.
- - **Model type:** Stacked BERT-based token classification model for named entity recognition
- - **Languages supported:** multilingual (over 100 languages, optimized for fr, de, en)
- - **License:** [GNU Affero General Public License v3 or later](https://github.com/impresso/impresso-pyindexation/blob/master/LICENSE)
- - **Finetuned from model:** [dbmdz/bert-medium-historic-multilingual-cased](https://huggingface.co/dbmdz/bert-medium-historic-multilingual-cased)
+ ### Model Description
 
+ - **Developed by:** [Impresso team](https://impresso-project.ch), an interdisciplinary project for large-scale media archive analysis across time, language, and modality. Funded by the Swiss National Science Foundation ([CRSII5_173719](http://p3.snf.ch/project-173719), [CRSII5_213585](https://data.snf.ch/grants/grant/213585)) and the Luxembourg National Research Fund (grant No. 17498891).
+ - **Model type:** mBART-based sequence-to-sequence model with constrained beam search for named entity linking
+ - **Languages:** Multilingual (100+ languages, optimized for French, German, and English)
+ - **License:** [AGPL v3+](https://github.com/impresso/impresso-pyindexation/blob/master/LICENSE)
+ - **Finetuned from:** [`facebook/mgenre-wiki`](https://huggingface.co/facebook/mgenre-wiki)
+ -
 ### Model Architecture
 
 - **Architecture:** mBART-based seq2seq with constrained beam search
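The revised card describes an mGENRE-style entity linker fine-tuned from `facebook/mgenre-wiki`. As a rough illustration of what that implies for usage, the sketch below loads the base checkpoint (the fine-tuned Impresso model's repo id is not given in this diff) and generates candidate entity names for a mention wrapped in mGENRE's `[START]`/`[END]` markers. Plain beam search stands in for the constrained decoding mentioned in the card, which would additionally need a prefix trie over valid entity titles.

```python
# Minimal sketch of mGENRE-style named entity linking with Hugging Face transformers.
# Assumptions: "facebook/mgenre-wiki" is used as a stand-in for the fine-tuned model
# (whose repo id is not given in this diff), and unconstrained beam search approximates
# the constrained decoding described in the model card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "facebook/mgenre-wiki"  # stand-in checkpoint; swap in the fine-tuned model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The mention to link is wrapped in [START] ... [END], following the mGENRE convention.
sentence = "Der [START] Bundesrat [END] tagte gestern in Bern."

inputs = tokenizer(sentence, return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=5,  # top-5 candidate entity names
    max_new_tokens=32,
)

# Each decoded sequence is a candidate Wikipedia-style entity title with a language tag.
for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)
```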