IMISLab committed on
Commit 53fc4b1 · verified · 1 Parent(s): ce1ce03

Update README.md

Files changed (1): README.md (+5 −5)
@@ -12,7 +12,7 @@ widget:
     example_title: 'Text'
 
 model-index:
-- name: IMISLab/GreekT5-umt5-base-greekwikipedia
+- name: IMISLab/GreekWiki-umt5-base
   results:
   - task:
       type: summarization
@@ -41,10 +41,10 @@ model-index:
       verified: true
 ---
 
-# GreekT5 (umt5-base-greekwikipedia)
+# GreekWiki (umt5-base)
 
 A Greek encyclopedic article summarization model trained and evaluated on [GreekWikipedia](https://huggingface.co/datasets/IMISLab/GreekWikipedia/).
-This model was trained as part of our research paper:
+This model was trained as part of our upcoming research paper:
 [Giarelis, N., Mastrokostas, C., & Karacapilidis, N. (2024) Greek Wikipedia: A Study on Abstractive Summarization]()
 For more information see the evaluation section below.
 
@@ -52,7 +52,7 @@ For more information see the evaluation section below.
 
 ## Training dataset
 
-The training dataset of `GreekT5-umt5-base-greekwikipedia` is [GreekWikipedia](), which is the first encyclopedic summarization dataset for the Greek Language.
+The training dataset of `GreekWiki (umt5-base)` is [GreekWikipedia](https://huggingface.co/datasets/IMISLab/GreekWikipedia/), which is the first encyclopedic summarization dataset for the Greek Language.
 This dataset contains 93,433 articles collected from the Greek part of [Wikipedia](https://el.wikipedia.org/).
 
 ## Training configuration
@@ -83,7 +83,7 @@ LEAD|18.51|3.18|11.48|65.77
 ```python
 from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline
 
-model_name = 'IMISLab/GreekT5-umt5-base-greekwikipedia'
+model_name = 'IMISLab/GreekWiki-umt5-base'
 model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
 tokenizer = AutoTokenizer.from_pretrained(model_name)