Improve model card: Add paper link, correct pipeline tag

#4
by nielsr (HF Staff) - opened
Files changed (1)
  1. README.md +6 -4
README.md CHANGED
@@ -1,12 +1,14 @@
 ---
-license: apache-2.0
 language:
 - pl
 library_name: transformers
+license: apache-2.0
+pipeline_tag: text-generation
 inference:
   parameters:
     temperature: 0.9
-extra_gated_description: If you want to learn more about how you can use the model, please refer to our <a href="https://bielik.ai/terms/">Terms of Use</a>.
+extra_gated_description: If you want to learn more about how you can use the model,
+  please refer to our <a href="https://bielik.ai/terms/">Terms of Use</a>.
 ---
 
 <p align="center">
@@ -15,7 +17,7 @@ extra_gated_description: If you want to learn more about how you can use the mod
 
 # Bielik-11B-v2
 
-Bielik-11B-v2 is a generative text model featuring 11 billion parameters. It is initialized from its predecessor, Mistral-7B-v0.2, and trained on 400 billion tokens.
+This repository hosts the model card for the model presented in the paper [Bielik 11B v2 Technical Report](https://huggingface.co/papers/2505.02410). Bielik-11B-v2 is a generative text model featuring 11 billion parameters. It is initialized from its predecessor, Mistral-7B-v0.2, and trained on 400 billion tokens.
 The aforementioned model stands as a testament to the unique collaboration between the open-science/open-source project SpeakLeash and the High Performance Computing (HPC) center: ACK Cyfronet AGH.
 Developed and trained on Polish text corpora, which have been cherry-picked and processed by the SpeakLeash team, this endeavor leverages Polish large-scale computing infrastructure, specifically within the PLGrid environment,
 and more precisely, the HPC center: ACK Cyfronet AGH. The creation and training of the Bielik-11B-v2 was propelled by the support of computational grant number PLG/2024/016951, conducted on the Athena and Helios supercomputer,
@@ -201,4 +203,4 @@ Members of the ACK Cyfronet AGH team providing valuable support and expertise:
 
 ## Contact Us
 
-If you have any questions or suggestions, please use the discussion tab. If you want to contact us directly, join our [Discord SpeakLeash](https://discord.gg/pv4brQMDTy).
+If you have any questions or suggestions, please use the discussion tab. If you want to contact us directly, join our [Discord SpeakLeash](https://discord.gg/pv4brQMDTy).
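
The corrected `pipeline_tag: text-generation` is what lets the Hub route this model to the right inference widget and to the matching `transformers` pipeline. A minimal sketch of what the card's metadata implies for users, assuming the repo id `speakleash/Bielik-11B-v2` (not stated in this diff) and a recent `transformers` install with `accelerate` available:

```python
import torch
from transformers import pipeline

# Text-generation pipeline matching the card's metadata
# (library_name: transformers, pipeline_tag: text-generation).
# The repo id below is an assumption; this PR only edits the card.
generator = pipeline(
    "text-generation",
    model="speakleash/Bielik-11B-v2",
    torch_dtype=torch.bfloat16,  # 11B parameters, so half precision is practical
    device_map="auto",           # requires `accelerate`
)

# temperature=0.9 mirrors the inference default declared in the card's YAML.
out = generator(
    "Najważniejsze fakty o Bieliku:",  # Polish prompt, matching `language: pl`
    do_sample=True,
    temperature=0.9,
    max_new_tokens=50,
)
print(out[0]["generated_text"])
```

Without the `pipeline_tag`, the Hub has to infer the task from the architecture; declaring it explicitly makes the widget and `pipeline(...)` behavior unambiguous.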