slippylolo committed
Commit 831952d (parent: 1a8b638)

Update model card

Files changed (1): README.md (+132 −12)

**Falcon-RW-1B is a 1B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 350B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb). It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-rw-1b/blob/main/LICENSE.txt).**

*Paper coming soon 😊.*

RefinedWeb is a high-quality web dataset built by leveraging stringent filtering and large-scale deduplication. Falcon-RW-1B, trained on RefinedWeb only, matches or outperforms comparable models trained on curated data.

⚠️ This model is intended for use as a **research artifact**, to study the influence of training on web data alone. **If you are interested in state-of-the-art models, we recommend using Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b), both trained on >1,000 billion tokens.**

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch

model = "tiiuae/falcon-rw-1b"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```

# Model Card for Falcon-RW-1B

## Model Details

### Model Description

- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English;
- **License:** [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-rw-1b/blob/main/LICENSE.txt).

### Model Source

- **Paper:** *coming soon*.

## Uses

### Out-of-Scope Use

Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.

Broadly speaking, we would recommend Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) for any use not directly related to research on web data pipelines.

## Bias, Risks, and Limitations

Falcon-RW-1B is trained on English data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on a large-scale corpus representative of the web, it will carry the stereotypes and biases commonly encountered online.

### Recommendations

We recommend that users of Falcon-RW-1B finetune it for their specific set of tasks of interest, and that guardrails and appropriate precautions be taken for any production use.

## How to Get Started with the Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch

model = "tiiuae/falcon-rw-1b"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```

## Training Details

### Training Data

Falcon-RW-1B was trained on 350B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), a high-quality filtered and deduplicated web dataset. The data was tokenized with the GPT-2 tokenizer.

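For concreteness, the GPT-2 tokenizer mentioned above can be loaded directly from `transformers`; this is only an illustration of the tokenization step, not part of the training pipeline:

```python
from transformers import AutoTokenizer

# GPT-2's byte-level BPE tokenizer, as used to tokenize RefinedWeb for Falcon-RW-1B.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Falcon-RW-1B was trained on 350B tokens of RefinedWeb."
ids = tokenizer(text)["input_ids"]

# Byte-level BPE is lossless, so decoding recovers the original string.
print(len(ids), tokenizer.decode(ids))
```
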
### Training Procedure

Falcon-RW-1B was trained on 32 A100 40GB GPUs, using only data parallelism with ZeRO.

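"Data parallelism with ZeRO" means each GPU holds a full model replica while optimizer state (and optionally gradients or parameters) is sharded across the 32 workers. The actual codebase (Gigatron, see below) is not public; as a purely illustrative analogue, a DeepSpeed-style configuration for such a setup might look like the sketch below (the keys are DeepSpeed's; the ZeRO stage is an assumption, not stated in the card):

```python
# Illustrative only: DeepSpeed-style config expressing pure data parallelism
# with ZeRO sharding. Batch size, precision, optimizer, learning rate, and
# weight decay come from the card; the ZeRO stage is assumed.
ds_config = {
    "train_batch_size": 512,          # global batch size from the card
    "bf16": {"enabled": True},        # bf16 precision from the card
    "zero_optimization": {
        "stage": 1,                   # shard optimizer state only (assumed)
    },
    "optimizer": {
        "type": "Adam",
        "params": {"lr": 2e-4, "weight_decay": 0.1},
    },
}
```
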
#### Training Hyperparameters

Hyperparameters were adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)).

- **Precision:** bf16;
- **Optimizer:** Adam;
- **Learning rate:** 2e-4 (500M tokens warm-up, followed by cosine decay to 2e-5);
- **Weight decay:** 0.1;
- **Batch size:** 512 (with a 4B tokens ramp-up).

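The learning-rate schedule above can be written out as a function of tokens seen. This is a sketch under the assumption that the cosine decay spans the remainder of the 350B-token run (the card does not state the decay horizon):

```python
import math

MAX_LR, MIN_LR = 2e-4, 2e-5   # peak and final learning rate, from the card
WARMUP_TOKENS = 500e6         # 500M-token linear warm-up
TOTAL_TOKENS = 350e9          # 350B training tokens (decay horizon assumed)

def lr_at(tokens: float) -> float:
    """Learning rate after `tokens` tokens: linear warm-up, then cosine decay."""
    if tokens < WARMUP_TOKENS:
        return MAX_LR * tokens / WARMUP_TOKENS
    progress = (tokens - WARMUP_TOKENS) / (TOTAL_TOKENS - WARMUP_TOKENS)
    return MIN_LR + 0.5 * (MAX_LR - MIN_LR) * (1 + math.cos(math.pi * progress))

print(lr_at(250e6))   # halfway through warm-up: 1e-4
print(lr_at(350e9))   # end of training: 2e-5
```
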
#### Speeds, Sizes, Times

Training happened in early December 2022 and took about six days.

## Evaluation

*Paper coming soon.*

## Technical Specifications

### Model Architecture and Objective

Falcon-RW-1B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).

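Concretely, the next-token objective is the average cross-entropy between the model's scores at each position and the token that actually follows. A dependency-free sketch with a toy vocabulary (illustrative only; in training this is computed over batches of logits):

```python
import math

def next_token_loss(logits, targets):
    """Average cross-entropy of predicting targets[t] from logits[t].

    logits:  list of per-position score vectors (length T, each of size V)
    targets: list of T token ids, where targets[t] is the token at position t+1
    """
    total = 0.0
    for scores, target in zip(logits, targets):
        z = max(scores)  # subtract the max for numerical stability
        log_norm = z + math.log(sum(math.exp(s - z) for s in scores))
        total += log_norm - scores[target]  # -log p(target | context)
    return total / len(targets)

# Toy example: vocabulary of 3 tokens, predict token 2 then token 1.
print(next_token_loss([[2.0, 0.0, 1.0], [0.0, 3.0, 0.0]], [2, 1]))
```
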
### Compute Infrastructure

#### Hardware

Falcon-RW-1B was trained on AWS SageMaker, on 32 A100 40GB GPUs in P4d instances.

#### Software

Falcon-RW-1B was trained using a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).

## Citation

*Paper coming soon 😊.*

## Contact