ArtusDev committed on
Commit
244bc0c
·
verified ·
1 Parent(s): 4d09bac

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ tekken.json filter=lfs diff=lfs merge=lfs -text
37
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,690 @@
1
+ ---
2
+ language:
3
+ - en
4
+ - fr
5
+ - de
6
+ - es
7
+ - pt
8
+ - it
9
+ - ja
10
+ - ko
11
+ - ru
12
+ - zh
13
+ - ar
14
+ - fa
15
+ - id
16
+ - ms
17
+ - ne
18
+ - pl
19
+ - ro
20
+ - sr
21
+ - sv
22
+ - tr
23
+ - uk
24
+ - vi
25
+ - hi
26
+ - bn
27
+ license: apache-2.0
28
+ library_name: vllm
29
+ inference: false
30
+ base_model:
31
+ - mistralai/Mistral-Small-3.2-24B-Instruct-2506
32
+ pipeline_tag: image-text-to-text
33
+ ---
34
+
35
+ # Mistral-Small-3.2-24B-Instruct-2506
36
+
37
+ Mistral-Small-3.2-24B-Instruct-2506 is a minor update of [Mistral-Small-3.1-24B-Instruct-2503](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503).
38
+
39
+ Small-3.2 improves in the following categories:
40
+ - **Instruction following**: Small-3.2 is better at following precise instructions
41
+ - **Repetition errors**: Small-3.2 produces fewer infinite generations and repetitive answers
42
+ - **Function calling**: Small-3.2's function calling template is more robust (see [here](https://github.com/mistralai/mistral-common/blob/535b4d0a0fc94674ea17db6cf8dc2079b81cbcfa/src/mistral_common/tokens/tokenizers/instruct.py#L778) and [examples](#function-calling))
43
+
44
+ In all other categories Small-3.2 should match or slightly improve compared to [Mistral-Small-3.1-24B-Instruct-2503](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503).
45
+
46
+ ## Key Features
47
+ - same as [Mistral-Small-3.1-24B-Instruct-2503](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503#key-features)
48
+
49
+ ## Benchmark Results
50
+
51
+ We compare Mistral-Small-3.2-24B to [Mistral-Small-3.1-24B-Instruct-2503](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503).
52
+ For more comparisons against other models of similar size, please check [Mistral-Small-3.1's benchmarks](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503#benchmark-results).
53
+
54
+ ### Text
55
+
56
+ #### Instruction Following / Chat / Tone
57
+
58
+ | Model | Wildbench v2 | Arena Hard v2 | IF (Internal; accuracy) |
59
+ |-------|---------------|---------------|------------------------|
60
+ | Small 3.1 24B Instruct | 55.6% | 19.56% | 82.75% |
61
+ | **Small 3.2 24B Instruct** | **65.33%** | **43.1%** | **84.78%** |
62
+
63
+ #### Infinite Generations
64
+
65
+ Small 3.2 reduces infinite generations by 2x on challenging, long, and repetitive prompts.
66
+
67
+ | Model | Infinite Generations (Internal; Lower is better) |
68
+ |-------|-------|
69
+ | Small 3.1 24B Instruct | 2.11% |
70
+ | **Small 3.2 24B Instruct** | **1.29%** |
71
+
72
+ #### STEM
73
+
74
+ | Model | MMLU | MMLU Pro (5-shot CoT) | MATH | GPQA Main (5-shot CoT) | GPQA Diamond (5-shot CoT )| MBPP Plus - Pass@5 | HumanEval Plus - Pass@5 | SimpleQA (TotalAcc)|
75
+ |--------------------------------|-----------|-----------------------|------------------------|------------------------|---------------------------|--------------------|-------------------------|--------------------|
76
+ | Small 3.1 24B Instruct | 80.62% | 66.76% | 69.30% | 44.42% | 45.96% | 74.63% | 88.99% | 10.43% |
77
+ | **Small 3.2 24B Instruct** | 80.50% | **69.06%** | 69.42% | 44.22% | 46.13% | **78.33%** | **92.90%** | **12.10%** |
78
+
79
+ ### Vision
80
+
81
+ | Model | MMMU | Mathvista | ChartQA | DocVQA | AI2D |
82
+ |--------------------------------|------------|-----------|-----------|-----------|-----------|
83
+ | Small 3.1 24B Instruct | **64.00%** | **68.91%**| 86.24% | 94.08% | 93.72% |
84
+ | **Small 3.2 24B Instruct** | 62.50% | 67.09% | **87.4%** | 94.86% | 92.91% |
85
+
86
+
87
+ ## Usage
88
+
89
+ The model can be used with the following frameworks:
90
+ - [`vllm (recommended)`](https://github.com/vllm-project/vllm): See [here](#vllm-recommended)
91
+ - [`transformers`](https://github.com/huggingface/transformers): See [here](#transformers)
92
+
93
+ **Note 1**: We recommend using a relatively low temperature, such as `temperature=0.15`.
94
+
95
+ **Note 2**: Make sure to add a system prompt to the model to best tailor it to your needs. If you want to use the model as a general assistant, we recommend using the one provided in the [SYSTEM_PROMPT.txt](https://huggingface.co/mistralai/Mistral-Small-3.2-24B-Instruct-2506/blob/main/SYSTEM_PROMPT.txt) file.
96
+
97
+ ### vLLM (recommended)
98
+
99
+ We recommend using this model with [vLLM](https://github.com/vllm-project/vllm).
100
+
101
+ #### Installation
102
+
103
+ Make sure to install [`vLLM >= 0.9.1`](https://github.com/vllm-project/vllm/releases/tag/v0.9.1):
104
+
105
+ ```
106
+ pip install vllm --upgrade
107
+ ```
108
+
109
+ Doing so should automatically install [`mistral_common >= 1.6.2`](https://github.com/mistralai/mistral-common/releases/tag/v1.6.2).
110
+
111
+ To check:
112
+ ```
113
+ python -c "import mistral_common; print(mistral_common.__version__)"
114
+ ```
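If you want to enforce the minimum version programmatically rather than eyeballing the printed string, a small comparison helper is enough. This is just a sketch; `version_at_least` is not part of `mistral_common`, and it assumes plain numeric `X.Y.Z` version strings:

```python
def version_at_least(version: str, minimum: str) -> bool:
    """Compare dotted numeric version strings, e.g. '1.10.0' >= '1.6.2'."""
    def parse(v: str) -> tuple:
        # Numeric comparison component by component; '1.10.0' > '1.6.2'.
        return tuple(int(part) for part in v.split("."))
    return parse(version) >= parse(minimum)

print(version_at_least("1.10.0", "1.6.2"))  # True
print(version_at_least("1.5.4", "1.6.2"))   # False
```

Note that a plain string comparison would get this wrong ("1.10.0" < "1.6.2" lexicographically), which is why the components are compared as integers.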
115
+
116
+ You can also use a ready-to-go [Docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or pull one from [Docker Hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39).
117
+
118
+ #### Serve
119
+
120
+ We recommend using Mistral-Small-3.2-24B-Instruct-2506 in a server/client setting.
121
+
122
+ 1. Spin up a server:
123
+
124
+ ```
125
+ vllm serve mistralai/Mistral-Small-3.2-24B-Instruct-2506 --tokenizer_mode mistral --config_format mistral --load_format mistral --tool-call-parser mistral --enable-auto-tool-choice --limit_mm_per_prompt 'image=10' --tensor-parallel-size 2
126
+ ```
127
+
128
+ **Note:** Running Mistral-Small-3.2-24B-Instruct-2506 on GPU requires ~55 GB of GPU RAM in bf16 or fp16.
129
+
130
+
131
+ 2. To query the server, you can use a simple Python client. See the following examples.
132
+
133
+
134
+ #### Vision reasoning
135
+
136
+ Leverage the vision capabilities of Mistral-Small-3.2-24B-Instruct-2506 to make the best choice in a given scenario. Gotta catch them all!
137
+
138
+ <details>
139
+ <summary>Python snippet</summary>
140
+
141
+ ```py
142
+ from datetime import datetime, timedelta
143
+
144
+ from openai import OpenAI
145
+ from huggingface_hub import hf_hub_download
146
+
147
+ # Modify OpenAI's API key and API base to use vLLM's API server.
148
+ openai_api_key = "EMPTY"
149
+ openai_api_base = "http://localhost:8000/v1"
150
+
151
+ TEMP = 0.15
152
+ MAX_TOK = 131072
153
+
154
+ client = OpenAI(
155
+ api_key=openai_api_key,
156
+ base_url=openai_api_base,
157
+ )
158
+
159
+ models = client.models.list()
160
+ model = models.data[0].id
161
+
162
+
163
+ def load_system_prompt(repo_id: str, filename: str) -> str:
164
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
165
+ with open(file_path, "r") as file:
166
+ system_prompt = file.read()
167
+ today = datetime.today().strftime("%Y-%m-%d")
168
+ yesterday = (datetime.today() - timedelta(days=1)).strftime("%Y-%m-%d")
169
+ model_name = repo_id.split("/")[-1]
170
+ return system_prompt.format(name=model_name, today=today, yesterday=yesterday)
171
+
172
+
173
+ model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
174
+ SYSTEM_PROMPT = load_system_prompt(model_id, "SYSTEM_PROMPT.txt")
175
+ image_url = "https://static.wikia.nocookie.net/essentialsdocs/images/7/70/Battle.png/revision/latest?cb=20220523172438"
176
+
177
+ messages = [
178
+ {"role": "system", "content": SYSTEM_PROMPT},
179
+ {
180
+ "role": "user",
181
+ "content": [
182
+ {
183
+ "type": "text",
184
+ "text": "What action do you think I should take in this situation? List all the possible actions and explain why you think they are good or bad.",
185
+ },
186
+ {"type": "image_url", "image_url": {"url": image_url}},
187
+ ],
188
+ },
189
+ ]
190
+
191
+
192
+ response = client.chat.completions.create(
193
+ model=model,
194
+ messages=messages,
195
+ temperature=TEMP,
196
+ max_tokens=MAX_TOK,
197
+ )
198
+
199
+ print(response.choices[0].message.content)
200
+ # In this situation, you are playing a Pokémon game where your Pikachu (Level 42) is facing a wild Pidgey (Level 17). Here are the possible actions you can take and an analysis of each:
201
+
202
+ # 1. **FIGHT**:
203
+ # - **Pros**: Pikachu is significantly higher level than the wild Pidgey, which suggests that it should be able to defeat Pidgey easily. This could be a good opportunity to gain experience points and possibly items or money.
204
+ # - **Cons**: There is always a small risk of Pikachu fainting, especially if Pidgey has a powerful move or a status effect that could hinder Pikachu. However, given the large level difference, this risk is minimal.
205
+
206
+ # 2. **BAG**:
207
+ # - **Pros**: You might have items in your bag that could help in this battle, such as Potions, Poké Balls, or Berries. Using an item could help you capture the Pidgey or heal your Pikachu if needed.
208
+ # - **Cons**: Using items might not be necessary given the level difference. It could be more efficient to just fight and defeat the Pidgey quickly.
209
+
210
+ # 3. **POKÉMON**:
211
+ # - **Pros**: You might have another Pokémon in your party that is better suited for this battle or that you want to gain experience. Switching Pokémon could also be a strategic move if you want to train a lower-level Pokémon.
212
+ # - **Cons**: Switching Pokémon might not be necessary since Pikachu is at a significant advantage. It could also waste time and potentially give Pidgey a turn to attack.
213
+
214
+ # 4. **RUN**:
215
+ # - **Pros**: Running away could save time and conserve your Pokémon's health and resources. If you are in a hurry or do not need the experience or items, running away is a safe option.
216
+ # - **Cons**: Running away means you miss out on the experience points and potential items or money that you could gain from defeating the Pidgey. It also means you do not get the chance to capture the Pidgey if you wanted to.
217
+
218
+ # ### Recommendation:
219
+ # Given the significant level advantage, the best action is likely to **FIGHT**. This will allow you to quickly defeat the Pidgey, gain experience points, and potentially earn items or money. If you are concerned about Pikachu's health, you could use an item from your **BAG** to heal it before or during the battle. Running away or switching Pokémon does not seem necessary in this situation.
220
+ ```
221
+ </details>
222
+
223
+ #### Function calling
224
+
225
+ Mistral-Small-3.2-24B-Instruct-2506 is excellent at function / tool calling tasks via vLLM. *E.g.:*
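The server returns tool calls whose `function.arguments` field is a JSON *string*; the client is responsible for parsing it and dispatching to a local implementation. The dispatch step, stripped of the server round-trip, can be sketched as follows (the `get_current_population` stub and its data are made up for illustration and mirror the tool schema used in the snippets below):

```python
import json

def get_current_population(country: str, unit: str) -> str:
    # Stub for illustration only; a real tool would query a live data source.
    fake_data = {"Russia": 144}
    return json.dumps(
        {"country": country, "population": fake_data.get(country), "unit": unit}
    )

# Registry mapping tool names (as declared in the `tools` schema) to callables.
TOOLS = {"get_current_population": get_current_population}

def dispatch(name: str, raw_arguments: str) -> str:
    # `raw_arguments` is the JSON string found in tool_call.function.arguments.
    return TOOLS[name](**json.loads(raw_arguments))

result = dispatch("get_current_population", '{"country": "Russia", "unit": "millions"}')
print(result)
```

The tool result is then appended to `messages` as a `"role": "tool"` entry (with the matching `tool_call_id`), as the examples below show end to end.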
226
+
227
+ <details>
228
+ <summary>Python snippet - easy</summary>
229
+
230
+ ```py
231
+ from openai import OpenAI
232
+ from huggingface_hub import hf_hub_download
233
+
234
+ # Modify OpenAI's API key and API base to use vLLM's API server.
235
+ openai_api_key = "EMPTY"
236
+ openai_api_base = "http://localhost:8000/v1"
237
+
238
+ TEMP = 0.15
239
+ MAX_TOK = 131072
240
+
241
+ client = OpenAI(
242
+ api_key=openai_api_key,
243
+ base_url=openai_api_base,
244
+ )
245
+
246
+ models = client.models.list()
247
+ model = models.data[0].id
248
+
249
+ def load_system_prompt(repo_id: str, filename: str) -> str:
250
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
251
+ with open(file_path, "r") as file:
252
+ system_prompt = file.read()
253
+ return system_prompt
254
+
255
+ model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
256
+ SYSTEM_PROMPT = load_system_prompt(model_id, "SYSTEM_PROMPT.txt")
257
+
258
+ image_url = "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/europe.png"
259
+
260
+ tools = [
261
+ {
262
+ "type": "function",
263
+ "function": {
264
+ "name": "get_current_population",
265
+ "description": "Get the up-to-date population of a given country.",
266
+ "parameters": {
267
+ "type": "object",
268
+ "properties": {
269
+ "country": {
270
+ "type": "string",
271
+ "description": "The country to find the population of.",
272
+ },
273
+ "unit": {
274
+ "type": "string",
275
+ "description": "The unit for the population.",
276
+ "enum": ["millions", "thousands"],
277
+ },
278
+ },
279
+ "required": ["country", "unit"],
280
+ },
281
+ },
282
+ },
283
+ {
284
+ "type": "function",
285
+ "function": {
286
+ "name": "rewrite",
287
+ "description": "Rewrite a given text for improved clarity",
288
+ "parameters": {
289
+ "type": "object",
290
+ "properties": {
291
+ "text": {
292
+ "type": "string",
293
+ "description": "The input text to rewrite",
294
+ }
295
+ },
296
+ },
297
+ },
298
+ },
299
+ ]
300
+
301
+ messages = [
302
+ {"role": "system", "content": SYSTEM_PROMPT},
303
+ {
304
+ "role": "user",
305
+ "content": "Could you please make the below article more concise?\n\nOpenAI is an artificial intelligence research laboratory consisting of the non-profit OpenAI Incorporated and its for-profit subsidiary corporation OpenAI Limited Partnership.",
306
+ },
307
+ {
308
+ "role": "assistant",
309
+ "content": "",
310
+ "tool_calls": [
311
+ {
312
+ "id": "bbc5b7ede",
313
+ "type": "function",
314
+ "function": {
315
+ "name": "rewrite",
316
+ "arguments": '{"text": "OpenAI is an artificial intelligence research laboratory consisting of the non-profit OpenAI Incorporated and its for-profit subsidiary corporation OpenAI Limited Partnership."}',
317
+ },
318
+ }
319
+ ],
320
+ },
321
+ {
322
+ "role": "tool",
323
+ "content": '{"action":"rewrite","outcome":"OpenAI is a FOR-profit company."}',
324
+ "tool_call_id": "bbc5b7ede",
325
+ "name": "rewrite",
326
+ },
327
+ {
328
+ "role": "assistant",
329
+ "content": "---\n\nOpenAI is a FOR-profit company.",
330
+ },
331
+ {
332
+ "role": "user",
333
+ "content": [
334
+ {
335
+ "type": "text",
336
+ "text": "Can you tell me what is the biggest country depicted on the map?",
337
+ },
338
+ {
339
+ "type": "image_url",
340
+ "image_url": {
341
+ "url": image_url,
342
+ },
343
+ },
344
+ ],
345
+ }
346
+ ]
347
+
348
+ response = client.chat.completions.create(
349
+ model=model,
350
+ messages=messages,
351
+ temperature=TEMP,
352
+ max_tokens=MAX_TOK,
353
+ tools=tools,
354
+ tool_choice="auto",
355
+ )
356
+
357
+ assistant_message = response.choices[0].message.content
358
+ print(assistant_message)
359
+ # The biggest country depicted on the map is Russia.
360
+
361
+ messages.extend([
362
+ {"role": "assistant", "content": assistant_message},
363
+ {"role": "user", "content": "What is the population of that country in millions?"},
364
+ ])
365
+
366
+ response = client.chat.completions.create(
367
+ model=model,
368
+ messages=messages,
369
+ temperature=TEMP,
370
+ max_tokens=MAX_TOK,
371
+ tools=tools,
372
+ tool_choice="auto",
373
+ )
374
+
375
+ print(response.choices[0].message.tool_calls)
376
+ # [ChatCompletionMessageToolCall(id='3e92V6Vfo', function=Function(arguments='{"country": "Russia", "unit": "millions"}', name='get_current_population'), type='function')]
377
+ ```
378
+
379
+ </details>
380
+
381
+ <details>
382
+ <summary>Python snippet - complex</summary>
383
+
384
+ ```python
385
+ import json
386
+ from openai import OpenAI
387
+ from huggingface_hub import hf_hub_download
388
+
389
+ # Modify OpenAI's API key and API base to use vLLM's API server.
390
+ openai_api_key = "EMPTY"
391
+ openai_api_base = "http://localhost:8000/v1"
392
+
393
+ TEMP = 0.15
394
+ MAX_TOK = 131072
395
+
396
+ client = OpenAI(
397
+ api_key=openai_api_key,
398
+ base_url=openai_api_base,
399
+ )
400
+
401
+ models = client.models.list()
402
+ model = models.data[0].id
403
+
404
+
405
+ def load_system_prompt(repo_id: str, filename: str) -> str:
406
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
407
+ with open(file_path, "r") as file:
408
+ system_prompt = file.read()
409
+ return system_prompt
410
+
411
+
412
+ model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
413
+ SYSTEM_PROMPT = load_system_prompt(model_id, "SYSTEM_PROMPT.txt")
414
+
415
+ image_url = "https://math-coaching.com/img/fiche/46/expressions-mathematiques.jpg"
416
+
417
+
418
+ def my_calculator(expression: str) -> str:
419
+ return str(eval(expression))
420
+
421
+
422
+ tools = [
423
+ {
424
+ "type": "function",
425
+ "function": {
426
+ "name": "my_calculator",
427
+ "description": "A calculator that can evaluate a mathematical expression.",
428
+ "parameters": {
429
+ "type": "object",
430
+ "properties": {
431
+ "expression": {
432
+ "type": "string",
433
+ "description": "The mathematical expression to evaluate.",
434
+ },
435
+ },
436
+ "required": ["expression"],
437
+ },
438
+ },
439
+ },
440
+ {
441
+ "type": "function",
442
+ "function": {
443
+ "name": "rewrite",
444
+ "description": "Rewrite a given text for improved clarity",
445
+ "parameters": {
446
+ "type": "object",
447
+ "properties": {
448
+ "text": {
449
+ "type": "string",
450
+ "description": "The input text to rewrite",
451
+ }
452
+ },
453
+ },
454
+ },
455
+ },
456
+ ]
457
+
458
+ messages = [
459
+ {"role": "system", "content": SYSTEM_PROMPT},
460
+ {
461
+ "role": "user",
462
+ "content": [
463
+ {
464
+ "type": "text",
465
+ "text": "Can you calculate the results for all the equations displayed in the image? Only compute the ones that involve numbers.",
466
+ },
467
+ {
468
+ "type": "image_url",
469
+ "image_url": {
470
+ "url": image_url,
471
+ },
472
+ },
473
+ ],
474
+ },
475
+ ]
476
+
477
+ response = client.chat.completions.create(
478
+ model=model,
479
+ messages=messages,
480
+ temperature=TEMP,
481
+ max_tokens=MAX_TOK,
482
+ tools=tools,
483
+ tool_choice="auto",
484
+ )
485
+
486
+ tool_calls = response.choices[0].message.tool_calls
487
+ print(tool_calls)
488
+ # [ChatCompletionMessageToolCall(id='CyQBSAtGh', function=Function(arguments='{"expression": "6 + 2 * 3"}', name='my_calculator'), type='function'), ChatCompletionMessageToolCall(id='KQqRCqvzc', function=Function(arguments='{"expression": "19 - (8 + 2) + 1"}', name='my_calculator'), type='function')]
489
+
490
+ results = []
491
+ for tool_call in tool_calls:
492
+ function_name = tool_call.function.name
493
+ function_args = tool_call.function.arguments
494
+ if function_name == "my_calculator":
495
+ result = my_calculator(**json.loads(function_args))
496
+ results.append(result)
497
+
498
+ messages.append({"role": "assistant", "tool_calls": tool_calls})
499
+ for tool_call, result in zip(tool_calls, results):
500
+ messages.append(
501
+ {
502
+ "role": "tool",
503
+ "tool_call_id": tool_call.id,
504
+ "name": tool_call.function.name,
505
+ "content": result,
506
+ }
507
+ )
508
+
509
+
510
+ response = client.chat.completions.create(
511
+ model=model,
512
+ messages=messages,
513
+ temperature=TEMP,
514
+ max_tokens=MAX_TOK,
515
+ )
516
+
517
+ print(response.choices[0].message.content)
518
+ # Here are the results for the equations that involve numbers:
519
+
520
+ # 1. \( 6 + 2 \times 3 = 12 \)
521
+ # 3. \( 19 - (8 + 2) + 1 = 10 \)
522
+
523
+ # For the other equations, you need to substitute the variables with specific values to compute the results.
524
+ ```
525
+
526
+ </details>
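A caution on the example above: `my_calculator` uses `eval`, which will execute arbitrary Python if the model ever emits something unexpected in the tool arguments. For anything beyond a demo, an AST-based evaluator that only admits arithmetic is safer. A minimal sketch (not part of the model card's API):

```python
import ast
import operator

# Arithmetic-only evaluator: a safer stand-in for the eval() call in the
# my_calculator demo. Anything other than numbers and + - * / is rejected.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def safe_eval(expression: str):
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"disallowed expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval"))

print(safe_eval("6 + 2 * 3"))         # 12
print(safe_eval("19 - (8 + 2) + 1"))  # 10
```

Names, attribute access, and function calls never match any of the allowed node types, so an input like `__import__('os')` raises `ValueError` instead of executing.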
527
+
528
+ #### Instruction following
529
+
530
+ Mistral-Small-3.2-24B-Instruct-2506 will follow your instructions down to the last letter!
531
+
532
+ <details>
533
+ <summary>Python snippet</summary>
534
+
535
+ ```python
536
+ from openai import OpenAI
537
+ from huggingface_hub import hf_hub_download
538
+
539
+ # Modify OpenAI's API key and API base to use vLLM's API server.
540
+ openai_api_key = "EMPTY"
541
+ openai_api_base = "http://localhost:8000/v1"
542
+
543
+ TEMP = 0.15
544
+ MAX_TOK = 131072
545
+
546
+ client = OpenAI(
547
+ api_key=openai_api_key,
548
+ base_url=openai_api_base,
549
+ )
550
+
551
+ models = client.models.list()
552
+ model = models.data[0].id
553
+
554
+
555
+ def load_system_prompt(repo_id: str, filename: str) -> str:
556
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
557
+ with open(file_path, "r") as file:
558
+ system_prompt = file.read()
559
+ return system_prompt
560
+
561
+
562
+ model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
563
+ SYSTEM_PROMPT = load_system_prompt(model_id, "SYSTEM_PROMPT.txt")
564
+
565
+ messages = [
566
+ {"role": "system", "content": SYSTEM_PROMPT},
567
+ {
568
+ "role": "user",
569
+ "content": "Write me a sentence where every word starts with the next letter in the alphabet - start with 'a' and end with 'z'.",
570
+ },
571
+ ]
572
+
573
+ response = client.chat.completions.create(
574
+ model=model,
575
+ messages=messages,
576
+ temperature=TEMP,
577
+ max_tokens=MAX_TOK,
578
+ )
579
+
580
+ assistant_message = response.choices[0].message.content
581
+ print(assistant_message)
582
+
583
+ # Here's a sentence where each word starts with the next letter of the alphabet, starting from 'a' and ending with 'z':
584
+
585
+ # "Always brave cats dance elegantly, fluffy giraffes happily ignore jungle kites, lovingly munching nuts, observing playful quails racing swiftly, tiny unicorns vaulting while xylophones yodel zealously."
586
+
587
+ # This sentence follows the sequence from A to Z without skipping any letters.
588
+ ```
589
+ </details>
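Outputs like the alphabet sentence above are easy to verify mechanically, which is handy when benchmarking instruction following yourself. A small checker (a sketch, not part of any library):

```python
import string

def is_alphabet_sentence(sentence: str) -> bool:
    # One word per letter: the first letters must spell a..z in order.
    words = [w.strip(".,;:!?\"'") for w in sentence.split()]
    first_letters = [w[0].lower() for w in words if w]
    return first_letters == list(string.ascii_lowercase)

sentence = (
    "Always brave cats dance elegantly, fluffy giraffes happily ignore jungle "
    "kites, lovingly munching nuts, observing playful quails racing swiftly, "
    "tiny unicorns vaulting while xylophones yodel zealously."
)
print(is_alphabet_sentence(sentence))  # True
```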
590
+
591
+ ### Transformers
592
+
593
+ You can also use Mistral-Small-3.2-24B-Instruct-2506 with `transformers`!
594
+
595
+ To make the best use of our model with `transformers`, make sure to [install](https://github.com/mistralai/mistral-common) `mistral-common >= 1.6.2` to use our tokenizer.
596
+
597
+ ```bash
598
+ pip install mistral-common --upgrade
599
+ ```
600
+
601
+ Then load our tokenizer along with the model and generate:
602
+
603
+ <details>
604
+ <summary>Python snippet</summary>
605
+
606
+ ```python
607
+ from datetime import datetime, timedelta
608
+ import torch
609
+
610
+ from mistral_common.protocol.instruct.request import ChatCompletionRequest
611
+ from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
612
+ from huggingface_hub import hf_hub_download
613
+ from transformers import Mistral3ForConditionalGeneration
614
+
615
+
616
+ def load_system_prompt(repo_id: str, filename: str) -> str:
617
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
618
+ with open(file_path, "r") as file:
619
+ system_prompt = file.read()
620
+ today = datetime.today().strftime("%Y-%m-%d")
621
+ yesterday = (datetime.today() - timedelta(days=1)).strftime("%Y-%m-%d")
622
+ model_name = repo_id.split("/")[-1]
623
+ return system_prompt.format(name=model_name, today=today, yesterday=yesterday)
624
+
625
+
626
+ model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
627
+ SYSTEM_PROMPT = load_system_prompt(model_id, "SYSTEM_PROMPT.txt")
628
+
629
+ tokenizer = MistralTokenizer.from_hf_hub(model_id)
630
+
631
+ model = Mistral3ForConditionalGeneration.from_pretrained(
632
+ model_id, torch_dtype=torch.bfloat16
633
+ )
634
+
635
+ image_url = "https://static.wikia.nocookie.net/essentialsdocs/images/7/70/Battle.png/revision/latest?cb=20220523172438"
636
+
637
+ messages = [
638
+ {"role": "system", "content": SYSTEM_PROMPT},
639
+ {
640
+ "role": "user",
641
+ "content": [
642
+ {
643
+ "type": "text",
644
+ "text": "What action do you think I should take in this situation? List all the possible actions and explain why you think they are good or bad.",
645
+ },
646
+ {"type": "image_url", "image_url": {"url": image_url}},
647
+ ],
648
+ },
649
+ ]
650
+
651
+ tokenized = tokenizer.encode_chat_completion(ChatCompletionRequest(messages=messages))
652
+
653
+ input_ids = torch.tensor([tokenized.tokens])
654
+ attention_mask = torch.ones_like(input_ids)
655
+ pixel_values = torch.tensor(tokenized.images[0], dtype=torch.bfloat16).unsqueeze(0)
656
+ image_sizes = torch.tensor([pixel_values.shape[-2:]])
657
+
658
+ output = model.generate(
659
+ input_ids=input_ids,
660
+ attention_mask=attention_mask,
661
+ pixel_values=pixel_values,
662
+ image_sizes=image_sizes,
663
+ max_new_tokens=1000,
664
+ )[0]
665
+
666
+ decoded_output = tokenizer.decode(output[len(tokenized.tokens) :])
667
+ print(decoded_output)
668
+ # In this situation, you are playing a Pokémon game where your Pikachu (Level 42) is facing a wild Pidgey (Level 17). Here are the possible actions you can take and an analysis of each:
669
+
670
+ # 1. **FIGHT**:
671
+ # - **Pros**: Pikachu is significantly higher level than the wild Pidgey, which suggests that it should be able to defeat Pidgey easily. This could be a good opportunity to gain experience points and possibly items or money.
672
+ # - **Cons**: There is always a small risk of Pikachu fainting, especially if Pidgey has a powerful move or a status effect that could hinder Pikachu. However, given the large level difference, this risk is minimal.
673
+
674
+ # 2. **BAG**:
675
+ # - **Pros**: You might have items in your bag that could help in this battle, such as Potions, Poké Balls, or Berries. Using an item could help you capture Pidgey or heal Pikachu if needed.
676
+ # - **Cons**: Using items might not be necessary given the level difference. It could be more efficient to just fight and defeat Pidgey quickly.
677
+
678
+ # 3. **POKÉMON**:
679
+ # - **Pros**: You might have another Pokémon in your party that is better suited for this battle or that you want to gain experience. Switching Pokémon could also be strategic if you want to train a lower-level Pokémon.
680
+ # - **Cons**: Switching Pokémon might not be necessary since Pikachu is at a significant advantage. It could also waste time and potentially give Pidgey a turn to attack.
681
+
682
+ # 4. **RUN**:
683
+ # - **Pros**: Running away could be a quick way to avoid the battle altogether. This might be useful if you are trying to conserve resources or if you are in a hurry to get to another location.
684
+ # - **Cons**: Running away means you miss out on the experience points, items, or money that you could gain from defeating Pidgey. It also might not be the most efficient use of your time if you are trying to train your Pokémon.
685
+
686
+ # ### Recommendation:
687
+ # Given the significant level advantage, the best action to take is likely **FIGHT**. This will allow you to quickly defeat Pidgey and gain experience points for Pikachu. If you are concerned about Pikachu's health, you could use the **BAG** to heal Pikachu before or during the battle. Running away or switching Pokémon does not seem necessary in this situation.
688
+ ```
689
+
690
+ </details>
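The `SYSTEM_PROMPT.txt` file shipped with the model (reproduced below) carries `{name}`, `{today}` and `{yesterday}` placeholders; the `load_system_prompt` helper in the snippets above fills them with `str.format`. The substitution step on its own looks like this (a fixed date is used here for reproducibility, whereas `load_system_prompt` uses `datetime.today()`):

```python
from datetime import datetime, timedelta

# Minimal stand-in for the real prompt file, with the same placeholders.
template = "You are {name}. The current date is {today}. Yesterday was {yesterday}."
today = datetime(2025, 6, 20)  # fixed for reproducibility
filled = template.format(
    name="Mistral-Small-3.2-24B-Instruct-2506",
    today=today.strftime("%Y-%m-%d"),
    yesterday=(today - timedelta(days=1)).strftime("%Y-%m-%d"),
)
print(filled)
# You are Mistral-Small-3.2-24B-Instruct-2506. The current date is 2025-06-20. Yesterday was 2025-06-19.
```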
SYSTEM_PROMPT.txt ADDED
@@ -0,0 +1,29 @@
1
+ You are {name}, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.
2
+ You power an AI assistant called Le Chat.
3
+ Your knowledge base was last updated on 2023-10-01.
4
+ The current date is {today}.
5
+
6
+ When you're not sure about some information or when the user's request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, then clearly state that you don't have the information and avoid making up anything.
7
+ If the user's question is not clear, ambiguous, or does not provide enough context for you to accurately answer the question, you do not try to answer it right away and you rather ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo" => "Where do you travel from?").
8
+ You are always very attentive to dates, in particular you try to resolve dates (e.g. "yesterday" is {yesterday}) and when asked about information at specific dates, you discard information that is at another date.
9
+ You follow these instructions in all languages, and always respond to the user in the language they use or request.
10
+ Next sections describe the capabilities that you have.
11
+
12
+ # WEB BROWSING INSTRUCTIONS
13
+
14
+ You cannot perform any web search or access internet to open URLs, links etc. If it seems like the user is expecting you to do so, you clarify the situation and ask the user to copy paste the text directly in the chat.
15
+
16
+ # MULTI-MODAL INSTRUCTIONS
17
+
18
+ You have the ability to read images, but you cannot generate images. You also cannot transcribe audio files or videos.
19
+ You cannot read nor transcribe audio files or videos.
20
+
21
+ TOOL CALLING INSTRUCTIONS
22
+
23
+ You may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:
24
+
25
+ 1. When the request requires up-to-date information.
26
+ 2. When the request requires specific data that you do not have in your knowledge base.
27
+ 3. When the request involves actions that you cannot perform without tools.
28
+
29
+ Always prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment.
chat_template.jinja ADDED
@@ -0,0 +1,176 @@
1
+ {%- set today_date = strftime_now("%Y-%m-%d") %}
2
+ {%- set yesterday_day = strftime_now("%d") %}
3
+ {%- set yesterday_month = strftime_now("%m") %}
4
+ {%- set yesterday_year = strftime_now("%Y") %}
5
+ {%- if yesterday_day == '01' %}
6
+ {#- Jinja doesnt allow minus 1 date - Unsloth alternative #}
7
+ {%- if yesterday_month == '01' %}
8
+ {%- set yesterday_day = '31' %}
9
+ {%- set yesterday_month = '12' %}
10
+ {%- if yesterday_year == '2024' %}
11
+ {%- set yesterday_year = '2023' %}
12
+ {%- elif yesterday_year == '2025' %}
13
+ {%- set yesterday_year = '2024' %}
14
+ {%- elif yesterday_year == '2026' %}
15
+ {%- set yesterday_year = '2025' %}
16
+ {%- elif yesterday_year == '2027' %}
17
+ {%- set yesterday_year = '2026' %}
18
+ {%- elif yesterday_year == '2028' %}
19
+ {%- set yesterday_year = '2027' %}
20
+ {%- else %}
21
+ {{- raise_exception('Unsloth custom template does not support years > 2028') }}
22
+ {%- endif %}
23
+ {%- elif yesterday_month == '02' %}
24
+ {%- set yesterday_day = '31' %}
25
+ {%- set yesterday_month = '01' %}
26
+ {%- elif yesterday_month == '03' %}
27
+ {%- set yesterday_month = '02' %}
28
+ {%- set yesterday_day = '28' %}
29
+ {%- if yesterday_year == '2024' %}
30
+ {%- set yesterday_day = '29' %}
31
+ {%- elif yesterday_year == '2028' %}
32
+ {%- set yesterday_day = '29' %}
33
+ {%- elif yesterday_year not in ['2023', '2025', '2026', '2027'] %}
34
+ {{- raise_exception('Unsloth custom template does not support years > 2028') }}
35
+ {%- endif %}
36
+ {%- elif yesterday_month == '04' %}
37
+ {%- set yesterday_day = '31' %}
38
+ {%- set yesterday_month = '03' %}
39
+ {%- elif yesterday_month == '05' %}
40
+ {%- set yesterday_day = '30' %}
41
+ {%- set yesterday_month = '04' %}
42
+ {%- elif yesterday_month == '06' %}
43
+ {%- set yesterday_day = '31' %}
44
+ {%- set yesterday_month = '05' %}
45
+ {%- elif yesterday_month == '07' %}
46
+ {%- set yesterday_day = '30' %}
47
+ {%- set yesterday_month = '06' %}
48
+ {%- elif yesterday_month == '08' %}
49
+ {%- set yesterday_day = '31' %}
50
+ {%- set yesterday_month = '07' %}
51
+ {%- elif yesterday_month == '09' %}
52
+ {%- set yesterday_day = '31' %}
53
+ {%- set yesterday_month = '08' %}
54
+ {%- elif yesterday_month == '10' %}
55
+ {%- set yesterday_day = '30' %}
56
+ {%- set yesterday_month = '09' %}
57
+ {%- elif yesterday_month == '11' %}
58
+ {%- set yesterday_day = '31' %}
59
+ {%- set yesterday_month = '10' %}
60
+ {%- elif yesterday_month == '12' %}
61
+ {%- set yesterday_day = '30' %}
62
+ {%- set yesterday_month = '11' %}
63
+ {%- endif %}
64
+ {%- elif yesterday_day == '02' %}
65
+ {%- set yesterday_day = '01' %}
66
+ {%- elif yesterday_day == '03' %}
67
+ {%- set yesterday_day = '02' %}
68
+ {%- elif yesterday_day == '04' %}
69
+ {%- set yesterday_day = '03' %}
70
+ {%- elif yesterday_day == '05' %}
71
+ {%- set yesterday_day = '04' %}
72
+ {%- elif yesterday_day == '06' %}
73
+ {%- set yesterday_day = '05' %}
74
+ {%- elif yesterday_day == '07' %}
75
+ {%- set yesterday_day = '06' %}
76
+ {%- elif yesterday_day == '08' %}
77
+ {%- set yesterday_day = '07' %}
78
+ {%- elif yesterday_day == '09' %}
79
+ {%- set yesterday_day = '08' %}
80
+ {%- elif yesterday_day == '10' %}
81
+ {%- set yesterday_day = '09' %}
82
+ {%- elif yesterday_day == '11' %}
83
+ {%- set yesterday_day = '10' %}
84
+ {%- elif yesterday_day == '12' %}
85
+ {%- set yesterday_day = '11' %}
86
+ {%- elif yesterday_day == '13' %}
87
+ {%- set yesterday_day = '12' %}
88
+ {%- elif yesterday_day == '14' %}
89
+ {%- set yesterday_day = '13' %}
90
+ {%- elif yesterday_day == '15' %}
91
+ {%- set yesterday_day = '14' %}
92
+ {%- elif yesterday_day == '16' %}
93
+ {%- set yesterday_day = '15' %}
94
+ {%- elif yesterday_day == '17' %}
95
+ {%- set yesterday_day = '16' %}
96
+ {%- elif yesterday_day == '18' %}
97
+ {%- set yesterday_day = '17' %}
98
+ {%- elif yesterday_day == '19' %}
99
+ {%- set yesterday_day = '18' %}
100
+ {%- elif yesterday_day == '20' %}
101
+ {%- set yesterday_day = '19' %}
102
+ {%- elif yesterday_day == '21' %}
103
+ {%- set yesterday_day = '20' %}
104
+ {%- elif yesterday_day == '22' %}
105
+ {%- set yesterday_day = '21' %}
106
+ {%- elif yesterday_day == '23' %}
107
+ {%- set yesterday_day = '22' %}
108
+ {%- elif yesterday_day == '24' %}
109
+ {%- set yesterday_day = '23' %}
110
+ {%- elif yesterday_day == '25' %}
111
+ {%- set yesterday_day = '24' %}
112
+ {%- elif yesterday_day == '26' %}
113
+ {%- set yesterday_day = '25' %}
114
+ {%- elif yesterday_day == '27' %}
115
+ {%- set yesterday_day = '26' %}
116
+ {%- elif yesterday_day == '28' %}
117
+ {%- set yesterday_day = '27' %}
118
+ {%- elif yesterday_day == '29' %}
119
+ {%- set yesterday_day = '28' %}
120
+ {%- elif yesterday_day == '30' %}
121
+ {%- set yesterday_day = '29' %}
122
+ {%- elif yesterday_day == '31' %}
123
+ {%- set yesterday_day = '30' %}
124
+ {%- endif %}
125
+ {#- Edits made by Unsloth #}
126
+ {%- set yesterday_date = yesterday_year + '-' + yesterday_month + '-' + yesterday_day %}
127
+ {%- set default_system_message = "You are Mistral-Small-3.2-24B-Instruct-2506, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.\nYou power an AI assistant called Le Chat.\nYour knowledge base was last updated on 2023-10-01.\nThe current date is " + today_date + ".\n\nWhen you\'re not sure about some information or when the user\'s request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, then clearly state that you don\'t have the information and avoid making up anything.\nIf the user\'s question is not clear, ambiguous, or does not provide enough context for you to accurately answer the question, you do not try to answer it right away and you rather ask the user to clarify their request (e.g. \"What are some good restaurants around me?\" => \"Where are you?\" or \"When is the next flight to Tokyo\" => \"Where do you travel from?\").\nYou are always very attentive to dates, in particular you try to resolve dates (e.g. \"yesterday\" is " + yesterday_date + ") and when asked about information at specific dates, you discard information that is at another date.\nYou follow these instructions in all languages, and always respond to the user in the language they use or request.\nNext sections describe the capabilities that you have.\n\n# WEB BROWSING INSTRUCTIONS\n\nYou cannot perform any web search or access internet to open URLs, links etc. If it seems like the user is expecting you to do so, you clarify the situation and ask the user to copy paste the text directly in the chat.\n\n# MULTI-MODAL INSTRUCTIONS\n\nYou have the ability to read images, but you cannot generate images. You also cannot read or transcribe audio files or videos.\n\n# TOOL CALLING INSTRUCTIONS\n\nYou may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:\n\n1. When the request requires up-to-date information.\n2. When the request requires specific data that you do not have in your knowledge base.\n3. When the request involves actions that you cannot perform without tools.\n\nAlways prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment." %}
128
+
129
+ {{- bos_token }}
130
+
131
+ {%- if messages[0]['role'] == 'system' %}
132
+ {%- if messages[0]['content'] is string %}
133
+ {%- set system_message = messages[0]['content'] %}
134
+ {%- else %}
135
+ {%- set system_message = messages[0]['content'][0]['text'] %}
136
+ {%- endif %}
137
+ {%- set loop_messages = messages[1:] %}
138
+ {%- else %}
139
+ {%- set system_message = default_system_message %}
140
+ {%- set loop_messages = messages %}
141
+ {%- endif %}
142
+ {{- '[SYSTEM_PROMPT]' + system_message + '[/SYSTEM_PROMPT]' }}
143
+
144
+ {%- for message in loop_messages %}
145
+ {%- if message['role'] == 'user' %}
146
+ {%- if message['content'] is string %}
147
+ {{- '[INST]' + message['content'] + '[/INST]' }}
148
+ {%- else %}
149
+ {{- '[INST]' }}
150
+ {%- for block in message['content'] %}
151
+ {%- if block['type'] == 'text' %}
152
+ {{- block['text'] }}
153
+ {%- elif block['type'] in ['image', 'image_url'] %}
154
+ {{- '[IMG]' }}
155
+ {%- else %}
156
+ {{- raise_exception('Only text and image blocks are supported in message content!') }}
157
+ {%- endif %}
158
+ {%- endfor %}
159
+ {{- '[/INST]' }}
160
+ {%- endif %}
161
+ {%- elif message['role'] == 'system' %}
162
+ {%- if message['content'] is string %}
163
+ {{- '[SYSTEM_PROMPT]' + message['content'] + '[/SYSTEM_PROMPT]' }}
164
+ {%- else %}
165
+ {{- '[SYSTEM_PROMPT]' + message['content'][0]['text'] + '[/SYSTEM_PROMPT]' }}
166
+ {%- endif %}
167
+ {%- elif message['role'] == 'assistant' %}
168
+ {%- if message['content'] is string %}
169
+ {{- message['content'] + eos_token }}
170
+ {%- else %}
171
+ {{- message['content'][0]['text'] + eos_token }}
172
+ {%- endif %}
173
+ {%- else %}
174
+ {{- raise_exception('Only user, system and assistant roles are supported!') }}
175
+ {%- endif %}
176
+ {%- endfor %}
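The long elif chain at the top of this template exists because Jinja's `strftime_now` cannot do date arithmetic. As a sketch of what that chain approximates (the function name is mine, not part of the template), the same "yesterday" string is a single subtraction in plain Python:

```python
from datetime import date, timedelta

def yesterday_str(today: date) -> str:
    # One subtraction replaces the template's ~120 lines of elif branches.
    return (today - timedelta(days=1)).strftime("%Y-%m-%d")

# Month, year, and leap-year rollovers all fall out of timedelta:
print(yesterday_str(date(2025, 3, 1)))  # 2025-02-28
print(yesterday_str(date(2024, 3, 1)))  # 2024-02-29 (leap year)
print(yesterday_str(date(2025, 1, 1)))  # 2024-12-31
```

This is why the template hard-codes leap years (2024, 2028) and raises for unsupported years: without real date arithmetic, every rollover case has to be enumerated by hand.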
config.json ADDED
@@ -0,0 +1,63 @@
1
+ {
2
+ "architectures": [
3
+ "Mistral3ForConditionalGeneration"
4
+ ],
5
+ "bos_token_id": 1,
6
+ "eos_token_id": 2,
7
+ "image_token_index": 10,
8
+ "model_type": "mistral3",
9
+ "multimodal_projector_bias": false,
10
+ "pad_token_id": 11,
11
+ "projector_hidden_act": "gelu",
12
+ "spatial_merge_size": 2,
13
+ "text_config": {
14
+ "attention_dropout": 0.0,
15
+ "head_dim": 128,
16
+ "hidden_act": "silu",
17
+ "hidden_size": 5120,
18
+ "initializer_range": 0.02,
19
+ "intermediate_size": 32768,
20
+ "max_position_embeddings": 131072,
21
+ "model_type": "mistral",
22
+ "num_attention_heads": 32,
23
+ "num_hidden_layers": 40,
24
+ "num_key_value_heads": 8,
25
+ "rms_norm_eps": 1e-05,
26
+ "rope_theta": 1000000000.0,
27
+ "sliding_window": null,
28
+ "torch_dtype": "bfloat16",
29
+ "use_cache": true,
30
+ "vocab_size": 131072
31
+ },
32
+ "torch_dtype": "bfloat16",
33
+ "transformers_version": "4.52.4",
34
+ "unsloth_fixed": true,
35
+ "vision_config": {
36
+ "attention_dropout": 0.0,
37
+ "head_dim": 64,
38
+ "hidden_act": "silu",
39
+ "hidden_size": 1024,
40
+ "image_size": 1540,
41
+ "initializer_range": 0.02,
42
+ "intermediate_size": 4096,
43
+ "model_type": "pixtral",
44
+ "num_attention_heads": 16,
45
+ "num_channels": 3,
46
+ "num_hidden_layers": 24,
47
+ "patch_size": 14,
48
+ "rope_theta": 10000.0,
49
+ "torch_dtype": "bfloat16"
50
+ },
51
+ "vision_feature_layer": -1,
52
+ "quantization_config": {
53
+ "quant_method": "exl3",
54
+ "version": "0.0.4",
55
+ "bits": 3.5,
56
+ "head_bits": 6,
57
+ "calibration": {
58
+ "rows": 100,
59
+ "cols": 2048
60
+ },
61
+ "out_scales": "auto"
62
+ }
63
+ }
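A quick sanity check one can run against the `text_config` numbers above (the variable names are mine, copied from the JSON keys; this is a back-of-envelope sketch, not anything stored in the config): with 8 KV heads of dimension 128 over 40 layers in bfloat16, the grouped-query attention cache costs 163,840 bytes per token, i.e. exactly 20 GiB at the full 131,072-token context.

```python
# Values copied from text_config above.
num_attention_heads = 32
num_key_value_heads = 8
head_dim = 128
num_hidden_layers = 40
max_position_embeddings = 131072
bytes_per_value = 2  # bfloat16

# Grouped-query attention: each KV head serves 4 query heads.
gqa_group_size = num_attention_heads // num_key_value_heads

# K and V tensors, per layer, per token.
kv_bytes_per_token = (
    num_hidden_layers * 2 * num_key_value_heads * head_dim * bytes_per_value
)

print(gqa_group_size)                                    # 4
print(kv_bytes_per_token)                                # 163840
print(kv_bytes_per_token * max_position_embeddings / 2**30)  # 20.0 GiB
```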
generation_config.json ADDED
@@ -0,0 +1,7 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 1,
4
+ "eos_token_id": 2,
5
+ "pad_token_id": 11,
6
+ "transformers_version": "4.52.4"
7
+ }
model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1755d1b4ba8ff1c99cef3d160b2deb798084df78b0b86cb23aa36b3f96992d28
3
+ size 8390451400
model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3fae851beacc0e470565bbaf4b57bbd906b7ef2ca46357171f4b167cb245efa3
3
+ size 4065401040
model.safetensors.index.json ADDED
The diff for this file is too large to render. See raw diff
 
params.json ADDED
@@ -0,0 +1,29 @@
1
+ {
2
+ "dim": 5120,
3
+ "n_layers": 40,
4
+ "head_dim": 128,
5
+ "hidden_dim": 32768,
6
+ "n_heads": 32,
7
+ "n_kv_heads": 8,
8
+ "rope_theta": 1000000000.0,
9
+ "norm_eps": 1e-05,
10
+ "vocab_size": 131072,
11
+ "vision_encoder": {
12
+ "hidden_size": 1024,
13
+ "num_channels": 3,
14
+ "max_image_size": 1540,
15
+ "patch_size": 14,
16
+ "rope_theta": 10000.0,
17
+ "intermediate_size": 4096,
18
+ "num_hidden_layers": 24,
19
+ "num_attention_heads": 16,
20
+ "adapter_bias": false,
21
+ "mm_projector_id": "patch_merge",
22
+ "spatial_merge_size": 2,
23
+ "add_pre_mm_projector_layer_norm": true,
24
+ "image_token_id": 10,
25
+ "image_break_token_id": 12,
26
+ "image_end_token_id": 13,
27
+ "image_size": 1540
28
+ }
29
+ }
preprocessor_config.json ADDED
@@ -0,0 +1,31 @@
1
+ {
2
+ "crop_size": null,
3
+ "data_format": "channels_first",
4
+ "default_to_square": true,
5
+ "device": null,
6
+ "do_center_crop": null,
7
+ "do_convert_rgb": true,
8
+ "do_normalize": true,
9
+ "do_rescale": true,
10
+ "do_resize": true,
11
+ "image_mean": [
12
+ 0.48145466,
13
+ 0.4578275,
14
+ 0.40821073
15
+ ],
16
+ "image_processor_type": "PixtralImageProcessorFast",
17
+ "image_std": [
18
+ 0.26862954,
19
+ 0.26130258,
20
+ 0.27577711
21
+ ],
22
+ "input_data_format": null,
23
+ "patch_size": 14,
24
+ "processor_class": "PixtralProcessor",
25
+ "resample": 3,
26
+ "rescale_factor": 0.00392156862745098,
27
+ "return_tensors": null,
28
+ "size": {
29
+ "longest_edge": 1540
30
+ }
31
+ }
processor_config.json ADDED
@@ -0,0 +1,8 @@
1
+ {
2
+ "image_break_token": "[IMG_BREAK]",
3
+ "image_end_token": "[IMG_END]",
4
+ "image_token": "[IMG]",
5
+ "patch_size": 14,
6
+ "processor_class": "PixtralProcessor",
7
+ "spatial_merge_size": 2
8
+ }
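From the `longest_edge`, `patch_size`, and `spatial_merge_size` values in the two processor configs above, one can sketch the worst-case image token budget (my arithmetic, not a field in either file; it ignores the extra `[IMG_BREAK]`/`[IMG_END]` marker tokens): a 1540-pixel edge yields 110 patches per side, merged 2x2 down to 55, for at most 3,025 `[IMG]` tokens per image.

```python
# Values copied from preprocessor_config.json / processor_config.json above.
longest_edge = 1540
patch_size = 14
spatial_merge_size = 2

patches_per_side = longest_edge // patch_size               # 110
merged_per_side = patches_per_side // spatial_merge_size    # 55
max_image_tokens = merged_per_side ** 2                     # 3025

print(patches_per_side, merged_per_side, max_image_tokens)
```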
quantization_config.json ADDED
The diff for this file is too large to render. See raw diff
 
special_tokens_map.json ADDED
@@ -0,0 +1,1032 @@
1
+ {
2
+ "additional_special_tokens": [
3
+ "<unk>",
4
+ "<s>",
5
+ "</s>",
6
+ "[INST]",
7
+ "[/INST]",
8
+ "[AVAILABLE_TOOLS]",
9
+ "[/AVAILABLE_TOOLS]",
10
+ "[TOOL_RESULTS]",
11
+ "[/TOOL_RESULTS]",
12
+ "[TOOL_CALLS]",
13
+ "[IMG]",
14
+ "<pad>",
15
+ "[IMG_BREAK]",
16
+ "[IMG_END]",
17
+ "[PREFIX]",
18
+ "[MIDDLE]",
19
+ "[SUFFIX]",
20
+ "[SYSTEM_PROMPT]",
21
+ "[/SYSTEM_PROMPT]",
22
+ "[TOOL_CONTENT]",
23
+ "<SPECIAL_20>",
24
+ "<SPECIAL_21>",
25
+ "<SPECIAL_22>",
26
+ "<SPECIAL_23>",
27
+ "<SPECIAL_24>",
28
+ "<SPECIAL_25>",
29
+ "<SPECIAL_26>",
30
+ "<SPECIAL_27>",
31
+ "<SPECIAL_28>",
32
+ "<SPECIAL_29>",
33
+ "<SPECIAL_30>",
34
+ "<SPECIAL_31>",
35
+ "[ARGS]",
36
+ "[CALL_ID]",
37
+ "<SPECIAL_34>",
38
+ "<SPECIAL_35>",
39
+ "<SPECIAL_36>",
40
+ "<SPECIAL_37>",
41
+ "<SPECIAL_38>",
42
+ "<SPECIAL_39>",
43
+ "<SPECIAL_40>",
44
+ "<SPECIAL_41>",
45
+ "<SPECIAL_42>",
46
+ "<SPECIAL_43>",
47
+ "<SPECIAL_44>",
48
+ "<SPECIAL_45>",
49
+ "<SPECIAL_46>",
50
+ "<SPECIAL_47>",
51
+ "<SPECIAL_48>",
52
+ "<SPECIAL_49>",
53
+ "<SPECIAL_50>",
54
+ "<SPECIAL_51>",
55
+ "<SPECIAL_52>",
56
+ "<SPECIAL_53>",
57
+ "<SPECIAL_54>",
58
+ "<SPECIAL_55>",
59
+ "<SPECIAL_56>",
60
+ "<SPECIAL_57>",
61
+ "<SPECIAL_58>",
62
+ "<SPECIAL_59>",
63
+ "<SPECIAL_60>",
64
+ "<SPECIAL_61>",
65
+ "<SPECIAL_62>",
66
+ "<SPECIAL_63>",
67
+ "<SPECIAL_64>",
68
+ "<SPECIAL_65>",
69
+ "<SPECIAL_66>",
70
+ "<SPECIAL_67>",
71
+ "<SPECIAL_68>",
72
+ "<SPECIAL_69>",
73
+ "<SPECIAL_70>",
74
+ "<SPECIAL_71>",
75
+ "<SPECIAL_72>",
76
+ "<SPECIAL_73>",
77
+ "<SPECIAL_74>",
78
+ "<SPECIAL_75>",
79
+ "<SPECIAL_76>",
80
+ "<SPECIAL_77>",
81
+ "<SPECIAL_78>",
82
+ "<SPECIAL_79>",
83
+ "<SPECIAL_80>",
84
+ "<SPECIAL_81>",
85
+ "<SPECIAL_82>",
86
+ "<SPECIAL_83>",
87
+ "<SPECIAL_84>",
88
+ "<SPECIAL_85>",
89
+ "<SPECIAL_86>",
90
+ "<SPECIAL_87>",
91
+ "<SPECIAL_88>",
92
+ "<SPECIAL_89>",
93
+ "<SPECIAL_90>",
94
+ "<SPECIAL_91>",
95
+ "<SPECIAL_92>",
96
+ "<SPECIAL_93>",
97
+ "<SPECIAL_94>",
98
+ "<SPECIAL_95>",
99
+ "<SPECIAL_96>",
100
+ "<SPECIAL_97>",
101
+ "<SPECIAL_98>",
102
+ "<SPECIAL_99>",
103
+ "<SPECIAL_100>",
104
+ "<SPECIAL_101>",
105
+ "<SPECIAL_102>",
106
+ "<SPECIAL_103>",
107
+ "<SPECIAL_104>",
108
+ "<SPECIAL_105>",
109
+ "<SPECIAL_106>",
110
+ "<SPECIAL_107>",
111
+ "<SPECIAL_108>",
112
+ "<SPECIAL_109>",
113
+ "<SPECIAL_110>",
114
+ "<SPECIAL_111>",
115
+ "<SPECIAL_112>",
116
+ "<SPECIAL_113>",
117
+ "<SPECIAL_114>",
118
+ "<SPECIAL_115>",
119
+ "<SPECIAL_116>",
120
+ "<SPECIAL_117>",
121
+ "<SPECIAL_118>",
122
+ "<SPECIAL_119>",
123
+ "<SPECIAL_120>",
124
+ "<SPECIAL_121>",
125
+ "<SPECIAL_122>",
126
+ "<SPECIAL_123>",
127
+ "<SPECIAL_124>",
128
+ "<SPECIAL_125>",
129
+ "<SPECIAL_126>",
130
+ "<SPECIAL_127>",
131
+ "<SPECIAL_128>",
132
+ "<SPECIAL_129>",
133
+ "<SPECIAL_130>",
134
+ "<SPECIAL_131>",
135
+ "<SPECIAL_132>",
136
+ "<SPECIAL_133>",
137
+ "<SPECIAL_134>",
138
+ "<SPECIAL_135>",
139
+ "<SPECIAL_136>",
140
+ "<SPECIAL_137>",
141
+ "<SPECIAL_138>",
142
+ "<SPECIAL_139>",
143
+ "<SPECIAL_140>",
144
+ "<SPECIAL_141>",
145
+ "<SPECIAL_142>",
146
+ "<SPECIAL_143>",
147
+ "<SPECIAL_144>",
148
+ "<SPECIAL_145>",
149
+ "<SPECIAL_146>",
150
+ "<SPECIAL_147>",
151
+ "<SPECIAL_148>",
152
+ "<SPECIAL_149>",
153
+ "<SPECIAL_150>",
154
+ "<SPECIAL_151>",
155
+ "<SPECIAL_152>",
156
+ "<SPECIAL_153>",
157
+ "<SPECIAL_154>",
158
+ "<SPECIAL_155>",
159
+ "<SPECIAL_156>",
160
+ "<SPECIAL_157>",
161
+ "<SPECIAL_158>",
162
+ "<SPECIAL_159>",
163
+ "<SPECIAL_160>",
164
+ "<SPECIAL_161>",
165
+ "<SPECIAL_162>",
166
+ "<SPECIAL_163>",
167
+ "<SPECIAL_164>",
168
+ "<SPECIAL_165>",
169
+ "<SPECIAL_166>",
170
+ "<SPECIAL_167>",
171
+ "<SPECIAL_168>",
172
+ "<SPECIAL_169>",
173
+ "<SPECIAL_170>",
174
+ "<SPECIAL_171>",
175
+ "<SPECIAL_172>",
176
+ "<SPECIAL_173>",
177
+ "<SPECIAL_174>",
178
+ "<SPECIAL_175>",
179
+ "<SPECIAL_176>",
180
+ "<SPECIAL_177>",
181
+ "<SPECIAL_178>",
182
+ "<SPECIAL_179>",
183
+ "<SPECIAL_180>",
184
+ "<SPECIAL_181>",
185
+ "<SPECIAL_182>",
186
+ "<SPECIAL_183>",
187
+ "<SPECIAL_184>",
188
+ "<SPECIAL_185>",
189
+ "<SPECIAL_186>",
190
+ "<SPECIAL_187>",
191
+ "<SPECIAL_188>",
192
+ "<SPECIAL_189>",
193
+ "<SPECIAL_190>",
194
+ "<SPECIAL_191>",
195
+ "<SPECIAL_192>",
196
+ "<SPECIAL_193>",
197
+ "<SPECIAL_194>",
198
+ "<SPECIAL_195>",
199
+ "<SPECIAL_196>",
200
+ "<SPECIAL_197>",
201
+ "<SPECIAL_198>",
202
+ "<SPECIAL_199>",
203
+ "<SPECIAL_200>",
204
+ "<SPECIAL_201>",
205
+ "<SPECIAL_202>",
206
+ "<SPECIAL_203>",
207
+ "<SPECIAL_204>",
208
+ "<SPECIAL_205>",
209
+ "<SPECIAL_206>",
210
+ "<SPECIAL_207>",
211
+ "<SPECIAL_208>",
212
+ "<SPECIAL_209>",
213
+ "<SPECIAL_210>",
214
+ "<SPECIAL_211>",
215
+ "<SPECIAL_212>",
216
+ "<SPECIAL_213>",
217
+ "<SPECIAL_214>",
218
+ "<SPECIAL_215>",
219
+ "<SPECIAL_216>",
220
+ "<SPECIAL_217>",
221
+ "<SPECIAL_218>",
222
+ "<SPECIAL_219>",
223
+ "<SPECIAL_220>",
224
+ "<SPECIAL_221>",
225
+ "<SPECIAL_222>",
226
+ "<SPECIAL_223>",
227
+ "<SPECIAL_224>",
228
+ "<SPECIAL_225>",
229
+ "<SPECIAL_226>",
230
+ "<SPECIAL_227>",
231
+ "<SPECIAL_228>",
232
+ "<SPECIAL_229>",
233
+ "<SPECIAL_230>",
234
+ "<SPECIAL_231>",
235
+ "<SPECIAL_232>",
236
+ "<SPECIAL_233>",
237
+ "<SPECIAL_234>",
238
+ "<SPECIAL_235>",
239
+ "<SPECIAL_236>",
240
+ "<SPECIAL_237>",
241
+ "<SPECIAL_238>",
242
+ "<SPECIAL_239>",
243
+ "<SPECIAL_240>",
244
+ "<SPECIAL_241>",
245
+ "<SPECIAL_242>",
246
+ "<SPECIAL_243>",
247
+ "<SPECIAL_244>",
248
+ "<SPECIAL_245>",
249
+ "<SPECIAL_246>",
250
+ "<SPECIAL_247>",
251
+ "<SPECIAL_248>",
252
+ "<SPECIAL_249>",
253
+ "<SPECIAL_250>",
254
+ "<SPECIAL_251>",
255
+ "<SPECIAL_252>",
256
+ "<SPECIAL_253>",
257
+ "<SPECIAL_254>",
258
+ "<SPECIAL_255>",
259
+ "<SPECIAL_256>",
260
+ "<SPECIAL_257>",
261
+ "<SPECIAL_258>",
262
+ "<SPECIAL_259>",
263
+ "<SPECIAL_260>",
264
+ "<SPECIAL_261>",
265
+ "<SPECIAL_262>",
266
+ "<SPECIAL_263>",
267
+ "<SPECIAL_264>",
268
+ "<SPECIAL_265>",
269
+ "<SPECIAL_266>",
270
+ "<SPECIAL_267>",
271
+ "<SPECIAL_268>",
272
+ "<SPECIAL_269>",
273
+ "<SPECIAL_270>",
274
+ "<SPECIAL_271>",
275
+ "<SPECIAL_272>",
276
+ "<SPECIAL_273>",
277
+ "<SPECIAL_274>",
278
+ "<SPECIAL_275>",
279
+ "<SPECIAL_276>",
280
+ "<SPECIAL_277>",
281
+ "<SPECIAL_278>",
282
+ "<SPECIAL_279>",
283
+ "<SPECIAL_280>",
284
+ "<SPECIAL_281>",
285
+ "<SPECIAL_282>",
286
+ "<SPECIAL_283>",
287
+ "<SPECIAL_284>",
288
+ "<SPECIAL_285>",
289
+ "<SPECIAL_286>",
290
+ "<SPECIAL_287>",
291
+ "<SPECIAL_288>",
292
+ "<SPECIAL_289>",
293
+ "<SPECIAL_290>",
294
+ "<SPECIAL_291>",
295
+ "<SPECIAL_292>",
296
+ "<SPECIAL_293>",
297
+ "<SPECIAL_294>",
298
+ "<SPECIAL_295>",
299
+ "<SPECIAL_296>",
300
+ "<SPECIAL_297>",
301
+ "<SPECIAL_298>",
302
+ "<SPECIAL_299>",
303
+ "<SPECIAL_300>",
304
+ "<SPECIAL_301>",
305
+ "<SPECIAL_302>",
306
+ "<SPECIAL_303>",
307
+ "<SPECIAL_304>",
308
+ "<SPECIAL_305>",
309
+ "<SPECIAL_306>",
310
+ "<SPECIAL_307>",
311
+ "<SPECIAL_308>",
312
+ "<SPECIAL_309>",
313
+ "<SPECIAL_310>",
314
+ "<SPECIAL_311>",
315
+ "<SPECIAL_312>",
316
+ "<SPECIAL_313>",
317
+ "<SPECIAL_314>",
318
+ "<SPECIAL_315>",
319
+ "<SPECIAL_316>",
320
+ "<SPECIAL_317>",
321
+ "<SPECIAL_318>",
322
+ "<SPECIAL_319>",
323
+ "<SPECIAL_320>",
324
+ "<SPECIAL_321>",
325
+ "<SPECIAL_322>",
326
+ "<SPECIAL_323>",
327
+ "<SPECIAL_324>",
328
+ "<SPECIAL_325>",
329
+ "<SPECIAL_326>",
330
+ "<SPECIAL_327>",
331
+ "<SPECIAL_328>",
332
+ "<SPECIAL_329>",
333
+ "<SPECIAL_330>",
334
+ "<SPECIAL_331>",
335
+ "<SPECIAL_332>",
336
+ "<SPECIAL_333>",
337
+ "<SPECIAL_334>",
338
+ "<SPECIAL_335>",
339
+ "<SPECIAL_336>",
340
+ "<SPECIAL_337>",
341
+ "<SPECIAL_338>",
342
+ "<SPECIAL_339>",
343
+ "<SPECIAL_340>",
344
+ "<SPECIAL_341>",
345
+ "<SPECIAL_342>",
346
+ "<SPECIAL_343>",
347
+ "<SPECIAL_344>",
348
+ "<SPECIAL_345>",
349
+ "<SPECIAL_346>",
350
+ "<SPECIAL_347>",
351
+ "<SPECIAL_348>",
352
+ "<SPECIAL_349>",
353
+ "<SPECIAL_350>",
354
+ "<SPECIAL_351>",
355
+ "<SPECIAL_352>",
356
+ "<SPECIAL_353>",
357
+ "<SPECIAL_354>",
358
+ "<SPECIAL_355>",
359
+ "<SPECIAL_356>",
360
+ "<SPECIAL_357>",
361
+ "<SPECIAL_358>",
362
+ "<SPECIAL_359>",
363
+ "<SPECIAL_360>",
364
+ "<SPECIAL_361>",
365
+ "<SPECIAL_362>",
366
+ "<SPECIAL_363>",
367
+ "<SPECIAL_364>",
368
+ "<SPECIAL_365>",
369
+ "<SPECIAL_366>",
370
+ "<SPECIAL_367>",
371
+ "<SPECIAL_368>",
372
+ "<SPECIAL_369>",
373
+ "<SPECIAL_370>",
374
+ "<SPECIAL_371>",
375
+ "<SPECIAL_372>",
376
+ "<SPECIAL_373>",
377
+ "<SPECIAL_374>",
378
+ "<SPECIAL_375>",
379
+ "<SPECIAL_376>",
380
+ "<SPECIAL_377>",
381
+ "<SPECIAL_378>",
382
+ "<SPECIAL_379>",
383
+ "<SPECIAL_380>",
384
+ "<SPECIAL_381>",
385
+ "<SPECIAL_382>",
386
+ "<SPECIAL_383>",
387
+ "<SPECIAL_384>",
388
+ "<SPECIAL_385>",
389
+ "<SPECIAL_386>",
390
+ "<SPECIAL_387>",
391
+ "<SPECIAL_388>",
392
+ "<SPECIAL_389>",
393
+ "<SPECIAL_390>",
394
+ "<SPECIAL_391>",
395
+ "<SPECIAL_392>",
396
+ "<SPECIAL_393>",
397
+ "<SPECIAL_394>",
398
+ "<SPECIAL_395>",
399
+ "<SPECIAL_396>",
400
+ "<SPECIAL_397>",
401
+ "<SPECIAL_398>",
402
+ "<SPECIAL_399>",
403
+ "<SPECIAL_400>",
404
+ "<SPECIAL_401>",
405
+ "<SPECIAL_402>",
406
+ "<SPECIAL_403>",
407
+ "<SPECIAL_404>",
408
+ "<SPECIAL_405>",
409
+ "<SPECIAL_406>",
410
+ "<SPECIAL_407>",
411
+ "<SPECIAL_408>",
412
+ "<SPECIAL_409>",
413
+ "<SPECIAL_410>",
414
+ "<SPECIAL_411>",
415
+ "<SPECIAL_412>",
416
+ "<SPECIAL_413>",
417
+ "<SPECIAL_414>",
418
+ "<SPECIAL_415>",
419
+ "<SPECIAL_416>",
420
+ "<SPECIAL_417>",
421
+ "<SPECIAL_418>",
422
+ "<SPECIAL_419>",
423
+ "<SPECIAL_420>",
424
+ "<SPECIAL_421>",
425
+ "<SPECIAL_422>",
426
+ "<SPECIAL_423>",
427
+ "<SPECIAL_424>",
428
+ "<SPECIAL_425>",
429
+ "<SPECIAL_426>",
430
+ "<SPECIAL_427>",
431
+ "<SPECIAL_428>",
432
+ "<SPECIAL_429>",
433
+ "<SPECIAL_430>",
434
+ "<SPECIAL_431>",
435
+ "<SPECIAL_432>",
436
+ "<SPECIAL_433>",
437
+ "<SPECIAL_434>",
438
+ "<SPECIAL_435>",
439
+ "<SPECIAL_436>",
440
+ "<SPECIAL_437>",
441
+ "<SPECIAL_438>",
442
+ "<SPECIAL_439>",
443
+ "<SPECIAL_440>",
444
+ "<SPECIAL_441>",
445
+ "<SPECIAL_442>",
446
+ "<SPECIAL_443>",
447
+ "<SPECIAL_444>",
448
+ "<SPECIAL_445>",
449
+ "<SPECIAL_446>",
450
+ "<SPECIAL_447>",
451
+ "<SPECIAL_448>",
452
+ "<SPECIAL_449>",
453
+ "<SPECIAL_450>",
454
+ "<SPECIAL_451>",
455
+ "<SPECIAL_452>",
456
+ "<SPECIAL_453>",
457
+ "<SPECIAL_454>",
458
+ "<SPECIAL_455>",
459
+ "<SPECIAL_456>",
460
+ "<SPECIAL_457>",
461
+ "<SPECIAL_458>",
462
+ "<SPECIAL_459>",
463
+ "<SPECIAL_460>",
464
+ "<SPECIAL_461>",
465
+ "<SPECIAL_462>",
466
+ "<SPECIAL_463>",
467
+ "<SPECIAL_464>",
468
+ "<SPECIAL_465>",
469
+ "<SPECIAL_466>",
470
+ "<SPECIAL_467>",
471
+ "<SPECIAL_468>",
472
+ "<SPECIAL_469>",
473
+ "<SPECIAL_470>",
474
+ "<SPECIAL_471>",
475
+ "<SPECIAL_472>",
476
+ "<SPECIAL_473>",
477
+ "<SPECIAL_474>",
478
+ "<SPECIAL_475>",
479
+ "<SPECIAL_476>",
480
+ "<SPECIAL_477>",
481
+ "<SPECIAL_478>",
482
+ "<SPECIAL_479>",
483
+ "<SPECIAL_480>",
484
+ "<SPECIAL_481>",
485
+ "<SPECIAL_482>",
486
+ "<SPECIAL_483>",
487
+ "<SPECIAL_484>",
488
+ "<SPECIAL_485>",
489
+ "<SPECIAL_486>",
490
+ "<SPECIAL_487>",
491
+ "<SPECIAL_488>",
492
+ "<SPECIAL_489>",
493
+ "<SPECIAL_490>",
494
+ "<SPECIAL_491>",
495
+ "<SPECIAL_492>",
496
+ "<SPECIAL_493>",
497
+ "<SPECIAL_494>",
498
+ "<SPECIAL_495>",
499
+ "<SPECIAL_496>",
500
+ "<SPECIAL_497>",
501
+ "<SPECIAL_498>",
502
+ "<SPECIAL_499>",
503
+ "<SPECIAL_500>",
504
+ "<SPECIAL_501>",
505
+ "<SPECIAL_502>",
506
+ "<SPECIAL_503>",
507
+ "<SPECIAL_504>",
508
+ "<SPECIAL_505>",
509
+ "<SPECIAL_506>",
510
+ "<SPECIAL_507>",
511
+ "<SPECIAL_508>",
512
+ "<SPECIAL_509>",
513
+ "<SPECIAL_510>",
514
+ "<SPECIAL_511>",
515
+ "<SPECIAL_512>",
516
+ "<SPECIAL_513>",
517
+ "<SPECIAL_514>",
518
+ "<SPECIAL_515>",
519
+ "<SPECIAL_516>",
520
+ "<SPECIAL_517>",
521
+ "<SPECIAL_518>",
522
+ "<SPECIAL_519>",
523
+ "<SPECIAL_520>",
524
+ "<SPECIAL_521>",
525
+ "<SPECIAL_522>",
526
+ "<SPECIAL_523>",
527
+ "<SPECIAL_524>",
528
+ "<SPECIAL_525>",
529
+ "<SPECIAL_526>",
530
+ "<SPECIAL_527>",
531
+ "<SPECIAL_528>",
532
+ "<SPECIAL_529>",
533
+ "<SPECIAL_530>",
534
+ "<SPECIAL_531>",
535
+ "<SPECIAL_532>",
536
+ "<SPECIAL_533>",
537
+ "<SPECIAL_534>",
538
+ "<SPECIAL_535>",
539
+ "<SPECIAL_536>",
540
+ "<SPECIAL_537>",
541
+ "<SPECIAL_538>",
542
+ "<SPECIAL_539>",
543
+ "<SPECIAL_540>",
544
+ "<SPECIAL_541>",
545
+ "<SPECIAL_542>",
546
+ "<SPECIAL_543>",
547
+ "<SPECIAL_544>",
548
+ "<SPECIAL_545>",
549
+ "<SPECIAL_546>",
550
+ "<SPECIAL_547>",
551
+ "<SPECIAL_548>",
552
+ "<SPECIAL_549>",
553
+ "<SPECIAL_550>",
554
+ "<SPECIAL_551>",
555
+ "<SPECIAL_552>",
556
+ "<SPECIAL_553>",
557
+ "<SPECIAL_554>",
558
+ "<SPECIAL_555>",
559
+ "<SPECIAL_556>",
560
+ "<SPECIAL_557>",
561
+ "<SPECIAL_558>",
562
+ "<SPECIAL_559>",
563
+ "<SPECIAL_560>",
564
+ "<SPECIAL_561>",
565
+ "<SPECIAL_562>",
566
+ "<SPECIAL_563>",
567
+ "<SPECIAL_564>",
568
+ "<SPECIAL_565>",
569
+ "<SPECIAL_566>",
570
+ "<SPECIAL_567>",
571
+ "<SPECIAL_568>",
572
+ "<SPECIAL_569>",
573
+ "<SPECIAL_570>",
574
+ "<SPECIAL_571>",
575
+ "<SPECIAL_572>",
576
+ "<SPECIAL_573>",
577
+ "<SPECIAL_574>",
578
+ "<SPECIAL_575>",
579
+ "<SPECIAL_576>",
580
+ "<SPECIAL_577>",
581
+ "<SPECIAL_578>",
582
+ "<SPECIAL_579>",
583
+ "<SPECIAL_580>",
584
+ "<SPECIAL_581>",
585
+ "<SPECIAL_582>",
586
+ "<SPECIAL_583>",
587
+ "<SPECIAL_584>",
588
+ "<SPECIAL_585>",
589
+ "<SPECIAL_586>",
590
+ "<SPECIAL_587>",
591
+ "<SPECIAL_588>",
592
+ "<SPECIAL_589>",
593
+ "<SPECIAL_590>",
594
+ "<SPECIAL_591>",
595
+ "<SPECIAL_592>",
596
+ "<SPECIAL_593>",
597
+ "<SPECIAL_594>",
598
+ "<SPECIAL_595>",
599
+ "<SPECIAL_596>",
600
+ "<SPECIAL_597>",
601
+ "<SPECIAL_598>",
602
+ "<SPECIAL_599>",
603
+ "<SPECIAL_600>",
604
+ "<SPECIAL_601>",
605
+ "<SPECIAL_602>",
606
+ "<SPECIAL_603>",
607
+ "<SPECIAL_604>",
608
+ "<SPECIAL_605>",
609
+ "<SPECIAL_606>",
610
+ "<SPECIAL_607>",
611
+ "<SPECIAL_608>",
612
+ "<SPECIAL_609>",
613
+ "<SPECIAL_610>",
614
+ "<SPECIAL_611>",
615
+ "<SPECIAL_612>",
616
+ "<SPECIAL_613>",
617
+ "<SPECIAL_614>",
618
+ "<SPECIAL_615>",
619
+ "<SPECIAL_616>",
620
+ "<SPECIAL_617>",
621
+ "<SPECIAL_618>",
622
+ "<SPECIAL_619>",
623
+ "<SPECIAL_620>",
624
+ "<SPECIAL_621>",
625
+ "<SPECIAL_622>",
626
+ "<SPECIAL_623>",
627
+ "<SPECIAL_624>",
628
+ "<SPECIAL_625>",
629
+ "<SPECIAL_626>",
630
+ "<SPECIAL_627>",
631
+ "<SPECIAL_628>",
632
+ "<SPECIAL_629>",
633
+ "<SPECIAL_630>",
634
+ "<SPECIAL_631>",
635
+ "<SPECIAL_632>",
636
+ "<SPECIAL_633>",
637
+ "<SPECIAL_634>",
638
+ "<SPECIAL_635>",
639
+ "<SPECIAL_636>",
640
+ "<SPECIAL_637>",
641
+ "<SPECIAL_638>",
642
+ "<SPECIAL_639>",
643
+ "<SPECIAL_640>",
644
+ "<SPECIAL_641>",
645
+ "<SPECIAL_642>",
646
+ "<SPECIAL_643>",
647
+ "<SPECIAL_644>",
648
+ "<SPECIAL_645>",
649
+ "<SPECIAL_646>",
+ "<SPECIAL_647>",
+ "<SPECIAL_648>",
+ "<SPECIAL_649>",
+ "<SPECIAL_650>",
+ "<SPECIAL_651>",
+ "<SPECIAL_652>",
+ "<SPECIAL_653>",
+ "<SPECIAL_654>",
+ "<SPECIAL_655>",
+ "<SPECIAL_656>",
+ "<SPECIAL_657>",
+ "<SPECIAL_658>",
+ "<SPECIAL_659>",
+ "<SPECIAL_660>",
+ "<SPECIAL_661>",
+ "<SPECIAL_662>",
+ "<SPECIAL_663>",
+ "<SPECIAL_664>",
+ "<SPECIAL_665>",
+ "<SPECIAL_666>",
+ "<SPECIAL_667>",
+ "<SPECIAL_668>",
+ "<SPECIAL_669>",
+ "<SPECIAL_670>",
+ "<SPECIAL_671>",
+ "<SPECIAL_672>",
+ "<SPECIAL_673>",
+ "<SPECIAL_674>",
+ "<SPECIAL_675>",
+ "<SPECIAL_676>",
+ "<SPECIAL_677>",
+ "<SPECIAL_678>",
+ "<SPECIAL_679>",
+ "<SPECIAL_680>",
+ "<SPECIAL_681>",
+ "<SPECIAL_682>",
+ "<SPECIAL_683>",
+ "<SPECIAL_684>",
+ "<SPECIAL_685>",
+ "<SPECIAL_686>",
+ "<SPECIAL_687>",
+ "<SPECIAL_688>",
+ "<SPECIAL_689>",
+ "<SPECIAL_690>",
+ "<SPECIAL_691>",
+ "<SPECIAL_692>",
+ "<SPECIAL_693>",
+ "<SPECIAL_694>",
+ "<SPECIAL_695>",
+ "<SPECIAL_696>",
+ "<SPECIAL_697>",
+ "<SPECIAL_698>",
+ "<SPECIAL_699>",
+ "<SPECIAL_700>",
+ "<SPECIAL_701>",
+ "<SPECIAL_702>",
+ "<SPECIAL_703>",
+ "<SPECIAL_704>",
+ "<SPECIAL_705>",
+ "<SPECIAL_706>",
+ "<SPECIAL_707>",
+ "<SPECIAL_708>",
+ "<SPECIAL_709>",
+ "<SPECIAL_710>",
+ "<SPECIAL_711>",
+ "<SPECIAL_712>",
+ "<SPECIAL_713>",
+ "<SPECIAL_714>",
+ "<SPECIAL_715>",
+ "<SPECIAL_716>",
+ "<SPECIAL_717>",
+ "<SPECIAL_718>",
+ "<SPECIAL_719>",
+ "<SPECIAL_720>",
+ "<SPECIAL_721>",
+ "<SPECIAL_722>",
+ "<SPECIAL_723>",
+ "<SPECIAL_724>",
+ "<SPECIAL_725>",
+ "<SPECIAL_726>",
+ "<SPECIAL_727>",
+ "<SPECIAL_728>",
+ "<SPECIAL_729>",
+ "<SPECIAL_730>",
+ "<SPECIAL_731>",
+ "<SPECIAL_732>",
+ "<SPECIAL_733>",
+ "<SPECIAL_734>",
+ "<SPECIAL_735>",
+ "<SPECIAL_736>",
+ "<SPECIAL_737>",
+ "<SPECIAL_738>",
+ "<SPECIAL_739>",
+ "<SPECIAL_740>",
+ "<SPECIAL_741>",
+ "<SPECIAL_742>",
+ "<SPECIAL_743>",
+ "<SPECIAL_744>",
+ "<SPECIAL_745>",
+ "<SPECIAL_746>",
+ "<SPECIAL_747>",
+ "<SPECIAL_748>",
+ "<SPECIAL_749>",
+ "<SPECIAL_750>",
+ "<SPECIAL_751>",
+ "<SPECIAL_752>",
+ "<SPECIAL_753>",
+ "<SPECIAL_754>",
+ "<SPECIAL_755>",
+ "<SPECIAL_756>",
+ "<SPECIAL_757>",
+ "<SPECIAL_758>",
+ "<SPECIAL_759>",
+ "<SPECIAL_760>",
+ "<SPECIAL_761>",
+ "<SPECIAL_762>",
+ "<SPECIAL_763>",
+ "<SPECIAL_764>",
+ "<SPECIAL_765>",
+ "<SPECIAL_766>",
+ "<SPECIAL_767>",
+ "<SPECIAL_768>",
+ "<SPECIAL_769>",
+ "<SPECIAL_770>",
+ "<SPECIAL_771>",
+ "<SPECIAL_772>",
+ "<SPECIAL_773>",
+ "<SPECIAL_774>",
+ "<SPECIAL_775>",
+ "<SPECIAL_776>",
+ "<SPECIAL_777>",
+ "<SPECIAL_778>",
+ "<SPECIAL_779>",
+ "<SPECIAL_780>",
+ "<SPECIAL_781>",
+ "<SPECIAL_782>",
+ "<SPECIAL_783>",
+ "<SPECIAL_784>",
+ "<SPECIAL_785>",
+ "<SPECIAL_786>",
+ "<SPECIAL_787>",
+ "<SPECIAL_788>",
+ "<SPECIAL_789>",
+ "<SPECIAL_790>",
+ "<SPECIAL_791>",
+ "<SPECIAL_792>",
+ "<SPECIAL_793>",
+ "<SPECIAL_794>",
+ "<SPECIAL_795>",
+ "<SPECIAL_796>",
+ "<SPECIAL_797>",
+ "<SPECIAL_798>",
+ "<SPECIAL_799>",
+ "<SPECIAL_800>",
+ "<SPECIAL_801>",
+ "<SPECIAL_802>",
+ "<SPECIAL_803>",
+ "<SPECIAL_804>",
+ "<SPECIAL_805>",
+ "<SPECIAL_806>",
+ "<SPECIAL_807>",
+ "<SPECIAL_808>",
+ "<SPECIAL_809>",
+ "<SPECIAL_810>",
+ "<SPECIAL_811>",
+ "<SPECIAL_812>",
+ "<SPECIAL_813>",
+ "<SPECIAL_814>",
+ "<SPECIAL_815>",
+ "<SPECIAL_816>",
+ "<SPECIAL_817>",
+ "<SPECIAL_818>",
+ "<SPECIAL_819>",
+ "<SPECIAL_820>",
+ "<SPECIAL_821>",
+ "<SPECIAL_822>",
+ "<SPECIAL_823>",
+ "<SPECIAL_824>",
+ "<SPECIAL_825>",
+ "<SPECIAL_826>",
+ "<SPECIAL_827>",
+ "<SPECIAL_828>",
+ "<SPECIAL_829>",
+ "<SPECIAL_830>",
+ "<SPECIAL_831>",
+ "<SPECIAL_832>",
+ "<SPECIAL_833>",
+ "<SPECIAL_834>",
+ "<SPECIAL_835>",
+ "<SPECIAL_836>",
+ "<SPECIAL_837>",
+ "<SPECIAL_838>",
+ "<SPECIAL_839>",
+ "<SPECIAL_840>",
+ "<SPECIAL_841>",
+ "<SPECIAL_842>",
+ "<SPECIAL_843>",
+ "<SPECIAL_844>",
+ "<SPECIAL_845>",
+ "<SPECIAL_846>",
+ "<SPECIAL_847>",
+ "<SPECIAL_848>",
+ "<SPECIAL_849>",
+ "<SPECIAL_850>",
+ "<SPECIAL_851>",
+ "<SPECIAL_852>",
+ "<SPECIAL_853>",
+ "<SPECIAL_854>",
+ "<SPECIAL_855>",
+ "<SPECIAL_856>",
+ "<SPECIAL_857>",
+ "<SPECIAL_858>",
+ "<SPECIAL_859>",
+ "<SPECIAL_860>",
+ "<SPECIAL_861>",
+ "<SPECIAL_862>",
+ "<SPECIAL_863>",
+ "<SPECIAL_864>",
+ "<SPECIAL_865>",
+ "<SPECIAL_866>",
+ "<SPECIAL_867>",
+ "<SPECIAL_868>",
+ "<SPECIAL_869>",
+ "<SPECIAL_870>",
+ "<SPECIAL_871>",
+ "<SPECIAL_872>",
+ "<SPECIAL_873>",
+ "<SPECIAL_874>",
+ "<SPECIAL_875>",
+ "<SPECIAL_876>",
+ "<SPECIAL_877>",
+ "<SPECIAL_878>",
+ "<SPECIAL_879>",
+ "<SPECIAL_880>",
+ "<SPECIAL_881>",
+ "<SPECIAL_882>",
+ "<SPECIAL_883>",
+ "<SPECIAL_884>",
+ "<SPECIAL_885>",
+ "<SPECIAL_886>",
+ "<SPECIAL_887>",
+ "<SPECIAL_888>",
+ "<SPECIAL_889>",
+ "<SPECIAL_890>",
+ "<SPECIAL_891>",
+ "<SPECIAL_892>",
+ "<SPECIAL_893>",
+ "<SPECIAL_894>",
+ "<SPECIAL_895>",
+ "<SPECIAL_896>",
+ "<SPECIAL_897>",
+ "<SPECIAL_898>",
+ "<SPECIAL_899>",
+ "<SPECIAL_900>",
+ "<SPECIAL_901>",
+ "<SPECIAL_902>",
+ "<SPECIAL_903>",
+ "<SPECIAL_904>",
+ "<SPECIAL_905>",
+ "<SPECIAL_906>",
+ "<SPECIAL_907>",
+ "<SPECIAL_908>",
+ "<SPECIAL_909>",
+ "<SPECIAL_910>",
+ "<SPECIAL_911>",
+ "<SPECIAL_912>",
+ "<SPECIAL_913>",
+ "<SPECIAL_914>",
+ "<SPECIAL_915>",
+ "<SPECIAL_916>",
+ "<SPECIAL_917>",
+ "<SPECIAL_918>",
+ "<SPECIAL_919>",
+ "<SPECIAL_920>",
+ "<SPECIAL_921>",
+ "<SPECIAL_922>",
+ "<SPECIAL_923>",
+ "<SPECIAL_924>",
+ "<SPECIAL_925>",
+ "<SPECIAL_926>",
+ "<SPECIAL_927>",
+ "<SPECIAL_928>",
+ "<SPECIAL_929>",
+ "<SPECIAL_930>",
+ "<SPECIAL_931>",
+ "<SPECIAL_932>",
+ "<SPECIAL_933>",
+ "<SPECIAL_934>",
+ "<SPECIAL_935>",
+ "<SPECIAL_936>",
+ "<SPECIAL_937>",
+ "<SPECIAL_938>",
+ "<SPECIAL_939>",
+ "<SPECIAL_940>",
+ "<SPECIAL_941>",
+ "<SPECIAL_942>",
+ "<SPECIAL_943>",
+ "<SPECIAL_944>",
+ "<SPECIAL_945>",
+ "<SPECIAL_946>",
+ "<SPECIAL_947>",
+ "<SPECIAL_948>",
+ "<SPECIAL_949>",
+ "<SPECIAL_950>",
+ "<SPECIAL_951>",
+ "<SPECIAL_952>",
+ "<SPECIAL_953>",
+ "<SPECIAL_954>",
+ "<SPECIAL_955>",
+ "<SPECIAL_956>",
+ "<SPECIAL_957>",
+ "<SPECIAL_958>",
+ "<SPECIAL_959>",
+ "<SPECIAL_960>",
+ "<SPECIAL_961>",
+ "<SPECIAL_962>",
+ "<SPECIAL_963>",
+ "<SPECIAL_964>",
+ "<SPECIAL_965>",
+ "<SPECIAL_966>",
+ "<SPECIAL_967>",
+ "<SPECIAL_968>",
+ "<SPECIAL_969>",
+ "<SPECIAL_970>",
+ "<SPECIAL_971>",
+ "<SPECIAL_972>",
+ "<SPECIAL_973>",
+ "<SPECIAL_974>",
+ "<SPECIAL_975>",
+ "<SPECIAL_976>",
+ "<SPECIAL_977>",
+ "<SPECIAL_978>",
+ "<SPECIAL_979>",
+ "<SPECIAL_980>",
+ "<SPECIAL_981>",
+ "<SPECIAL_982>",
+ "<SPECIAL_983>",
+ "<SPECIAL_984>",
+ "<SPECIAL_985>",
+ "<SPECIAL_986>",
+ "<SPECIAL_987>",
+ "<SPECIAL_988>",
+ "<SPECIAL_989>",
+ "<SPECIAL_990>",
+ "<SPECIAL_991>",
+ "<SPECIAL_992>",
+ "<SPECIAL_993>",
+ "<SPECIAL_994>",
+ "<SPECIAL_995>",
+ "<SPECIAL_996>",
+ "<SPECIAL_997>",
+ "<SPECIAL_998>",
+ "<SPECIAL_999>"
+ ],
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
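The named-token entries added above all follow the same schema: a literal `content` string plus whitespace-handling flags (`lstrip`, `rstrip`, `normalized`, `single_word`). A minimal sketch of reading them back with Python's standard `json` module, using a truncated copy of the committed structure (the `additional_special_tokens` array is omitted here for brevity):

```python
import json

# Truncated mirror of the special_tokens_map.json structure in this diff:
# the four named tokens, each with the same flag set.
raw = """
{
  "bos_token": {"content": "<s>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "eos_token": {"content": "</s>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "pad_token": {"content": "<pad>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "unk_token": {"content": "<unk>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
}
"""

tokens = json.loads(raw)
print(tokens["bos_token"]["content"])  # <s>
print(tokens["eos_token"]["content"])  # </s>
```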
tekken.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6e2501687ccd0e1f30f36319eaf2b46958b897811e246cd8eb5d385b9e3de7d1
+ size 19399895
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:23ad081f384b2bdb3c97f6e5461dfce52c1174c7328854a55006988f0fef9da7
+ size 17078019
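The `tekken.json` and `tokenizer.json` additions are Git LFS pointer stubs, not the files themselves: three `key value` lines naming the spec version, the content's SHA-256 object id, and the payload size in bytes. A minimal sketch of parsing such a pointer (the real content is fetched from LFS storage by the `oid`):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The tekken.json pointer committed in this diff.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:6e2501687ccd0e1f30f36319eaf2b46958b897811e246cd8eb5d385b9e3de7d1
size 19399895
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 19399895
```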
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff