|
/home/floriadmin/miniforge3/envs/mlc/bin/python -m mlc_llm gen_config ../dist/models/XAgentLlama-7B-preview --quantization q8f32_1 --conv-template llama-2 --output /tmp/tmpyuo2_jf8
|
[2024-03-18 20:20:37] INFO auto_config.py:115: Found model configuration: ../dist/models/XAgentLlama-7B-preview/config.json
[2024-03-18 20:20:37] INFO auto_config.py:153: Found model type: llama. Use `--model-type` to override.
[2024-03-18 20:20:37] INFO llama_model.py:52: context_window_size not found in config.json. Falling back to max_position_embeddings (16384)
[2024-03-18 20:20:37] INFO llama_model.py:72: prefill_chunk_size defaults to context_window_size (16384)
[2024-03-18 20:20:37] INFO config.py:106: Overriding max_batch_size from 1 to 80
[2024-03-18 20:20:37] INFO gen_config.py:133: [generation_config.json] Setting bos_token_id: 1
[2024-03-18 20:20:37] INFO gen_config.py:133: [generation_config.json] Setting eos_token_id: 2
[2024-03-18 20:20:37] INFO gen_config.py:145: Found tokenizer config: ../dist/models/XAgentLlama-7B-preview/tokenizer.model. Copying to /tmp/tmpyuo2_jf8/tokenizer.model
[2024-03-18 20:20:37] INFO gen_config.py:145: Found tokenizer config: ../dist/models/XAgentLlama-7B-preview/tokenizer.json. Copying to /tmp/tmpyuo2_jf8/tokenizer.json
[2024-03-18 20:20:37] INFO gen_config.py:147: Not found tokenizer config: ../dist/models/XAgentLlama-7B-preview/vocab.json
[2024-03-18 20:20:37] INFO gen_config.py:147: Not found tokenizer config: ../dist/models/XAgentLlama-7B-preview/merges.txt
[2024-03-18 20:20:37] INFO gen_config.py:147: Not found tokenizer config: ../dist/models/XAgentLlama-7B-preview/added_tokens.json
[2024-03-18 20:20:37] INFO gen_config.py:145: Found tokenizer config: ../dist/models/XAgentLlama-7B-preview/tokenizer_config.json. Copying to /tmp/tmpyuo2_jf8/tokenizer_config.json
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting pad_token_id: 0
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting temperature: 0.7
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting presence_penalty: 0.0
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting frequency_penalty: 0.0
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting repetition_penalty: 1.0
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting top_p: 0.95
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting mean_gen_len: 128
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting max_gen_len: 512
[2024-03-18 20:20:37] INFO gen_config.py:75: [System default] Setting shift_fill_factor: 0.3
[2024-03-18 20:20:37] INFO gen_config.py:198: Dumping configuration file to: /tmp/tmpyuo2_jf8/mlc-chat-config.json
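At this point the chat configuration is complete. To sanity-check it, the dumped file can be loaded directly; a minimal sketch (field names assumed to follow the usual mlc-chat-config.json layout, with values matching the log above):

```python
import json

# Path comes from the `--output` directory passed to gen_config above.
with open("/tmp/tmpyuo2_jf8/mlc-chat-config.json") as f:
    cfg = json.load(f)

# These keys should mirror the logged values: llama model type, q8f32_1
# quantization, llama-2 conversation template, 16384-token context,
# and the sampling defaults (temperature 0.7, top_p 0.95).
for key in ("model_type", "quantization", "conv_template",
            "context_window_size", "temperature", "top_p"):
    print(key, "=", cfg.get(key))
```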
|
/home/floriadmin/miniforge3/envs/mlc/bin/python -m mlc_llm convert_weight ../dist/models/XAgentLlama-7B-preview --quantization q8f32_1 --source-format auto --output /tmp/tmpyuo2_jf8
|
[2024-03-18 20:20:38] INFO auto_config.py:115: Found model configuration: ../dist/models/XAgentLlama-7B-preview/config.json
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:0
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:1
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:2
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:3
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:4
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:5
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:6
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:7
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:8
[2024-03-18 20:20:39] INFO auto_device.py:76: Found device: cuda:9
[2024-03-18 20:20:40] INFO auto_device.py:85: Not found device: rocm:0
[2024-03-18 20:20:41] INFO auto_device.py:85: Not found device: metal:0
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:0
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:1
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:2
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:3
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:4
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:5
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:6
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:7
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:8
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:9
[2024-03-18 20:20:44] INFO auto_device.py:76: Found device: vulkan:10
[2024-03-18 20:20:45] INFO auto_device.py:85: Not found device: opencl:0
[2024-03-18 20:20:45] INFO auto_device.py:33: Using device: cuda:0
[2024-03-18 20:20:45] INFO auto_weight.py:70: Finding weights in: ../dist/models/XAgentLlama-7B-preview
[2024-03-18 20:20:45] INFO auto_weight.py:129: Found source weight format: huggingface-torch. Source configuration: ../dist/models/XAgentLlama-7B-preview/pytorch_model.bin
[2024-03-18 20:20:45] INFO auto_weight.py:143: Found source weight format: huggingface-safetensor. Source configuration: ../dist/models/XAgentLlama-7B-preview/model.safetensors.index.json
[2024-03-18 20:20:45] INFO auto_weight.py:106: Using source weight configuration: ../dist/models/XAgentLlama-7B-preview/pytorch_model.bin. Use `--source` to override.
[2024-03-18 20:20:45] INFO auto_weight.py:110: Using source weight format: huggingface-torch. Use `--source-format` to override.
[2024-03-18 20:20:45] INFO auto_config.py:153: Found model type: llama. Use `--model-type` to override.
[2024-03-18 20:20:45] INFO llama_model.py:52: context_window_size not found in config.json. Falling back to max_position_embeddings (16384)
[2024-03-18 20:20:45] INFO llama_model.py:72: prefill_chunk_size defaults to context_window_size (16384)
[2024-03-18 20:20:48] INFO huggingface_loader.py:182: Loading HF parameters from: ../dist/models/XAgentLlama-7B-preview/pytorch_model.bin
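Note that both a huggingface-torch source (pytorch_model.bin) and a huggingface-safetensor source (model.safetensors.index.json) were detected; the converter settled on the torch checkpoint in this run. As the log itself hints, `--source` or `--source-format` (e.g. `--source-format huggingface-safetensor`) can be passed to load the safetensors shards instead.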
|
Weight conversion with arguments:
  --config          ../dist/models/XAgentLlama-7B-preview/config.json
  --quantization    GroupQuantize(name='q8f32_1', kind='group-quant', group_size=32, quantize_dtype='int8', storage_dtype='uint32', model_dtype='float32', linear_weight_layout='NK', quantize_embedding=True, quantize_final_fc=True, num_elem_per_storage=4, num_storage_per_group=8, max_int_value=127)
  --model-type      llama
  --device          cuda:0
  --source          ../dist/models/XAgentLlama-7B-preview/pytorch_model.bin
  --source-format   huggingface-torch
  --output          /tmp/tmpyuo2_jf8
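These q8f32_1 settings fully determine the packed shapes reported in the dump below: with quantize_dtype int8 and storage_dtype uint32, each uint32 word holds 4 quantized elements (num_elem_per_storage=4), and each group of group_size=32 elements gets one float32 scale. A small sketch of the implied shape arithmetic (illustrative only, not MLC's actual packing code):

```python
def packed_shapes(rows, cols, group_size=32, elems_per_word=4):
    """Shape arithmetic implied by q8f32_1 (a sketch, not MLC's packing code)."""
    q_weight = (rows, cols // elems_per_word)  # int8 values packed into uint32 words
    q_scale = (rows, cols // group_size)       # one float32 scale per 32-element group
    return q_weight, q_scale

# Matches the log below:
# embed_tokens (32016, 4096) -> q_weight (32016, 1024), q_scale (32016, 128)
print(packed_shapes(32016, 4096))
# mlp.down_proj (4096, 11008) -> q_weight (4096, 2752), q_scale (4096, 344)
print(packed_shapes(4096, 11008))
```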
|
Start storing to cache /tmp/tmpyuo2_jf8
|
[2024-03-18 20:21:18] INFO group_quantization.py:232: Compiling quantize function for key: ((32016, 4096), float32, cuda, axis=1, output_transpose=False)
[2024-03-18 20:21:19] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.embed_tokens.q_weight", shape: (32016, 1024), dtype: uint32
[2024-03-18 20:21:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.embed_tokens.q_scale", shape: (32016, 128), dtype: float32
/home/floriadmin/miniforge3/envs/mlc/lib/python3.11/site-packages/numpy/core/getlimits.py:549: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
  setattr(self, word, getattr(machar, word).flat[0])
/home/floriadmin/miniforge3/envs/mlc/lib/python3.11/site-packages/numpy/core/getlimits.py:89: UserWarning: The value of the smallest subnormal for <class 'numpy.float32'> type is zero.
  return self._float_to_str(self.smallest_subnormal)
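The numpy UserWarning above is harmless for this workflow: it typically appears when the process runs with flush-to-zero enabled (often set during GPU runtime initialization), so float32's smallest subnormal reads back as 0.0. The check that trips it can be reproduced directly:

```python
import numpy as np

# If flush-to-zero is active in this process, the smallest float32
# subnormal reads back as 0.0, which is what triggers the warning.
print(np.finfo(np.float32).smallest_subnormal)
```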
[2024-03-18 20:21:20] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.norm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:22] INFO huggingface_loader.py:164: [Quantized] Parameter: "lm_head.q_weight", shape: (32016, 1024), dtype: uint32
[2024-03-18 20:21:22] INFO huggingface_loader.py:164: [Quantized] Parameter: "lm_head.q_scale", shape: (32016, 128), dtype: float32
[2024-03-18 20:21:23] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:23] INFO group_quantization.py:232: Compiling quantize function for key: ((12288, 4096), float32, cuda, axis=1, output_transpose=False)
[2024-03-18 20:21:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:24] INFO group_quantization.py:232: Compiling quantize function for key: ((4096, 4096), float32, cuda, axis=1, output_transpose=False)
[2024-03-18 20:21:25] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:25] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:25] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.0.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:26] INFO group_quantization.py:232: Compiling quantize function for key: ((22016, 4096), float32, cuda, axis=1, output_transpose=False)
[2024-03-18 20:21:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:28] INFO group_quantization.py:232: Compiling quantize function for key: ((4096, 11008), float32, cuda, axis=1, output_transpose=False)
[2024-03-18 20:21:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.0.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:28] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:29] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:29] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:30] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:30] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:30] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.1.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:31] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.1.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.2.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:35] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.2.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:37] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:37] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.3.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:39] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:40] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:41] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:41] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.3.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:41] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.4.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:44] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:44] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:45] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:45] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.4.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:45] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:46] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:46] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:47] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:47] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:47] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.5.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:48] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:49] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.5.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:51] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:51] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:51] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:51] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.6.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:53] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:53] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.6.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:55] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:55] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:55] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:55] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:55] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.7.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:57] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:21:57] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:21:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:21:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.7.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:21:58] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:21:59] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:21:59] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:21:59] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:21:59] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:21:59] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.8.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:01] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:01] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.8.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:02] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:03] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:03] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:03] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:03] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:03] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.9.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:05] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:05] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:06] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:06] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.9.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:06] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:07] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:07] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:08] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:08] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:08] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.10.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:09] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:10] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:10] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:11] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.10.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:11] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:11] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:12] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:12] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:12] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:12] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.11.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:13] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:14] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:14] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:15] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.11.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:15] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:15] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:16] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:16] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:16] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:16] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.12.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:17] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:18] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:18] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:19] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.12.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:19] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:20] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.13.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:21] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:22] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:23] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:23] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.13.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:23] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:24] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.14.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:26] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:26] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.14.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:27] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:29] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:29] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:29] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.15.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:31] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.15.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:33] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:33] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:34] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.16.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:35] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:37] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.16.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:37] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:37] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:37] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:38] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.17.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:39] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:40] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:40] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:41] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.17.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:41] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.18.input_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:41] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.self_attn.qkv_proj.q_weight", shape: (12288, 1024), dtype: uint32
[2024-03-18 20:22:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.self_attn.qkv_proj.q_scale", shape: (12288, 128), dtype: float32
[2024-03-18 20:22:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.self_attn.o_proj.q_weight", shape: (4096, 1024), dtype: uint32
[2024-03-18 20:22:42] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.self_attn.o_proj.q_scale", shape: (4096, 128), dtype: float32
[2024-03-18 20:22:42] INFO huggingface_loader.py:172: [Not quantized] Parameter: "model.layers.18.post_attention_layernorm.weight", shape: (4096,), dtype: float32
[2024-03-18 20:22:43] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.mlp.gate_up_proj.q_weight", shape: (22016, 1024), dtype: uint32
[2024-03-18 20:22:44] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.mlp.gate_up_proj.q_scale", shape: (22016, 128), dtype: float32
[2024-03-18 20:22:45] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.mlp.down_proj.q_weight", shape: (4096, 2752), dtype: uint32
[2024-03-18 20:22:45] INFO huggingface_loader.py:164: [Quantized] Parameter: "model.layers.18.mlp.down_proj.q_scale", shape: (4096, 344), dtype: float32
[2024-03-18 20:22:45] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.19.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
60%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 117/195 [01:28<01:00, 1.28it/s]
[2024-03-18 20:22:45] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
60%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 117/195 [01:29<01:00, 1.28it/s]
[2024-03-18 20:22:46] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
60%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 117/195 [01:29<01:00, 1.28it/s]
61%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 119/195 [01:29<00:50, 1.51it/s]
[2024-03-18 20:22:46] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
61%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 119/195 [01:30<00:50, 1.51it/s]
[2024-03-18 20:22:46] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
61%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 119/195 [01:30<00:50, 1.51it/s]
62%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 120/195 [01:30<00:43, 1.71it/s]
[2024-03-18 20:22:46] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.19.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
62%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 120/195 [01:30<00:43, 1.71it/s]
[2024-03-18 20:22:47] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
62%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 120/195 [01:31<00:43, 1.71it/s]
[2024-03-18 20:22:48] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
62%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 120/195 [01:31<00:43, 1.71it/s]
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 122/195 [01:32<00:53, 1.38it/s]
[2024-03-18 20:22:49] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 122/195 [01:32<00:53, 1.38it/s]
[2024-03-18 20:22:49] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.19.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 122/195 [01:32<00:53, 1.38it/s]
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 123/195 [01:32<00:54, 1.33it/s]
[2024-03-18 20:22:49] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.20.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 123/195 [01:32<00:54, 1.33it/s]
[2024-03-18 20:22:49] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 123/195 [01:33<00:54, 1.33it/s]
[2024-03-18 20:22:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
63%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 123/195 [01:33<00:54, 1.33it/s]
64%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 125/195 [01:33<00:45, 1.54it/s]
[2024-03-18 20:22:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
64%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 125/195 [01:34<00:45, 1.54it/s]
[2024-03-18 20:22:50] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
64%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 125/195 [01:34<00:45, 1.54it/s]
65%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 126/195 [01:34<00:39, 1.74it/s]
[2024-03-18 20:22:50] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.20.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
65%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 126/195 [01:34<00:39, 1.74it/s]
[2024-03-18 20:22:51] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
65%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 126/195 [01:35<00:39, 1.74it/s]
[2024-03-18 20:22:52] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
65%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 126/195 [01:35<00:39, 1.74it/s]
66%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 128/195 [01:35<00:47, 1.42it/s]
[2024-03-18 20:22:53] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
66%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 128/195 [01:36<00:47, 1.42it/s]
[2024-03-18 20:22:53] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.20.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
66%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 128/195 [01:36<00:47, 1.42it/s]
66%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 129/195 [01:36<00:49, 1.34it/s]
[2024-03-18 20:22:53] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.21.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
66%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 129/195 [01:36<00:49, 1.34it/s]
[2024-03-18 20:22:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
66%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 129/195 [01:37<00:49, 1.34it/s]
[2024-03-18 20:22:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
66%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 129/195 [01:37<00:49, 1.34it/s]
67%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 131/195 [01:37<00:43, 1.48it/s]
[2024-03-18 20:22:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
67%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 131/195 [01:38<00:43, 1.48it/s]
[2024-03-18 20:22:54] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
67%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 131/195 [01:38<00:43, 1.48it/s]
68%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 132/195 [01:38<00:37, 1.66it/s]
[2024-03-18 20:22:54] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.21.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
68%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 132/195 [01:38<00:37, 1.66it/s]
[2024-03-18 20:22:56] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
68%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 132/195 [01:39<00:37, 1.66it/s]
[2024-03-18 20:22:56] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
68%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 132/195 [01:40<00:37, 1.66it/s]
69%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 134/195 [01:40<00:45, 1.35it/s]
[2024-03-18 20:22:57] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
69%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 134/195 [01:40<00:45, 1.35it/s]
[2024-03-18 20:22:57] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.21.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
69%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 134/195 [01:41<00:45, 1.35it/s]
69%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 135/195 [01:41<00:45, 1.31it/s]
[2024-03-18 20:22:57] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.22.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
69%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 135/195 [01:41<00:45, 1.31it/s]
[2024-03-18 20:22:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
69%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 135/195 [01:41<00:45, 1.31it/s]
[2024-03-18 20:22:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
69%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 135/195 [01:42<00:45, 1.31it/s]
70%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 137/195 [01:42<00:38, 1.53it/s]
[2024-03-18 20:22:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
70%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 137/195 [01:42<00:38, 1.53it/s]
[2024-03-18 20:22:58] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
70%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 137/195 [01:42<00:38, 1.53it/s]
71%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 138/195 [01:42<00:33, 1.72it/s]
[2024-03-18 20:22:58] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.22.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
71%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 138/195 [01:42<00:33, 1.72it/s]
[2024-03-18 20:23:00] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
71%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 138/195 [01:43<00:33, 1.72it/s]
[2024-03-18 20:23:00] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
71%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 138/195 [01:44<00:33, 1.72it/s]
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 140/195 [01:44<00:40, 1.37it/s]
[2024-03-18 20:23:01] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 140/195 [01:44<00:40, 1.37it/s]
[2024-03-18 20:23:01] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.22.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 140/195 [01:45<00:40, 1.37it/s]
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 141/195 [01:45<00:41, 1.30it/s]
[2024-03-18 20:23:01] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.23.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 141/195 [01:45<00:41, 1.30it/s]
[2024-03-18 20:23:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 141/195 [01:45<00:41, 1.30it/s]
[2024-03-18 20:23:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
72%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 141/195 [01:46<00:41, 1.30it/s]
73%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 143/195 [01:46<00:34, 1.51it/s]
[2024-03-18 20:23:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
73%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 143/195 [01:46<00:34, 1.51it/s]
[2024-03-18 20:23:02] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
73%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 143/195 [01:46<00:34, 1.51it/s]
74%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 144/195 [01:46<00:29, 1.71it/s]
[2024-03-18 20:23:02] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.23.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
74%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 144/195 [01:46<00:29, 1.71it/s]
[2024-03-18 20:23:04] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
74%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 144/195 [01:47<00:29, 1.71it/s]
[2024-03-18 20:23:04] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
74%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 144/195 [01:48<00:29, 1.71it/s]
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 146/195 [01:48<00:36, 1.36it/s]
[2024-03-18 20:23:05] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 146/195 [01:49<00:36, 1.36it/s]
[2024-03-18 20:23:05] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.23.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 146/195 [01:49<00:36, 1.36it/s]
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 147/195 [01:49<00:36, 1.30it/s]
[2024-03-18 20:23:05] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.24.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 147/195 [01:49<00:36, 1.30it/s]
[2024-03-18 20:23:06] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 147/195 [01:50<00:36, 1.30it/s]
[2024-03-18 20:23:06] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
75%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 147/195 [01:50<00:36, 1.30it/s]
76%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 149/195 [01:50<00:31, 1.48it/s]
[2024-03-18 20:23:07] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
76%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 149/195 [01:50<00:31, 1.48it/s]
[2024-03-18 20:23:07] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
76%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 149/195 [01:50<00:31, 1.48it/s]
77%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 150/195 [01:50<00:26, 1.68it/s]
[2024-03-18 20:23:07] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.24.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
77%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 150/195 [01:50<00:26, 1.68it/s]
[2024-03-18 20:23:09] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
77%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 150/195 [01:52<00:26, 1.68it/s]
[2024-03-18 20:23:09] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
77%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 150/195 [01:53<00:26, 1.68it/s]
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 152/195 [01:53<00:36, 1.17it/s]
[2024-03-18 20:23:10] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 152/195 [01:54<00:36, 1.17it/s]
[2024-03-18 20:23:10] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.24.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 152/195 [01:54<00:36, 1.17it/s]
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 153/195 [01:54<00:37, 1.13it/s]
[2024-03-18 20:23:10] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.25.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 153/195 [01:54<00:37, 1.13it/s]
[2024-03-18 20:23:11] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 153/195 [01:55<00:37, 1.13it/s]
[2024-03-18 20:23:11] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
78%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 153/195 [01:55<00:37, 1.13it/s]
79%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 155/195 [01:55<00:29, 1.34it/s]
[2024-03-18 20:23:11] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
79%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 155/195 [01:55<00:29, 1.34it/s]
[2024-03-18 20:23:12] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
79%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 155/195 [01:55<00:29, 1.34it/s]
80%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 156/195 [01:55<00:25, 1.52it/s]
[2024-03-18 20:23:12] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.25.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
80%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 156/195 [01:55<00:25, 1.52it/s]
[2024-03-18 20:23:13] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
80%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 156/195 [01:57<00:25, 1.52it/s]
[2024-03-18 20:23:14] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
80%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 156/195 [01:57<00:25, 1.52it/s]
81%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 158/195 [01:57<00:29, 1.24it/s]
[2024-03-18 20:23:14] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
81%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 158/195 [01:58<00:29, 1.24it/s]
[2024-03-18 20:23:15] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.25.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
81%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 158/195 [01:58<00:29, 1.24it/s]
82%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 159/195 [01:58<00:29, 1.21it/s]
[2024-03-18 20:23:15] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.26.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
82%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 159/195 [01:58<00:29, 1.21it/s]
[2024-03-18 20:23:15] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
82%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 159/195 [01:59<00:29, 1.21it/s]
[2024-03-18 20:23:15] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
82%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 159/195 [01:59<00:29, 1.21it/s]
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 161/195 [01:59<00:23, 1.44it/s]
[2024-03-18 20:23:16] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 161/195 [01:59<00:23, 1.44it/s]
[2024-03-18 20:23:16] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 161/195 [01:59<00:23, 1.44it/s]
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 162/195 [01:59<00:20, 1.63it/s]
[2024-03-18 20:23:16] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.26.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 162/195 [01:59<00:20, 1.63it/s]
[2024-03-18 20:23:17] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 162/195 [02:01<00:20, 1.63it/s]
[2024-03-18 20:23:18] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
83%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 162/195 [02:01<00:20, 1.63it/s]
84%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 164/195 [02:01<00:24, 1.28it/s]
[2024-03-18 20:23:19] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
84%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 164/195 [02:02<00:24, 1.28it/s]
[2024-03-18 20:23:19] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.26.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
84%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 164/195 [02:02<00:24, 1.28it/s]
85%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 165/195 [02:02<00:23, 1.26it/s]
[2024-03-18 20:23:19] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.27.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
85%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 165/195 [02:02<00:23, 1.26it/s]
[2024-03-18 20:23:19] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
85%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 165/195 [02:03<00:23, 1.26it/s]
[2024-03-18 20:23:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
85%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 165/195 [02:03<00:23, 1.26it/s]
86%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 167/195 [02:03<00:18, 1.49it/s]
[2024-03-18 20:23:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
86%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 167/195 [02:04<00:18, 1.49it/s]
[2024-03-18 20:23:20] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
86%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 167/195 [02:04<00:18, 1.49it/s]
86%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 168/195 [02:04<00:15, 1.69it/s]
[2024-03-18 20:23:20] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.27.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
86%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 168/195 [02:04<00:15, 1.69it/s]
[2024-03-18 20:23:21] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
86%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 168/195 [02:05<00:15, 1.69it/s]
[2024-03-18 20:23:22] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
86%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 168/195 [02:05<00:15, 1.69it/s]
87%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 170/195 [02:05<00:17, 1.41it/s]
[2024-03-18 20:23:22] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
87%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 170/195 [02:06<00:17, 1.41it/s]
[2024-03-18 20:23:23] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.27.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
87%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 170/195 [02:06<00:17, 1.41it/s]
88%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 171/195 [02:06<00:17, 1.36it/s]
[2024-03-18 20:23:23] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.28.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
88%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 171/195 [02:06<00:17, 1.36it/s]
[2024-03-18 20:23:23] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
88%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 171/195 [02:07<00:17, 1.36it/s]
[2024-03-18 20:23:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
88%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 171/195 [02:07<00:17, 1.36it/s]
89%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 173/195 [02:07<00:14, 1.55it/s]
[2024-03-18 20:23:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
89%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 173/195 [02:07<00:14, 1.55it/s]
[2024-03-18 20:23:24] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
89%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 173/195 [02:07<00:14, 1.55it/s]
89%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 174/195 [02:08<00:12, 1.75it/s]
[2024-03-18 20:23:24] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.28.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
89%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 174/195 [02:08<00:12, 1.75it/s]
[2024-03-18 20:23:25] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
89%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 174/195 [02:09<00:12, 1.75it/s]
[2024-03-18 20:23:26] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
89%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 174/195 [02:09<00:12, 1.75it/s]
90%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 176/195 [02:09<00:13, 1.37it/s]
[2024-03-18 20:23:26] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
90%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 176/195 [02:10<00:13, 1.37it/s]
[2024-03-18 20:23:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.28.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
90%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 176/195 [02:10<00:13, 1.37it/s]
91%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 177/195 [02:10<00:13, 1.34it/s]
[2024-03-18 20:23:27] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.29.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
91%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 177/195 [02:10<00:13, 1.34it/s]
[2024-03-18 20:23:27] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
91%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 177/195 [02:11<00:13, 1.34it/s]
[2024-03-18 20:23:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
91%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 177/195 [02:11<00:13, 1.34it/s]
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 179/195 [02:11<00:10, 1.56it/s]
[2024-03-18 20:23:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 179/195 [02:11<00:10, 1.56it/s]
[2024-03-18 20:23:28] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 179/195 [02:12<00:10, 1.56it/s]
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 180/195 [02:12<00:08, 1.76it/s]
[2024-03-18 20:23:28] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.29.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 180/195 [02:12<00:08, 1.76it/s]
[2024-03-18 20:23:29] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 180/195 [02:13<00:08, 1.76it/s]
[2024-03-18 20:23:30] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
92%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 180/195 [02:13<00:08, 1.76it/s]
93%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 182/195 [02:13<00:09, 1.38it/s]
[2024-03-18 20:23:30] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
93%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 182/195 [02:14<00:09, 1.38it/s]
[2024-03-18 20:23:31] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.29.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
93%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 182/195 [02:14<00:09, 1.38it/s]
94%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 183/195 [02:14<00:08, 1.34it/s]
[2024-03-18 20:23:31] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.30.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
94%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 183/195 [02:14<00:08, 1.34it/s]
[2024-03-18 20:23:31] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
94%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 183/195 [02:15<00:08, 1.34it/s]
[2024-03-18 20:23:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
94%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 183/195 [02:15<00:08, 1.34it/s]
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 185/195 [02:15<00:06, 1.54it/s]
[2024-03-18 20:23:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 185/195 [02:16<00:06, 1.54it/s]
[2024-03-18 20:23:32] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 185/195 [02:16<00:06, 1.54it/s]
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 186/195 [02:16<00:05, 1.74it/s]
[2024-03-18 20:23:32] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.30.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 186/195 [02:16<00:05, 1.74it/s]
[2024-03-18 20:23:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 186/195 [02:17<00:05, 1.74it/s]
[2024-03-18 20:23:34] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
95%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 186/195 [02:18<00:05, 1.74it/s]
96%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 188/195 [02:18<00:05, 1.29it/s]
[2024-03-18 20:23:35] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
96%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 188/195 [02:18<00:05, 1.29it/s]
[2024-03-18 20:23:35] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.30.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
96%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 188/195 [02:19<00:05, 1.29it/s]
97%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 189/195 [02:19<00:04, 1.26it/s]
[2024-03-18 20:23:35] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.31.input_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
97%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 189/195 [02:19<00:04, 1.26it/s]
[2024-03-18 20:23:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.self_attn.qkv_proj.q_weight[0m", shape: (12288, 1024), dtype: uint32 |
|
97%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 189/195 [02:19<00:04, 1.26it/s]
[2024-03-18 20:23:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.self_attn.qkv_proj.q_scale[0m", shape: (12288, 128), dtype: float32 |
|
97%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 189/195 [02:20<00:04, 1.26it/s]
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 191/195 [02:20<00:02, 1.44it/s]
[2024-03-18 20:23:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.self_attn.o_proj.q_weight[0m", shape: (4096, 1024), dtype: uint32 |
|
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 191/195 [02:20<00:02, 1.44it/s]
[2024-03-18 20:23:36] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.self_attn.o_proj.q_scale[0m", shape: (4096, 128), dtype: float32 |
|
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 191/195 [02:20<00:02, 1.44it/s]
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 192/195 [02:20<00:01, 1.61it/s]
[2024-03-18 20:23:36] INFO huggingface_loader.py:172: [Not quantized] Parameter: "[1mmodel.layers.31.post_attention_layernorm.weight[0m", shape: (4096,), dtype: float32 |
|
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 192/195 [02:20<00:01, 1.61it/s]
[2024-03-18 20:23:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.mlp.gate_up_proj.q_weight[0m", shape: (22016, 1024), dtype: uint32 |
|
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 192/195 [02:21<00:01, 1.61it/s]
[2024-03-18 20:23:38] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.mlp.gate_up_proj.q_scale[0m", shape: (22016, 128), dtype: float32 |
|
98%|βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ | 192/195 [02:22<00:01, 1.61it/s]
99%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 194/195 [02:22<00:00, 1.33it/s]
[2024-03-18 20:23:39] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.mlp.down_proj.q_weight[0m", shape: (4096, 2752), dtype: uint32 |
|
99%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 194/195 [02:23<00:00, 1.33it/s]
[2024-03-18 20:23:39] INFO huggingface_loader.py:164: [Quantized] Parameter: "[1mmodel.layers.31.mlp.down_proj.q_scale[0m", shape: (4096, 344), dtype: float32 |
|
99%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 194/195 [02:23<00:00, 1.33it/s]
100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 195/195 [02:23<00:00, 1.30it/s]
100%|ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ| 195/195 [02:23<00:00, 1.36it/s] |
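
Note on the recurring shapes above: each quantized matmul weight is stored as a packed uint32 q_weight plus a float32 q_scale. The following is a minimal sketch of the shape arithmetic, assuming q8f32_1 packs four 8-bit weights into each uint32 storage word and keeps one float32 scale per group of 32 weights (the group size is inferred from the logged shapes, not printed by the tool):

    GROUP_SIZE = 32    # weights sharing one float32 scale (assumed from the shapes)
    PACK = 32 // 8     # four 8-bit weights per uint32 storage word

    def q8f32_1_shapes(out_features: int, in_features: int):
        """Shapes of (q_weight, q_scale) for one quantized projection."""
        return (out_features, in_features // PACK), (out_features, in_features // GROUP_SIZE)

    # hidden size 4096, intermediate size 11008 (Llama-7B):
    print(q8f32_1_shapes(12288, 4096))   # qkv_proj:     (12288, 1024), (12288, 128)
    print(q8f32_1_shapes(22016, 4096))   # gate_up_proj: (22016, 1024), (22016, 128)
    print(q8f32_1_shapes(4096, 11008))   # down_proj:    (4096, 2752),  (4096, 344)
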
[2024-03-18 20:23:39] INFO huggingface_loader.py:194: Unloading HF weight file: ../dist/models/XAgentLlama-7B-preview/pytorch_model.bin
[2024-03-18 20:23:40] INFO stats.py:76: Time usage: HF loading: 29.093 sec; Pre-quantization mapping: 29.604 sec; Quantization: 3.236 sec
[2024-03-18 20:23:40] INFO stats.py:90: RAM usage: Peak RAM: 12.552 GB. Total bytes loaded from disk: 12.552 GB
[2024-03-18 20:23:40] INFO convert_weight.py:156: Parameter size after quantization: 7.061 GB
[2024-03-18 20:23:40] INFO convert_weight.py:161: Total parameters: 6,738,546,688
[2024-03-18 20:23:40] INFO convert_weight.py:162: Bits per parameter: 9.001
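
The 9.001 bits per parameter reported above is consistent with this layout: 8 bits per weight plus one 32-bit float scale amortized over each 32-weight group gives 8 + 32/32 = 9 bits, and the unquantized float32 layernorm weights account for the small excess. A quick arithmetic check, assuming the GB figures above are GiB (2^30 bytes):

    # Re-deriving "Bits per parameter: 9.001" from the two numbers above.
    param_bytes = 7.061 * 2**30           # "Parameter size after quantization" (assumed GiB)
    total_params = 6_738_546_688          # "Total parameters"
    print(param_bytes * 8 / total_params) # -> ~9.001
    print(8 + 32 / 32)                    # ideal q8f32_1 rate: 9 bits per weight
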
[2024-03-18 20:23:40] INFO convert_weight.py:167: Saved to directory: /tmp/tmpyuo2_jf8

All finished, 132 total shards committed, record saved to /tmp/tmpyuo2_jf8/ndarray-cache.json |
Also saved a bf16 record to /tmp/tmpyuo2_jf8/ndarray-cache-b16.json |
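
The shard record written above can be inspected to confirm these totals. A minimal sketch, assuming ndarray-cache.json follows MLC LLM's usual layout (a top-level "records" list with one entry per shard file); the field names here are assumptions, not taken from this log:

    import json

    # Hypothetical inspection of the shard record; field names are assumed.
    with open("/tmp/tmpyuo2_jf8/ndarray-cache.json") as f:
        cache = json.load(f)

    shards = cache["records"]
    print("shard files:", len(shards))  # expect 132, matching the line above
    print("parameters:", sum(len(s["records"]) for s in shards))
    print("total bytes:", sum(s["nbytes"] for s in shards))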
|