runtime error

Exit code: 1. Reason:
l_call>\n{"name": "' }} {{- tool_call.name }} {{- '", "arguments": ' }} {{- tool_call.arguments | tojson }} {{- '}\n</tool_call>' }} {%- endfor %} {{- '<|im_end|>\n' }} {%- elif message.role == "tool" %} {%- if (loop.index0 == 0) or (messages[loop.index0 - 1].role != "tool") %} {{- '<|im_start|>user' }} {%- endif %} {{- '\n<tool_response>\n' }} {{- message.content }} {{- '\n</tool_response>' }} {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %} {{- '<|im_end|>\n' }} {%- endif %} {%- endif %} {%- endfor %} {%- if add_generation_prompt %} {{- '<|im_start|>assistant\n' }} {%- endif %}
INFO:gguf.gguf_writer:Writing the following files:
INFO:gguf.gguf_writer:OpenThinker-7B-Unverified.gguf: n_tensors = 339, total_size = 15.2G
Writing:   0%|          | 0.00/15.2G [00:00<?, ?byte/s]
Writing:   7%|▋         | 1.09G/15.2G [00:04<00:52, 271Mbyte/s]
Writing:  10%|▉         | 1.50G/15.2G [00:05<00:48, 286Mbyte/s]
Writing:  13%|█▎        | 1.96G/15.2G [00:06<00:43, 305Mbyte/s]
Writing:  16%|█▌        | 2.43G/15.2G [00:07<00:40, 319Mbyte/s]
Writing:  19%|█▉        | 2.90G/15.2G [00:09<00:38, 320Mbyte/s]
Traceback (most recent call last):
  File "/home/user/app/llama.cpp/convert_hf_to_gguf.py", line 6237, in <module>
    main()
  File "/home/user/app/llama.cpp/convert_hf_to_gguf.py", line 6231, in main
    model_instance.write()
  File "/home/user/app/llama.cpp/convert_hf_to_gguf.py", line 406, in write
    self.gguf_writer.write_tensors_to_file(progress=True)
  File "/home/user/app/llama.cpp/gguf-py/gguf/gguf_writer.py", line 454, in write_tensors_to_file
    ti.tensor.tofile(fout)
  File "/home/user/app/llama.cpp/gguf-py/gguf/lazy.py", line 221, in tofile
    return eager.tofile(*args, **kwargs)
OSError: Not enough free space to write 25690112 bytes
Writing:  19%|█▉        | 2.90G/15.2G [00:09<00:41, 300Mbyte/s]
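The write aborts at about 2.9 G of a 15.2 G output file, so the container simply runs out of disk while gguf_writer streams tensors to the .gguf. A minimal pre-flight sketch, not part of convert_hf_to_gguf.py, could check free space before starting the conversion; OUTPUT_DIR and REQUIRED_BYTES below are assumed values based on the log above:

import shutil

# Hypothetical pre-flight check (not from llama.cpp): confirm the target
# filesystem has room for the whole GGUF file before writing starts.
OUTPUT_DIR = "/home/user/app"          # assumed destination directory for the .gguf
REQUIRED_BYTES = 16 * 1024**3          # headroom over the 15.2G total_size reported above

free = shutil.disk_usage(OUTPUT_DIR).free
if free < REQUIRED_BYTES:
    raise OSError(
        f"Need about {REQUIRED_BYTES / 1024**3:.1f} GiB free in {OUTPUT_DIR}, "
        f"but only {free / 1024**3:.1f} GiB is available."
    )

Running a check like this before invoking the converter turns a mid-write failure into an immediate, clearer error; the underlying fix is still to free disk space or convert on a larger volume.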
