runtime error
Exit code: 1. Reason:
100%|█████████████████████████████████████| 2.88G/2.88G [00:36<00:00, 84.0MiB/s]
Removing meta-llama/Llama-3.2-1B-Instruct from LLM options since it cannot be loaded.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 515, in <module>
    start_warmup()
  File "/home/user/app/app.py", line 296, in start_warmup
    opt = LLM_options[opt_count]
IndexError: list index out of range
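The traceback suggests a stale-index bug: the app removes an unloadable model from LLM_options, but the warmup loop still indexes the list with a counter based on its original length. The sketch below is a minimal, hypothetical reproduction and fix; only the names LLM_options and start_warmup come from the traceback, and the loading logic is simulated.

```python
# Hypothetical reconstruction of the failure mode seen in the traceback:
# an entry is removed from LLM_options, but a counter-based warmup loop
# still assumes the list's original length, so indexing runs off the end.
def start_warmup_buggy(options):
    opt_count = 0
    original_len = len(options)  # stale length captured before removals
    while opt_count < original_len:
        opt = options[opt_count]  # IndexError once an option was removed
        opt_count += 1

# Safer pattern: iterate over a snapshot of the list and drop options
# that fail to load without disturbing the iteration itself.
def start_warmup_safe(options):
    warmed = []
    for opt in list(options):  # snapshot so removal is safe mid-loop
        try:
            # real model-loading call would go here; we simulate one failure
            if opt == "model-a":
                raise RuntimeError("cannot be loaded")
            warmed.append(opt)
        except RuntimeError:
            options.remove(opt)  # exclude the broken option going forward
    return warmed
```

With this pattern, removing an option during warmup shrinks the list without invalidating any index the loop depends on.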