# patchtst-tsmixup
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2566
- MSE: 4643.9019
- MAE: 1.1228
- RMSE: 68.1462
- sMAPE: nan
## Model description
More information needed
Intended uses & limitations
More information needed
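Pending fuller documentation, the sketch below shows one plausible way to load this checkpoint for forecasting with the `transformers` PatchTST classes. The repository id, context length, and channel count are placeholders, not values read from the actual config.

```python
import torch
from transformers import PatchTSTForPrediction

# "your-username/patchtst-tsmixup" is a placeholder repo id (assumption).
model = PatchTSTForPrediction.from_pretrained("your-username/patchtst-tsmixup")
model.eval()

# Shape is (batch, context_length, num_input_channels); the 512 / 1 values
# here are assumptions and must match the checkpoint's config.
past_values = torch.randn(1, 512, 1)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Point forecasts of shape (batch, prediction_length, num_input_channels).
forecast = outputs.prediction_outputs
```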
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 256
- eval_batch_size: 512
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 512
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
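For reference, the list above maps directly onto `transformers.TrainingArguments`; the sketch below is a hedged reconstruction, not the exact training script (the `output_dir` is a placeholder, and the model/dataset wiring is omitted):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="patchtst-tsmixup",   # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=512,
    gradient_accumulation_steps=2,   # effective train batch size: 256 * 2 = 512
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
)
```

Note that `per_device_train_batch_size=256` assumes single-device training, since the reported total train batch size of 512 is accounted for by the two gradient-accumulation steps alone.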
### Training results
Training Loss | Epoch | Step | Validation Loss | MSE | MAE | RMSE | sMAPE |
---|---|---|---|---|---|---|---|
0.3084 | 0.0980 | 1000 | 0.3024 | 5346.5903 | 1.2307 | 73.1204 | 153.0007 |
0.2929 | 0.1960 | 2000 | 0.2897 | 5068.2759 | 1.1988 | 71.1918 | nan |
0.2907 | 0.2940 | 3000 | 0.2848 | 4958.8062 | 1.1856 | 70.4188 | nan |
0.2839 | 0.3920 | 4000 | 0.2819 | 4832.9458 | 1.1795 | 69.5194 | nan |
0.2839 | 0.4901 | 5000 | 0.2792 | 4758.4775 | 1.1657 | 68.9817 | 173.3474 |
0.2799 | 0.5881 | 6000 | 0.2773 | 4732.7681 | 1.1655 | 68.7951 | 154.0682 |
0.2812 | 0.6861 | 7000 | 0.2756 | 4791.2764 | 1.1631 | 69.2190 | inf |
0.2762 | 0.7841 | 8000 | 0.2742 | 4711.9121 | 1.1609 | 68.6434 | 150.8088 |
0.2775 | 0.8821 | 9000 | 0.2745 | 4661.4844 | 1.1561 | 68.2751 | nan |
0.2739 | 0.9801 | 10000 | 0.2726 | 4697.7852 | 1.1541 | 68.5404 | nan |
0.2763 | 1.0781 | 11000 | 0.2718 | 4672.8022 | 1.1511 | 68.3579 | nan |
0.2736 | 1.1761 | 12000 | 0.2712 | 4646.1968 | 1.1528 | 68.1630 | 144.7422 |
0.2727 | 1.2741 | 13000 | 0.2699 | 4704.0283 | 1.1525 | 68.5859 | nan |
0.2705 | 1.3721 | 14000 | 0.2694 | 4732.0859 | 1.1485 | 68.7902 | nan |
0.2718 | 1.4702 | 15000 | 0.2691 | 4672.9360 | 1.1495 | 68.3589 | nan |
0.2714 | 1.5682 | 16000 | 0.2687 | 4653.2021 | 1.1452 | 68.2144 | nan |
0.2721 | 1.6662 | 17000 | 0.2677 | 4653.2695 | 1.1435 | 68.2149 | 263.4695 |
0.2694 | 1.7642 | 18000 | 0.2687 | 4744.7812 | 1.1485 | 68.8824 | nan |
0.2707 | 1.8622 | 19000 | 0.2680 | 4563.2783 | 1.1418 | 67.5520 | nan |
0.2701 | 1.9602 | 20000 | 0.2670 | 4582.1411 | 1.1379 | 67.6915 | nan |
0.2687 | 2.0582 | 21000 | 0.2666 | 4580.3374 | 1.1404 | 67.6782 | 203.8995 |
0.2698 | 2.1562 | 22000 | 0.2663 | 4663.5762 | 1.1435 | 68.2904 | nan |
0.2684 | 2.2542 | 23000 | 0.2663 | 4542.9971 | 1.1381 | 67.4018 | 160.5439 |
0.2692 | 2.3522 | 24000 | 0.2656 | 4628.6294 | 1.1399 | 68.0340 | 143.5725 |
0.2692 | 2.4503 | 25000 | 0.2654 | 4611.9399 | 1.1401 | 67.9113 | nan |
0.2674 | 2.5483 | 26000 | 0.2650 | 4631.6226 | 1.1373 | 68.0560 | nan |
0.2676 | 2.6463 | 27000 | 0.2652 | 4656.6133 | 1.1404 | 68.2394 | 261.2106 |
0.2701 | 2.7443 | 28000 | 0.2645 | 4583.7627 | 1.1386 | 67.7035 | nan |
0.2676 | 2.8423 | 29000 | 0.2645 | 4596.8721 | 1.1331 | 67.8002 | 147.3792 |
0.2671 | 2.9403 | 30000 | 0.2642 | 4660.9785 | 1.1388 | 68.2714 | nan |
0.2666 | 3.0383 | 31000 | 0.2646 | 4729.0498 | 1.1405 | 68.7681 | nan |
0.267 | 3.1363 | 32000 | 0.2638 | 4698.4321 | 1.1409 | 68.5451 | nan |
0.2663 | 3.2343 | 33000 | 0.2641 | 4621.0 | 1.1369 | 67.9779 | nan |
0.2642 | 3.3324 | 34000 | 0.2632 | 4717.3887 | 1.1363 | 68.6832 | nan |
0.2686 | 3.4304 | 35000 | 0.2633 | 4576.2842 | 1.1339 | 67.6482 | nan |
0.2639 | 3.5284 | 36000 | 0.2635 | 4640.9990 | 1.1401 | 68.1249 | 145.8278 |
0.2668 | 3.6264 | 37000 | 0.2631 | 4652.9268 | 1.1371 | 68.2124 | nan |
0.2685 | 3.7244 | 38000 | 0.2627 | 4674.4717 | 1.1338 | 68.3701 | nan |
0.2651 | 3.8224 | 39000 | 0.2626 | 4667.7871 | 1.1355 | 68.3212 | nan |
0.2637 | 3.9204 | 40000 | 0.2626 | 4603.3242 | 1.1337 | 67.8478 | 128.9741 |
0.2655 | 4.0184 | 41000 | 0.2623 | 4671.7549 | 1.1358 | 68.3502 | nan |
0.2657 | 4.1164 | 42000 | 0.2621 | 4639.7461 | 1.1318 | 68.1157 | nan |
0.2637 | 4.2144 | 43000 | 0.2619 | 4658.6704 | 1.1330 | 68.2545 | nan |
0.2649 | 4.3125 | 44000 | 0.2621 | 4717.4502 | 1.1337 | 68.6837 | nan |
0.2651 | 4.4105 | 45000 | 0.2617 | 4616.3667 | 1.1287 | 67.9438 | nan |
0.2648 | 4.5085 | 46000 | 0.2615 | 4641.1528 | 1.1327 | 68.1260 | 164.7345 |
0.2646 | 4.6065 | 47000 | 0.2616 | 4634.9507 | 1.1343 | 68.0805 | nan |
0.2619 | 4.7045 | 48000 | 0.2612 | 4717.5820 | 1.1327 | 68.6847 | nan |
0.2641 | 4.8025 | 49000 | 0.2612 | 4671.6055 | 1.1355 | 68.3491 | nan |
0.264 | 4.9005 | 50000 | 0.2613 | 4625.3916 | 1.1307 | 68.0102 | nan |
0.264 | 4.9985 | 51000 | 0.2607 | 4600.4443 | 1.1309 | 67.8266 | nan |
0.2646 | 5.0965 | 52000 | 0.2607 | 4653.5298 | 1.1327 | 68.2168 | nan |
0.2643 | 5.1946 | 53000 | 0.2604 | 4582.6050 | 1.1316 | 67.6949 | nan |
0.263 | 5.2926 | 54000 | 0.2607 | 4624.2041 | 1.1305 | 68.0015 | nan |
0.264 | 5.3906 | 55000 | 0.2604 | 4654.0234 | 1.1305 | 68.2204 | nan |
0.2621 | 5.4886 | 56000 | 0.2601 | 4626.9565 | 1.1290 | 68.0217 | nan |
0.2641 | 5.5866 | 57000 | 0.2604 | 4636.6865 | 1.1318 | 68.0932 | nan |
0.2649 | 5.6846 | 58000 | 0.2600 | 4662.0747 | 1.1305 | 68.2794 | nan |
0.2637 | 5.7826 | 59000 | 0.2599 | 4631.6445 | 1.1295 | 68.0562 | nan |
0.2632 | 5.8806 | 60000 | 0.2598 | 4632.5400 | 1.1311 | 68.0628 | 146.8735 |
0.263 | 5.9786 | 61000 | 0.2596 | 4634.8896 | 1.1297 | 68.0800 | nan |
0.2626 | 6.0766 | 62000 | 0.2598 | 4677.4688 | 1.1307 | 68.3920 | nan |
0.2623 | 6.1747 | 63000 | 0.2596 | 4674.2075 | 1.1313 | 68.3682 | nan |
0.2646 | 6.2727 | 64000 | 0.2595 | 4665.5918 | 1.1310 | 68.3051 | nan |
0.2631 | 6.3707 | 65000 | 0.2593 | 4672.3618 | 1.1285 | 68.3547 | nan |
0.2623 | 6.4687 | 66000 | 0.2593 | 4666.8711 | 1.1299 | 68.3145 | nan |
0.2636 | 6.5667 | 67000 | 0.2594 | 4603.8647 | 1.1279 | 67.8518 | nan |
0.262 | 6.6647 | 68000 | 0.2590 | 4614.9053 | 1.1276 | 67.9331 | nan |
0.2616 | 6.7627 | 69000 | 0.2591 | 4621.3652 | 1.1286 | 67.9806 | nan |
0.2623 | 6.8607 | 70000 | 0.2587 | 4653.7485 | 1.1297 | 68.2184 | nan |
0.2606 | 6.9587 | 71000 | 0.2588 | 4616.5127 | 1.1265 | 67.9449 | nan |
0.2625 | 7.0567 | 72000 | 0.2588 | 4605.3052 | 1.1267 | 67.8624 | nan |
0.2616 | 7.1548 | 73000 | 0.2586 | 4632.1304 | 1.1258 | 68.0598 | nan |
0.261 | 7.2528 | 74000 | 0.2584 | 4635.8457 | 1.1264 | 68.0870 | nan |
0.2624 | 7.3508 | 75000 | 0.2583 | 4700.9951 | 1.1278 | 68.5638 | nan |
0.2604 | 7.4488 | 76000 | 0.2583 | 4718.2422 | 1.1279 | 68.6895 | nan |
0.2611 | 7.5468 | 77000 | 0.2583 | 4681.1753 | 1.1288 | 68.4191 | nan |
0.2603 | 7.6448 | 78000 | 0.2581 | 4658.5015 | 1.1270 | 68.2532 | nan |
0.263 | 7.7428 | 79000 | 0.2581 | 4655.0098 | 1.1287 | 68.2276 | nan |
0.2593 | 7.8408 | 80000 | 0.2579 | 4674.6558 | 1.1277 | 68.3715 | nan |
0.2616 | 7.9388 | 81000 | 0.2580 | 4662.5923 | 1.1278 | 68.2832 | nan |
0.2623 | 8.0369 | 82000 | 0.2578 | 4622.4644 | 1.1260 | 67.9887 | nan |
0.2599 | 8.1349 | 83000 | 0.2578 | 4693.6475 | 1.1265 | 68.5102 | nan |
0.2606 | 8.2329 | 84000 | 0.2578 | 4664.7837 | 1.1271 | 68.2992 | nan |
0.2604 | 8.3309 | 85000 | 0.2576 | 4674.7075 | 1.1273 | 68.3718 | nan |
0.2605 | 8.4289 | 86000 | 0.2576 | 4615.8818 | 1.1244 | 67.9403 | nan |
0.2598 | 8.5269 | 87000 | 0.2575 | 4681.8462 | 1.1254 | 68.4240 | nan |
0.2606 | 8.6249 | 88000 | 0.2575 | 4662.5674 | 1.1251 | 68.2830 | nan |
0.2602 | 8.7229 | 89000 | 0.2573 | 4656.4746 | 1.1238 | 68.2384 | nan |
0.2588 | 8.8209 | 90000 | 0.2573 | 4647.8911 | 1.1249 | 68.1754 | nan |
0.2617 | 8.9189 | 91000 | 0.2573 | 4655.3872 | 1.1245 | 68.2304 | nan |
0.2617 | 9.0170 | 92000 | 0.2573 | 4648.3506 | 1.1258 | 68.1788 | nan |
0.2601 | 9.1150 | 93000 | 0.2571 | 4630.1865 | 1.1232 | 68.0455 | nan |
0.2598 | 9.2130 | 94000 | 0.2570 | 4677.0566 | 1.1245 | 68.3890 | nan |
0.2609 | 9.3110 | 95000 | 0.2570 | 4628.4395 | 1.1244 | 68.0326 | nan |
0.2596 | 9.4090 | 96000 | 0.2569 | 4638.3027 | 1.1245 | 68.1051 | nan |
0.2588 | 9.5070 | 97000 | 0.2568 | 4655.1597 | 1.1249 | 68.2287 | nan |
0.2613 | 9.6050 | 98000 | 0.2567 | 4639.6914 | 1.1242 | 68.1153 | nan |
0.2593 | 9.7030 | 99000 | 0.2568 | 4654.0952 | 1.1250 | 68.2209 | nan |
0.2593 | 9.8010 | 100000 | 0.2567 | 4644.1465 | 1.1229 | 68.1480 | 165.2788 |
0.261 | 9.8990 | 101000 | 0.2567 | 4641.4990 | 1.1234 | 68.1285 | nan |
0.2587 | 9.9971 | 102000 | 0.2566 | 4643.9019 | 1.1228 | 68.1462 | nan |
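The evaluation metrics follow their standard definitions; the NumPy sketch below is illustrative only, not the exact evaluation code used during training. It also shows why sMAPE can come out as nan or inf, as in many rows above: the per-point denominator vanishes when target and forecast are both zero.

```python
import numpy as np

def smape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Symmetric MAPE in percent. Where |y_true| + |y_pred| == 0 the
    # division yields nan (0/0) or inf, matching the table entries.
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return float(100.0 * np.mean(np.abs(y_pred - y_true) / denom))

def compute_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    err = y_pred - y_true
    mse = float(np.mean(err ** 2))
    return {
        "mse": mse,
        "mae": float(np.mean(np.abs(err))),
        # Consistent with the table: sqrt(4643.9019) ≈ 68.1462.
        "rmse": float(np.sqrt(mse)),
        "smape": smape(y_true, y_pred),
    }
```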
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.1+cu126
- Datasets 2.17.1
- Tokenizers 0.21.1