# patchtst-tsmixup-two-layer
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.1497
- MSE: 258.6847
- MAE: 0.6232
- RMSE: 16.0837
- sMAPE: 70.2567
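Given the model name, the checkpoint is presumably loadable with the `PatchTSTForPrediction` class from `transformers`; the sketch below assumes this, and the repo path is a placeholder, not the actual Hub id.

```python
import torch
from transformers import PatchTSTForPrediction

# Placeholder repo path; substitute the actual Hub id of this checkpoint.
model = PatchTSTForPrediction.from_pretrained("path/to/patchtst-tsmixup-two-layer")
model.eval()

# PatchTST consumes a context window of past values with shape
# (batch_size, context_length, num_input_channels).
past_values = torch.randn(
    1, model.config.context_length, model.config.num_input_channels
)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Point forecasts: (batch_size, prediction_length, num_input_channels)
print(outputs.prediction_outputs.shape)
```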
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 448
- eval_batch_size: 896
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 896
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
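These settings give an effective training batch of 448 × 2 = 896 sequences per optimizer step, matching the reported total_train_batch_size. A minimal sketch reconstructing the list as `TrainingArguments` (the `output_dir` and any settings not listed above are assumptions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="patchtst-tsmixup-two-layer",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=448,
    per_device_eval_batch_size=896,
    gradient_accumulation_steps=2,  # effective train batch: 448 * 2 = 896
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
)
```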
### Training results
| Training Loss | Epoch | Step | Validation Loss | MSE | MAE | RMSE | sMAPE |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
0.1786 | 0.1666 | 1000 | 0.1714 | 461.4028 | 0.7236 | 21.4803 | 100.4842 |
0.1661 | 0.3333 | 2000 | 0.1660 | 413.4870 | 0.7017 | 20.3344 | 76.6379 |
0.1675 | 0.4999 | 3000 | 0.1632 | 444.1673 | 0.6908 | 21.0753 | 144.1648 |
0.1641 | 0.6666 | 4000 | 0.1617 | 368.5459 | 0.6717 | 19.1976 | 187.8992 |
0.1631 | 0.8332 | 5000 | 0.1598 | 329.2304 | 0.6614 | 18.1447 | 77.3954 |
0.163 | 0.9998 | 6000 | 0.1594 | 376.1231 | 0.6678 | 19.3939 | 2604.0836 |
0.1631 | 1.1665 | 7000 | 0.1582 | 396.0430 | 0.6601 | 19.9008 | 79.8599 |
0.1604 | 1.3331 | 8000 | 0.1572 | 331.4156 | 0.6539 | 18.2048 | 89.4498 |
0.1552 | 1.4998 | 9000 | 0.1571 | 272.9918 | 0.6467 | 16.5225 | 88.8514 |
0.1569 | 1.6664 | 10000 | 0.1568 | 301.4679 | 0.6509 | 17.3628 | 88.6194 |
0.1613 | 1.8330 | 11000 | 0.1568 | 287.0067 | 0.6522 | 16.9413 | 137.6847 |
0.1564 | 1.9997 | 12000 | 0.1558 | 321.1451 | 0.6502 | 17.9205 | 126.8916 |
0.1572 | 2.1663 | 13000 | 0.1554 | 304.7481 | 0.6503 | 17.4570 | 190.2351 |
0.1565 | 2.3329 | 14000 | 0.1556 | 276.5483 | 0.6449 | 16.6297 | 103.9599 |
0.1571 | 2.4996 | 15000 | 0.1546 | 303.2998 | 0.6458 | 17.4155 | 144.8287 |
0.156 | 2.6662 | 16000 | 0.1544 | 278.1052 | 0.6376 | 16.6765 | 78.1067 |
0.1553 | 2.8329 | 17000 | 0.1543 | 274.6881 | 0.6390 | 16.5737 | 81.9482 |
0.1542 | 2.9995 | 18000 | 0.1547 | 239.9231 | 0.6360 | 15.4895 | 83.6961 |
0.1549 | 3.1661 | 19000 | 0.1541 | 277.2799 | 0.6419 | 16.6517 | 375.4867 |
0.1542 | 3.3328 | 20000 | 0.1542 | 275.0111 | 0.6375 | 16.5835 | 86.7323 |
0.1572 | 3.4994 | 21000 | 0.1536 | 264.1418 | 0.6371 | 16.2524 | 111.1521 |
0.1559 | 3.6661 | 22000 | 0.1539 | 271.3185 | 0.6393 | 16.4717 | 83.0078 |
0.154 | 3.8327 | 23000 | 0.1533 | 253.9782 | 0.6338 | 15.9367 | 96.5289 |
0.1542 | 3.9993 | 24000 | 0.1532 | 267.4779 | 0.6425 | 16.3548 | 68.1349 |
0.1534 | 4.1660 | 25000 | 0.1532 | 262.3679 | 0.6358 | 16.1978 | 83.8336 |
0.1533 | 4.3326 | 26000 | 0.1528 | 317.2105 | 0.6429 | 17.8104 | 85.2472 |
0.1556 | 4.4993 | 27000 | 0.1528 | 266.3440 | 0.6333 | 16.3200 | 116.4548 |
0.1537 | 4.6659 | 28000 | 0.1527 | 259.0167 | 0.6342 | 16.0940 | 91.6244 |
0.1541 | 4.8325 | 29000 | 0.1524 | 281.7036 | 0.6396 | 16.7840 | 75.8411 |
0.1527 | 4.9992 | 30000 | 0.1523 | 304.5508 | 0.6393 | 17.4514 | 86.2233 |
0.1522 | 5.1658 | 31000 | 0.1522 | 261.6904 | 0.6314 | 16.1768 | 77.8058 |
0.1538 | 5.3324 | 32000 | 0.1522 | 284.4175 | 0.6336 | 16.8647 | 97.7625 |
0.1537 | 5.4991 | 33000 | 0.1520 | 309.5190 | 0.6375 | 17.5932 | 134.9614 |
0.1516 | 5.6657 | 34000 | 0.1519 | 252.8892 | 0.6305 | 15.9025 | 119.1109 |
0.1536 | 5.8324 | 35000 | 0.1520 | 293.8005 | 0.6377 | 17.1406 | 84.0213 |
0.1528 | 5.9990 | 36000 | 0.1515 | 291.5611 | 0.6328 | 17.0752 | 353.8475 |
0.1515 | 6.1656 | 37000 | 0.1518 | 254.8325 | 0.6315 | 15.9635 | 85.4300 |
0.1529 | 6.3323 | 38000 | 0.1513 | 254.4357 | 0.6292 | 15.9510 | 112.0245 |
0.1526 | 6.4989 | 39000 | 0.1516 | 265.3687 | 0.6320 | 16.2901 | 86.3409 |
0.1526 | 6.6656 | 40000 | 0.1512 | 254.5356 | 0.6289 | 15.9542 | 87.9753 |
0.1518 | 6.8322 | 41000 | 0.1511 | 233.5401 | 0.6244 | 15.2820 | 122.1964 |
0.1518 | 6.9988 | 42000 | 0.1512 | 249.4746 | 0.6250 | 15.7948 | 86.1394 |
0.1501 | 7.1655 | 43000 | 0.1512 | 285.0164 | 0.6310 | 16.8824 | 94.1700 |
0.1515 | 7.3321 | 44000 | 0.1509 | 266.2695 | 0.6274 | 16.3178 | 96.5646 |
0.1523 | 7.4988 | 45000 | 0.1507 | 256.3644 | 0.6250 | 16.0114 | 223.2191 |
0.1524 | 7.6654 | 46000 | 0.1508 | 269.4569 | 0.6292 | 16.4151 | 88.9256 |
0.1513 | 7.8320 | 47000 | 0.1507 | 247.6273 | 0.6247 | 15.7362 | 77.1014 |
0.1517 | 7.9987 | 48000 | 0.1505 | 251.7547 | 0.6256 | 15.8668 | 83.9576 |
0.1512 | 8.1653 | 49000 | 0.1504 | 245.7318 | 0.6245 | 15.6758 | 79.3953 |
0.1502 | 8.3319 | 50000 | 0.1502 | 275.0424 | 0.6281 | 16.5844 | 90.1375 |
0.1523 | 8.4986 | 51000 | 0.1501 | 252.4250 | 0.6235 | 15.8879 | 89.7133 |
0.1521 | 8.6652 | 52000 | 0.1502 | 247.0445 | 0.6230 | 15.7176 | 78.6296 |
0.1529 | 8.8319 | 53000 | 0.1502 | 258.8629 | 0.6248 | 16.0892 | nan |
0.1511 | 8.9985 | 54000 | 0.1501 | 274.1158 | 0.6279 | 16.5564 | 90.8373 |
0.1489 | 9.1651 | 55000 | 0.1499 | 264.4435 | 0.6254 | 16.2617 | 80.3961 |
0.1511 | 9.3318 | 56000 | 0.1500 | 267.5066 | 0.6259 | 16.3556 | 76.1822 |
0.1536 | 9.4984 | 57000 | 0.1499 | 257.8295 | 0.6236 | 16.0571 | 107.1616 |
0.1511 | 9.6651 | 58000 | 0.1497 | 265.9769 | 0.6247 | 16.3088 | 66.7921 |
0.1521 | 9.8317 | 59000 | 0.1497 | 261.9222 | 0.6244 | 16.1840 | 79.6868 |
0.1471 | 9.9983 | 60000 | 0.1497 | 258.6847 | 0.6232 | 16.0837 | 70.2567 |
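The metric columns follow their standard point-forecast definitions; the sketch below is an assumption about how they were computed, not extracted from the actual evaluation code, and it also shows why the sMAPE column can spike (or go `nan`, as at step 53000) when the target series hovers near zero.

```python
import numpy as np

def forecast_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Standard point-forecast metrics; an assumed reconstruction,
    not the evaluation code behind this card."""
    err = y_pred - y_true
    mse = float(np.mean(err**2))
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(mse))
    # sMAPE in percent. When |y_true| + |y_pred| approaches zero the
    # denominator vanishes, producing very large values or nan.
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    smape = float(100.0 * np.mean(np.abs(err) / denom))
    return {"mse": mse, "mae": mae, "rmse": rmse, "smape": smape}
```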
### Framework versions
- Transformers 4.51.3
- PyTorch 2.7.1+cu126
- Datasets 2.17.1
- Tokenizers 0.21.1