3d2smiles_finetune

This model is a fine-tuned version of weathon/3d2smiles_pretrain (itself derived from microsoft/git-base) on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4533
  • Accuracy: 0.3446

Model description

More information needed

Intended uses & limitations

More information needed
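
The card does not yet document usage, but the checkpoint is a GIT-style image-text-to-text model whose name suggests it transcribes rendered 3-D molecule images into SMILES strings. Below is a minimal inference sketch, assuming the checkpoint ships the usual processor and that molecule.png is such a rendering (both the file name and the generation settings are illustrative assumptions, not from the card):

```python
from PIL import Image
import torch
from transformers import AutoProcessor, AutoModelForCausalLM

# "molecule.png" is a placeholder for a rendered 3-D molecule image.
processor = AutoProcessor.from_pretrained("weathon/3d2smiles_finetune")
model = AutoModelForCausalLM.from_pretrained("weathon/3d2smiles_finetune")
model.eval()

image = Image.open("molecule.png").convert("RGB")
pixel_values = processor(images=image, return_tensors="pt").pixel_values

with torch.no_grad():
    generated_ids = model.generate(pixel_values=pixel_values, max_length=128)

# Decode the generated token ids into a SMILES string.
smiles = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(smiles)
```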

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
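
For reference, the hyperparameters above map onto a TrainingArguments object roughly as follows; output_dir and the evaluation/save cadence are assumptions, since the card does not list them:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="3d2smiles_finetune",  # assumed, not stated in the card
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed-precision training
)
```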

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.1727 | 0.1887 | 10 | 1.8958 | 0.0225 |
| 0.6846 | 0.3774 | 20 | 1.0548 | 0.0674 |
| 0.7281 | 0.5660 | 30 | 0.8115 | 0.0899 |
| 0.4237 | 0.7547 | 40 | 0.6916 | 0.1199 |
| 0.382 | 0.9434 | 50 | 0.6418 | 0.1011 |
| 0.2166 | 1.1321 | 60 | 0.5568 | 0.1161 |
| 0.4252 | 1.3208 | 70 | 0.5942 | 0.1161 |
| 0.2349 | 1.5094 | 80 | 0.5469 | 0.1461 |
| 0.3193 | 1.6981 | 90 | 0.5241 | 0.1536 |
| 0.2882 | 1.8868 | 100 | 0.5122 | 0.1910 |
| 0.1594 | 2.0755 | 110 | 0.4991 | 0.2022 |
| 0.2488 | 2.2642 | 120 | 0.5354 | 0.2135 |
| 0.2995 | 2.4528 | 130 | 0.5288 | 0.1985 |
| 0.2753 | 2.6415 | 140 | 0.4949 | 0.2022 |
| 0.1867 | 2.8302 | 150 | 0.4521 | 0.2659 |
| 0.2008 | 3.0189 | 160 | 0.4660 | 0.2659 |
| 0.1039 | 3.2075 | 170 | 0.4611 | 0.2772 |
| 0.1859 | 3.3962 | 180 | 0.4913 | 0.2434 |
| 0.1246 | 3.5849 | 190 | 0.4688 | 0.2809 |
| 0.1892 | 3.7736 | 200 | 0.4414 | 0.2659 |
| 0.1974 | 3.9623 | 210 | 0.4506 | 0.2846 |
| 0.1104 | 4.1509 | 220 | 0.4447 | 0.2809 |
| 0.2212 | 4.3396 | 230 | 0.4230 | 0.2809 |
| 0.0833 | 4.5283 | 240 | 0.4459 | 0.2921 |
| 0.1657 | 4.7170 | 250 | 0.4493 | 0.3109 |
| 0.0871 | 4.9057 | 260 | 0.4442 | 0.3258 |
| 0.0645 | 5.0943 | 270 | 0.4282 | 0.3483 |
| 0.1196 | 5.2830 | 280 | 0.4619 | 0.3221 |
| 0.1213 | 5.4717 | 290 | 0.4466 | 0.3558 |
| 0.0972 | 5.6604 | 300 | 0.4357 | 0.3446 |
| 0.0697 | 5.8491 | 310 | 0.4464 | 0.3184 |
| 0.1113 | 6.0377 | 320 | 0.4428 | 0.3221 |
| 0.0646 | 6.2264 | 330 | 0.4488 | 0.3184 |
| 0.0747 | 6.4151 | 340 | 0.4388 | 0.3296 |
| 0.098 | 6.6038 | 350 | 0.4611 | 0.3109 |
| 0.0917 | 6.7925 | 360 | 0.4242 | 0.3483 |
| 0.0803 | 6.9811 | 370 | 0.4088 | 0.3521 |
| 0.0447 | 7.1698 | 380 | 0.4297 | 0.3371 |
| 0.1359 | 7.3585 | 390 | 0.4494 | 0.3371 |
| 0.0784 | 7.5472 | 400 | 0.4054 | 0.3633 |
| 0.076 | 7.7358 | 410 | 0.4434 | 0.3408 |
| 0.1167 | 7.9245 | 420 | 0.4104 | 0.3558 |
| 0.0633 | 8.1132 | 430 | 0.3991 | 0.3521 |
| 0.0881 | 8.3019 | 440 | 0.4055 | 0.3521 |
| 0.1245 | 8.4906 | 450 | 0.4198 | 0.3483 |
| 0.0657 | 8.6792 | 460 | 0.4668 | 0.2996 |
| 0.0989 | 8.8679 | 470 | 0.4557 | 0.3109 |
| 0.0658 | 9.0566 | 480 | 0.4495 | 0.3258 |
| 0.0504 | 9.2453 | 490 | 0.4407 | 0.3333 |
| 0.0931 | 9.4340 | 500 | 0.4467 | 0.3221 |
| 0.0628 | 9.6226 | 510 | 0.4268 | 0.3371 |
| 0.079 | 9.8113 | 520 | 0.4298 | 0.3184 |
| 0.0946 | 10.0 | 530 | 0.4588 | 0.3333 |
| 0.0851 | 10.1887 | 540 | 0.4317 | 0.3596 |
| 0.0583 | 10.3774 | 550 | 0.4501 | 0.3596 |
| 0.0513 | 10.5660 | 560 | 0.4348 | 0.3670 |
| 0.0367 | 10.7547 | 570 | 0.4124 | 0.3221 |
| 0.0733 | 10.9434 | 580 | 0.4516 | 0.3558 |
| 0.0782 | 11.1321 | 590 | 0.4370 | 0.3371 |
| 0.0258 | 11.3208 | 600 | 0.4887 | 0.3333 |
| 0.0958 | 11.5094 | 610 | 0.4156 | 0.3333 |
| 0.0874 | 11.6981 | 620 | 0.4492 | 0.3333 |
| 0.0496 | 11.8868 | 630 | 0.4383 | 0.3408 |
| 0.056 | 12.0755 | 640 | 0.4333 | 0.3296 |
| 0.0544 | 12.2642 | 650 | 0.4404 | 0.3408 |
| 0.0448 | 12.4528 | 660 | 0.4588 | 0.3333 |
| 0.0627 | 12.6415 | 670 | 0.4185 | 0.3333 |
| 0.0568 | 12.8302 | 680 | 0.4491 | 0.3558 |
| 0.0374 | 13.0189 | 690 | 0.4601 | 0.3446 |
| 0.0581 | 13.2075 | 700 | 0.4208 | 0.3558 |
| 0.0664 | 13.3962 | 710 | 0.4717 | 0.3446 |
| 0.0673 | 13.5849 | 720 | 0.4535 | 0.3296 |
| 0.0881 | 13.7736 | 730 | 0.4785 | 0.3184 |
| 0.0535 | 13.9623 | 740 | 0.4491 | 0.3446 |
| 0.1069 | 14.1509 | 750 | 0.4769 | 0.3408 |
| 0.0611 | 14.3396 | 760 | 0.4417 | 0.3109 |
| 0.0449 | 14.5283 | 770 | 0.4570 | 0.3296 |
| 0.0667 | 14.7170 | 780 | 0.4544 | 0.3446 |
| 0.1015 | 14.9057 | 790 | 0.4569 | 0.3408 |
| 0.0497 | 15.0943 | 800 | 0.4142 | 0.3558 |
| 0.0953 | 15.2830 | 810 | 0.4320 | 0.3596 |
| 0.0738 | 15.4717 | 820 | 0.4462 | 0.3558 |
| 0.0434 | 15.6604 | 830 | 0.4566 | 0.3708 |
| 0.0375 | 15.8491 | 840 | 0.4211 | 0.3745 |
| 0.043 | 16.0377 | 850 | 0.4177 | 0.3633 |
| 0.0799 | 16.2264 | 860 | 0.4191 | 0.3558 |
| 0.0474 | 16.4151 | 870 | 0.4398 | 0.3558 |
| 0.0786 | 16.6038 | 880 | 0.4505 | 0.3333 |
| 0.0426 | 16.7925 | 890 | 0.4500 | 0.3371 |
| 0.0526 | 16.9811 | 900 | 0.4420 | 0.3408 |
| 0.0206 | 17.1698 | 910 | 0.4380 | 0.3408 |
| 0.0345 | 17.3585 | 920 | 0.4423 | 0.3371 |
| 0.07 | 17.5472 | 930 | 0.4548 | 0.3446 |
| 0.0314 | 17.7358 | 940 | 0.4484 | 0.3408 |
| 0.0862 | 17.9245 | 950 | 0.4519 | 0.3371 |
| 0.0583 | 18.1132 | 960 | 0.4492 | 0.3333 |
| 0.082 | 18.3019 | 970 | 0.4653 | 0.3483 |
| 0.0285 | 18.4906 | 980 | 0.4626 | 0.3408 |
| 0.028 | 18.6792 | 990 | 0.4507 | 0.3371 |
| 0.03 | 18.8679 | 1000 | 0.4491 | 0.3333 |
| 0.0542 | 19.0566 | 1010 | 0.4508 | 0.3446 |
| 0.0253 | 19.2453 | 1020 | 0.4587 | 0.3483 |
| 0.039 | 19.4340 | 1030 | 0.4599 | 0.3521 |
| 0.0305 | 19.6226 | 1040 | 0.4548 | 0.3446 |
| 0.0495 | 19.8113 | 1050 | 0.4534 | 0.3446 |
| 0.0403 | 20.0 | 1060 | 0.4533 | 0.3446 |
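
The card does not say how the accuracy column is computed. If it is sequence-level exact match between generated and reference SMILES (an assumption, not stated anywhere above), a compute_metrics hook along these lines would reproduce it:

```python
import numpy as np
from transformers import AutoProcessor

# Assumption: "Accuracy" is exact-match between generated and reference
# SMILES strings. The card does not state the metric, so treat this as an
# illustrative sketch rather than the training script's actual code.
processor = AutoProcessor.from_pretrained("weathon/3d2smiles_finetune")

def compute_metrics(eval_preds):
    pred_ids, label_ids = eval_preds
    # Labels use -100 for ignored positions; map them back to pad tokens
    # before decoding.
    label_ids = np.where(label_ids == -100, processor.tokenizer.pad_token_id, label_ids)
    preds = processor.batch_decode(pred_ids, skip_special_tokens=True)
    refs = processor.batch_decode(label_ids, skip_special_tokens=True)
    matches = sum(p.strip() == r.strip() for p, r in zip(preds, refs))
    return {"accuracy": matches / len(refs)}
```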

Framework versions

  • Transformers 4.48.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0