Dataset schema: 67 columns. String columns are summarized by their number of distinct values (or length range); numeric columns by their observed minimum and maximum.

| column | dtype | values / range |
|---|---|---|
| model_type | stringclasses | 5 values |
| model | stringlengths | 13 to 62 characters |
| AVG | float64 | 0.04 to 0.7 |
| CG | float64 | 0 to 0.68 |
| EL | float64 | 0 to 0.62 |
| FA | float64 | 0 to 0.35 |
| HE | float64 | 0 to 0.79 |
| MC | float64 | 0 to 0.92 |
| MR | float64 | 0 to 0.95 |
| MT | float64 | 0.3 to 0.86 |
| NLI | float64 | 0 to 0.82 |
| QA | float64 | 0.01 to 0.77 |
| RC | float64 | 0.04 to 0.93 |
| SUM | float64 | 0 to 0.29 |
| aio_char_f1 | float64 | 0 to 0.9 |
| alt-e-to-j_bert_score_ja_f1 | float64 | 0.26 to 0.88 |
| alt-e-to-j_bleu_ja | float64 | 0.32 to 16 |
| alt-e-to-j_comet_wmt22 | float64 | 0.29 to 0.92 |
| alt-j-to-e_bert_score_en_f1 | float64 | 0.37 to 0.96 |
| alt-j-to-e_bleu_en | float64 | 0.02 to 20.1 |
| alt-j-to-e_comet_wmt22 | float64 | 0.3 to 0.89 |
| chabsa_set_f1 | float64 | 0 to 0.62 |
| commonsensemoralja_exact_match | float64 | 0 to 0.94 |
| jamp_exact_match | float64 | 0 to 0.74 |
| janli_exact_match | float64 | 0 to 0.95 |
| jcommonsenseqa_exact_match | float64 | 0 to 0.97 |
| jemhopqa_char_f1 | float64 | 0 to 0.71 |
| jmmlu_exact_match | float64 | 0 to 0.76 |
| jnli_exact_match | float64 | 0 to 0.9 |
| jsem_exact_match | float64 | 0 to 0.81 |
| jsick_exact_match | float64 | 0 to 0.87 |
| jsquad_char_f1 | float64 | 0.04 to 0.93 |
| jsts_pearson | float64 | -0.23 to 0.91 |
| jsts_spearman | float64 | -0.19 to 0.88 |
| kuci_exact_match | float64 | 0 to 0.86 |
| mawps_exact_match | float64 | 0 to 0.95 |
| mbpp_code_exec | float64 | 0 to 0.68 |
| mbpp_pylint_check | float64 | 0 to 0.99 |
| mmlu_en_exact_match | float64 | 0 to 0.82 |
| niilc_char_f1 | float64 | 0.01 to 0.7 |
| wiki_coreference_set_f1 | float64 | 0 to 0.13 |
| wiki_dependency_set_f1 | float64 | 0 to 0.55 |
| wiki_ner_set_f1 | float64 | 0 to 0.17 |
| wiki_pas_set_f1 | float64 | 0 to 0.12 |
| wiki_reading_char_f1 | float64 | 0.02 to 0.91 |
| wikicorpus-e-to-j_bert_score_ja_f1 | float64 | 0.15 to 0.87 |
| wikicorpus-e-to-j_bleu_ja | float64 | 0.17 to 18.3 |
| wikicorpus-e-to-j_comet_wmt22 | float64 | 0.3 to 0.87 |
| wikicorpus-j-to-e_bert_score_en_f1 | float64 | 0.26 to 0.92 |
| wikicorpus-j-to-e_bleu_en | float64 | 0.03 to 13.8 |
| wikicorpus-j-to-e_comet_wmt22 | float64 | 0.28 to 0.78 |
| xlsum_ja_bert_score_ja_f1 | float64 | 0 to 0.79 |
| xlsum_ja_bleu_ja | float64 | 0 to 10.2 |
| xlsum_ja_rouge1 | float64 | 0.05 to 52.8 |
| xlsum_ja_rouge2 | float64 | 0.01 to 29.2 |
| xlsum_ja_rouge2_scaling | float64 | 0 to 0.29 |
| xlsum_ja_rougeLsum | float64 | 0.05 to 44.9 |
| architecture | stringclasses | 11 values |
| precision | stringclasses | 2 values |
| license | stringclasses | 12 values |
| params | float64 | 0.14 to 70.6 |
| likes | int64 | 0 to 4.03k |
| revision | stringclasses | 1 value |
| num_few_shot | int64 | 0 to 4 |
| add_special_tokens | stringclasses | 2 values |
| llm_jp_eval_version | stringclasses | 1 value |
| vllm_version | stringclasses | 1 value |
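
Both the schema above and the record table below are wide, so this dump is easier to work with programmatically. A minimal sketch of one way to do that with pandas, assuming the records have been exported to a CSV file (the filename `leaderboard.csv` and the export itself are assumptions for illustration, not part of this dataset card):

```python
# Minimal sketch: rank the 4-shot runs by overall AVG score.
# Assumes a hypothetical CSV export ("leaderboard.csv") of the record table below.
import pandas as pd

df = pd.read_csv("leaderboard.csv")

# Every model appears twice, once per few-shot setting (num_few_shot = 0 or 4).
top = (
    df[df["num_few_shot"] == 4]
    .sort_values("AVG", ascending=False)
    .loc[:, ["model", "AVG", "params", "architecture"]]
)
print(top.head(10).to_string(index=False))
```
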
The records, one row per model and few-shot setting (num_few_shot = 0 or 4); columns follow the schema above, and params appears to be reported in billions of parameters:

| model_type | model | AVG | CG | EL | FA | HE | MC | MR | MT | NLI | QA | RC | SUM | aio_char_f1 | alt-e-to-j_bert_score_ja_f1 | alt-e-to-j_bleu_ja | alt-e-to-j_comet_wmt22 | alt-j-to-e_bert_score_en_f1 | alt-j-to-e_bleu_en | alt-j-to-e_comet_wmt22 | chabsa_set_f1 | commonsensemoralja_exact_match | jamp_exact_match | janli_exact_match | jcommonsenseqa_exact_match | jemhopqa_char_f1 | jmmlu_exact_match | jnli_exact_match | jsem_exact_match | jsick_exact_match | jsquad_char_f1 | jsts_pearson | jsts_spearman | kuci_exact_match | mawps_exact_match | mbpp_code_exec | mbpp_pylint_check | mmlu_en_exact_match | niilc_char_f1 | wiki_coreference_set_f1 | wiki_dependency_set_f1 | wiki_ner_set_f1 | wiki_pas_set_f1 | wiki_reading_char_f1 | wikicorpus-e-to-j_bert_score_ja_f1 | wikicorpus-e-to-j_bleu_ja | wikicorpus-e-to-j_comet_wmt22 | wikicorpus-j-to-e_bert_score_en_f1 | wikicorpus-j-to-e_bleu_en | wikicorpus-j-to-e_comet_wmt22 | xlsum_ja_bert_score_ja_f1 | xlsum_ja_bleu_ja | xlsum_ja_rouge1 | xlsum_ja_rouge2 | xlsum_ja_rouge2_scaling | xlsum_ja_rougeLsum | architecture | precision | license | params | likes | revision | num_few_shot | add_special_tokens | llm_jp_eval_version | vllm_version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 🀝 : base merges and moerges | Steelskull/L3.3-MS-Nevoria-70b | 0.376 | 0.004 | 0.2198 | 0.128 | 0.2761 | 0.7965 | 0.004 | 0.8297 | 0.7127 | 0.3552 | 0.7166 | 0.093 | 0.4227 | 0.8575 | 12.3766 | 0.9013 | 0.9526 | 17.3647 | 0.8821 | 0.2198 | 0.893 | 0.6063 | 0.7819 | 0.8508 | 0.2848 | 0.1926 | 0.5892 | 0.791 | 0.795 | 0.7166 | 0.873 | 0.8481 | 0.6457 | 0.004 | 0.004 | 0.008 | 0.3596 | 0.3581 | 0 | 0.0322 | 0.0177 | 0 | 0.59 | 0.8102 | 11.1841 | 0.8086 | 0.8912 | 10.7909 | 0.7266 | 0.686 | 3.0961 | 21.6755 | 9.2982 | 0.093 | 19.3764 | LlamaForCausalLM | bfloat16 | llama3.3 | 70.554 | 26 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | Steelskull/L3.3-MS-Nevoria-70b | 0.6097 | 0.004 | 0.5668 | 0.286 | 0.7679 | 0.8721 | 0.932 | 0.8501 | 0.7713 | 0.6448 | 0.9191 | 0.093 | 0.7115 | 0.8643 | 13.241 | 0.908 | 0.9564 | 17.6476 | 0.8859 | 0.5668 | 0.9063 | 0.6293 | 0.8819 | 0.9383 | 0.6594 | 0.7235 | 0.7794 | 0.798 | 0.7678 | 0.9191 | 0.8859 | 0.8482 | 0.7715 | 0.932 | 0.004 | 0.008 | 0.8122 | 0.5633 | 0.0681 | 0.3587 | 0.115 | 0.0443 | 0.8441 | 0.8491 | 15.5805 | 0.8471 | 0.9093 | 12.2107 | 0.7592 | 0.686 | 3.0961 | 21.6755 | 9.2982 | 0.093 | 19.3764 | LlamaForCausalLM | bfloat16 | llama3.3 | 70.554 | 26 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B | 0.0586 | 0 | 0 | 0.0146 | 0.0576 | 0.0284 | 0.004 | 0.3537 | 0.0189 | 0.0504 | 0.1079 | 0.0094 | 0.0352 | 0.5882 | 0.7584 | 0.3567 | 0.7837 | 3.2584 | 0.4069 | 0 | 0.0183 | 0.0086 | 0.0056 | 0.0527 | 0.0669 | 0.0031 | 0 | 0.0745 | 0.0059 | 0.1079 | -0.0302 | -0.0385 | 0.0141 | 0.004 | 0 | 0.002 | 0.1122 | 0.0491 | 0 | 0 | 0 | 0 | 0.0729 | 0.5585 | 0.6304 | 0.324 | 0.7589 | 2.2652 | 0.3274 | 0.5325 | 0.43 | 4.8434 | 0.9445 | 0.0094 | 4.0182 | Qwen2ForCausalLM | bfloat16 | mit | 1.777 | 317 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B | 0.2584 | 0 | 0.2205 | 0.0341 | 0.2912 | 0.3551 | 0.45 | 0.4364 | 0.4516 | 0.153 | 0.4413 | 0.0094 | 0.0634 | 0.6069 | 1.6655 | 0.3812 | 0.8571 | 7.6125 | 0.5894 | 0.2205 | 0.4649 | 0.3333 | 0.4722 | 0.3253 | 0.2969 | 0.2663 | 0.629 | 0.6604 | 0.163 | 0.4413 | 0.2962 | 0.3143 | 0.2752 | 0.45 | 0 | 0.002 | 0.3161 | 0.0986 | 0 | 0.0147 | 0.0177 | 0.0031 | 0.1351 | 0.5712 | 1.5419 | 0.3478 | 0.7834 | 4.6722 | 0.4274 | 0.5325 | 0.43 | 4.8434 | 0.9445 | 0.0094 | 4.0182 | Qwen2ForCausalLM | bfloat16 | mit | 1.777 | 317 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-7B | 0.1383 | 0.008 | 0.0021 | 0.0335 | 0.0593 | 0.3159 | 0.002 | 0.5614 | 0.0158 | 0.0841 | 0.4148 | 0.0243 | 0.0401 | 0.6783 | 2.357 | 0.4935 | 0.8875 | 7.4002 | 0.7125 | 0.0021 | 0.4259 | 0 | 0 | 0.3405 | 0.1414 | 0.0065 | 0.0058 | 0.0682 | 0.0049 | 0.4148 | 0.0132 | 0.0367 | 0.1813 | 0.002 | 0.008 | 0.008 | 0.112 | 0.0707 | 0 | 0 | 0 | 0 | 0.1673 | 0.6377 | 2.58 | 0.4739 | 0.8321 | 4.8825 | 0.5656 | 0.5836 | 0.7515 | 10.3424 | 2.4406 | 0.0243 | 8.2865 | Qwen2ForCausalLM | bfloat16 | mit | 7.616 | 165 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-7B | 0.3919 | 0.008 | 0.3165 | 0.0519 | 0.4668 | 0.5649 | 0.716 | 0.6355 | 0.5975 | 0.1942 | 0.7346 | 0.0243 | 0.1072 | 0.73 | 3.7524 | 0.5949 | 0.9134 | 10.8855 | 0.7904 | 0.3165 | 0.6804 | 0.4483 | 0.6069 | 0.5853 | 0.3207 | 0.4219 | 0.5707 | 0.6932 | 0.6686 | 0.7346 | 0.7036 | 0.6312 | 0.4291 | 0.716 | 0.008 | 0.008 | 0.5118 | 0.1548 | 0.0075 | 0.0677 | 0.0354 | 0.0093 | 0.1396 | 0.6651 | 3.8939 | 0.5152 | 0.8616 | 7.3599 | 0.6413 | 0.5836 | 0.7515 | 10.3424 | 2.4406 | 0.0243 | 8.2865 | Qwen2ForCausalLM | bfloat16 | mit | 7.616 | 165 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Llama-8B | 0.414 | 0 | 0.3968 | 0.1202 | 0.4458 | 0.5695 | 0.528 | 0.7955 | 0.6011 | 0.2284 | 0.8021 | 0.0661 | 0.2366 | 0.8215 | 8.3346 | 0.8585 | 0.9369 | 13.6565 | 0.8526 | 0.3968 | 0.7427 | 0.5057 | 0.5208 | 0.6238 | 0.2062 | 0.3784 | 0.6935 | 0.6989 | 0.5868 | 0.8021 | 0.7226 | 0.7263 | 0.342 | 0.528 | 0 | 0 | 0.5132 | 0.2425 | 0.0153 | 0.204 | 0.0619 | 0.0415 | 0.2782 | 0.7611 | 7.083 | 0.752 | 0.8874 | 8.888 | 0.7189 | 0.6556 | 1.6419 | 19.3155 | 6.6108 | 0.0661 | 15.8332 | LlamaForCausalLM | bfloat16 | mit | 8.03 | 180 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Llama-8B | 0.1254 | 0 | 0.0109 | 0.0295 | 0 | 0.1344 | 0 | 0.6277 | 0.0756 | 0.0889 | 0.346 | 0.0661 | 0.0503 | 0.6832 | 4.3784 | 0.6067 | 0.8849 | 8.6006 | 0.7143 | 0.0109 | 0 | 0.1006 | 0 | 0.168 | 0.1712 | 0 | 0.0781 | 0 | 0.1995 | 0.346 | 0.0968 | 0.0972 | 0.2353 | 0 | 0 | 0 | 0 | 0.0454 | 0 | 0 | 0 | 0 | 0.1475 | 0.657 | 3.2113 | 0.5775 | 0.8438 | 5.2241 | 0.6121 | 0.6556 | 1.6419 | 19.3155 | 6.6108 | 0.0661 | 15.8332 | LlamaForCausalLM | bfloat16 | mit | 8.03 | 180 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-14B | 0.2159 | 0.0602 | 0.2253 | 0.052 | 0.0076 | 0.3259 | 0 | 0.7427 | 0.2138 | 0.1498 | 0.4906 | 0.107 | 0.1055 | 0.7854 | 7.071 | 0.7958 | 0.913 | 10.8002 | 0.8082 | 0.2253 | 0.0135 | 0.4425 | 0 | 0.4549 | 0.1598 | 0 | 0.3139 | 0 | 0.3126 | 0.4906 | 0.805 | 0.8172 | 0.5093 | 0 | 0.0602 | 0.1606 | 0.0152 | 0.1842 | 0 | 0.0075 | 0.0177 | 0 | 0.2349 | 0.7329 | 6.2705 | 0.7001 | 0.861 | 7.1496 | 0.6668 | 0.6981 | 2.3878 | 30.0099 | 10.6979 | 0.107 | 21.9762 | Qwen2ForCausalLM | bfloat16 | mit | 14.77 | 165 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-14B | 0.5683 | 0.0602 | 0.5824 | 0.2182 | 0.6743 | 0.8426 | 0.82 | 0.829 | 0.7597 | 0.4593 | 0.8984 | 0.107 | 0.4395 | 0.8466 | 10.7019 | 0.8921 | 0.9486 | 15.2447 | 0.8762 | 0.5824 | 0.86 | 0.6063 | 0.7736 | 0.9374 | 0.5082 | 0.634 | 0.8237 | 0.7683 | 0.8263 | 0.8984 | 0.8729 | 0.8417 | 0.7304 | 0.82 | 0.0602 | 0.1606 | 0.7145 | 0.4304 | 0.0218 | 0.2991 | 0.0973 | 0.0419 | 0.6307 | 0.7977 | 8.903 | 0.805 | 0.8961 | 9.9181 | 0.7426 | 0.6981 | 2.3878 | 30.0099 | 10.6979 | 0.107 | 21.9762 | Qwen2ForCausalLM | bfloat16 | mit | 14.77 | 165 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | 0.5934 | 0.01 | 0.5687 | 0.2596 | 0.7506 | 0.884 | 0.924 | 0.8438 | 0.7692 | 0.522 | 0.8917 | 0.1035 | 0.5201 | 0.8596 | 11.7312 | 0.9057 | 0.9531 | 16.7936 | 0.8839 | 0.5687 | 0.8963 | 0.6667 | 0.7542 | 0.9526 | 0.5339 | 0.7224 | 0.8558 | 0.7942 | 0.7751 | 0.8917 | 0.9015 | 0.8757 | 0.803 | 0.924 | 0.01 | 0.0422 | 0.7789 | 0.512 | 0.0147 | 0.3592 | 0.1416 | 0.0281 | 0.7545 | 0.8199 | 10.3248 | 0.8273 | 0.9042 | 10.8877 | 0.7582 | 0.693 | 2.3423 | 28.412 | 10.3477 | 0.1035 | 21.279 | Qwen2ForCausalLM | bfloat16 | mit | 32.764 | 457 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | 0.2811 | 0.01 | 0.2257 | 0.1174 | 0.0001 | 0.7451 | 0 | 0.7844 | 0.4076 | 0.1622 | 0.5362 | 0.1035 | 0.1502 | 0.8153 | 10.4264 | 0.8509 | 0.9328 | 14.1907 | 0.8275 | 0.2257 | 0.5812 | 0.592 | 0.0069 | 0.9088 | 0.1515 | 0 | 0.6836 | 0.1351 | 0.6203 | 0.5362 | 0.88 | 0.8605 | 0.7454 | 0 | 0.01 | 0.0422 | 0.0002 | 0.1851 | 0.0101 | 0.0079 | 0.0295 | 0.0047 | 0.5347 | 0.7639 | 7.8897 | 0.7627 | 0.8832 | 8.7772 | 0.6965 | 0.693 | 2.3423 | 28.412 | 10.3477 | 0.1035 | 21.279 | Qwen2ForCausalLM | bfloat16 | mit | 32.764 | 457 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Llama-70B | 0.228 | 0 | 0.1595 | 0.0893 | 0.0995 | 0.4451 | 0 | 0.7472 | 0.244 | 0.2798 | 0.3557 | 0.0884 | 0.3481 | 0.7544 | 10.7609 | 0.7405 | 0.922 | 15.5448 | 0.8018 | 0.1595 | 0.3196 | 0.4828 | 0.0431 | 0.63 | 0.2117 | 0.0104 | 0.2403 | 0.108 | 0.3458 | 0.3557 | 0.8837 | 0.8617 | 0.3857 | 0 | 0 | 0 | 0.1886 | 0.2796 | 0 | 0.0214 | 0.0192 | 0.002 | 0.4037 | 0.7393 | 9.4647 | 0.7131 | 0.8935 | 10.0266 | 0.7332 | 0.6779 | 2.6657 | 23.3446 | 8.8398 | 0.0884 | 18.5641 | LlamaForCausalLM | bfloat16 | mit | 70.554 | 229 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | deepseek-ai/DeepSeek-R1-Distill-Llama-70B | 0.5827 | 0 | 0.5344 | 0.2325 | 0.7235 | 0.8675 | 0.892 | 0.8453 | 0.7467 | 0.5666 | 0.9124 | 0.0884 | 0.6559 | 0.8568 | 11.9492 | 0.9049 | 0.9557 | 17.2704 | 0.8857 | 0.5344 | 0.8945 | 0.658 | 0.7569 | 0.9294 | 0.5414 | 0.6769 | 0.7013 | 0.7797 | 0.8374 | 0.9124 | 0.8865 | 0.8589 | 0.7786 | 0.892 | 0 | 0 | 0.77 | 0.5025 | 0.0173 | 0.2587 | 0.0796 | 0.0501 | 0.7569 | 0.8294 | 13.022 | 0.8264 | 0.9098 | 11.7099 | 0.7642 | 0.6779 | 2.6657 | 23.3446 | 8.8398 | 0.0884 | 18.5641 | LlamaForCausalLM | bfloat16 | mit | 70.554 | 229 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | hotmailuser/QwenSlerp2-14B | 0.5637 | 0.6426 | 0.1958 | 0.1486 | 0.6758 | 0.857 | 0.812 | 0.8365 | 0.7442 | 0.3349 | 0.8493 | 0.1046 | 0.3984 | 0.844 | 10.0102 | 0.8975 | 0.9526 | 16.0273 | 0.8835 | 0.1958 | 0.8953 | 0.6466 | 0.8181 | 0.9348 | 0.3148 | 0.6704 | 0.7523 | 0.6774 | 0.8269 | 0.8493 | 0.8873 | 0.8544 | 0.7409 | 0.812 | 0.6426 | 0.9739 | 0.6812 | 0.2914 | 0.0188 | 0.013 | 0.0177 | 0.0017 | 0.6917 | 0.7953 | 7.8795 | 0.8159 | 0.8962 | 9.2042 | 0.749 | 0.6972 | 2.9102 | 27.6057 | 10.4645 | 0.1046 | 24.1234 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 2 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | hotmailuser/QwenSlerp2-14B | 0.6465 | 0.6426 | 0.5667 | 0.2667 | 0.7391 | 0.8815 | 0.894 | 0.8453 | 0.7661 | 0.5055 | 0.8991 | 0.1046 | 0.5115 | 0.8611 | 12.673 | 0.906 | 0.9528 | 16.4635 | 0.882 | 0.5667 | 0.9006 | 0.6063 | 0.8097 | 0.9526 | 0.5759 | 0.71 | 0.8513 | 0.7696 | 0.7934 | 0.8991 | 0.8869 | 0.8619 | 0.7914 | 0.894 | 0.6426 | 0.9739 | 0.7683 | 0.429 | 0.0821 | 0.377 | 0.0177 | 0.0707 | 0.7862 | 0.8239 | 10.4218 | 0.8353 | 0.9039 | 10.7796 | 0.7577 | 0.6972 | 2.9102 | 27.6057 | 10.4645 | 0.1046 | 24.1234 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 2 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3 | 0.6404 | 0.6225 | 0.5708 | 0.2674 | 0.7343 | 0.8771 | 0.89 | 0.8416 | 0.7573 | 0.4963 | 0.8856 | 0.1018 | 0.4873 | 0.8571 | 12.2369 | 0.9025 | 0.9517 | 16.3534 | 0.8806 | 0.5708 | 0.8945 | 0.5776 | 0.7986 | 0.95 | 0.5645 | 0.7057 | 0.8513 | 0.7563 | 0.8027 | 0.8856 | 0.8756 | 0.8574 | 0.7869 | 0.89 | 0.6225 | 0.9578 | 0.7629 | 0.4371 | 0.084 | 0.3804 | 0.0265 | 0.0729 | 0.7732 | 0.8176 | 9.7327 | 0.8274 | 0.9022 | 10.4444 | 0.7558 | 0.6958 | 2.6661 | 26.5851 | 10.1962 | 0.1018 | 23.3302 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 11 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3 | 0.5593 | 0.6225 | 0.1956 | 0.1489 | 0.6996 | 0.8547 | 0.824 | 0.8306 | 0.7402 | 0.3182 | 0.8167 | 0.1018 | 0.3472 | 0.8382 | 8.9138 | 0.8897 | 0.951 | 15.4776 | 0.8808 | 0.1956 | 0.8955 | 0.6466 | 0.8139 | 0.933 | 0.2903 | 0.667 | 0.7465 | 0.6622 | 0.8317 | 0.8167 | 0.8881 | 0.8567 | 0.7357 | 0.824 | 0.6225 | 0.9578 | 0.7322 | 0.317 | 0.0215 | 0.0138 | 0.0177 | 0.0032 | 0.6881 | 0.7882 | 7.5397 | 0.8048 | 0.8953 | 9.2972 | 0.7471 | 0.6958 | 2.6661 | 26.5851 | 10.1962 | 0.1018 | 23.3302 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 11 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Lamarck-14B-v0.6 | 0.5635 | 0.6325 | 0.2087 | 0.1495 | 0.6961 | 0.8548 | 0.816 | 0.8361 | 0.7384 | 0.3267 | 0.8372 | 0.1024 | 0.3895 | 0.8448 | 9.9323 | 0.8972 | 0.9522 | 15.8286 | 0.8826 | 0.2087 | 0.8945 | 0.6322 | 0.8153 | 0.9321 | 0.2797 | 0.6687 | 0.7473 | 0.6648 | 0.8326 | 0.8372 | 0.8874 | 0.8561 | 0.7376 | 0.816 | 0.6325 | 0.9639 | 0.7235 | 0.3111 | 0.0234 | 0.0121 | 0.0088 | 0.001 | 0.702 | 0.7955 | 7.8447 | 0.8157 | 0.8964 | 9.3314 | 0.749 | 0.6958 | 2.6458 | 26.9372 | 10.2555 | 0.1024 | 23.5995 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 13 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Lamarck-14B-v0.6 | 0.6466 | 0.6325 | 0.5781 | 0.2665 | 0.7344 | 0.8795 | 0.892 | 0.8456 | 0.7643 | 0.5215 | 0.8953 | 0.1024 | 0.5032 | 0.8619 | 12.5569 | 0.9069 | 0.9529 | 16.5154 | 0.8825 | 0.5781 | 0.898 | 0.5977 | 0.7986 | 0.9508 | 0.5904 | 0.704 | 0.8583 | 0.7689 | 0.7981 | 0.8953 | 0.8809 | 0.8587 | 0.7897 | 0.892 | 0.6325 | 0.9639 | 0.7648 | 0.4709 | 0.0791 | 0.376 | 0.0177 | 0.0775 | 0.7821 | 0.8235 | 10.4555 | 0.8351 | 0.9041 | 10.7219 | 0.7578 | 0.6958 | 2.6458 | 26.9372 | 10.2555 | 0.1024 | 23.5995 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 13 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | maldv/Qwentile2.5-32B-Instruct | 0.5715 | 0.3233 | 0.234 | 0.1439 | 0.7444 | 0.8716 | 0.866 | 0.8454 | 0.7511 | 0.4806 | 0.9113 | 0.1146 | 0.4784 | 0.8528 | 12.0338 | 0.9053 | 0.9549 | 16.6195 | 0.8865 | 0.234 | 0.9033 | 0.6322 | 0.7597 | 0.9357 | 0.5139 | 0.7182 | 0.8426 | 0.803 | 0.7179 | 0.9113 | 0.8892 | 0.8675 | 0.7757 | 0.866 | 0.3233 | 0.4116 | 0.7705 | 0.4496 | 0.0025 | 0.0069 | 0.0088 | 0.0082 | 0.6929 | 0.8076 | 9.3966 | 0.8356 | 0.8995 | 9.9257 | 0.7543 | 0.706 | 3.1737 | 28.67 | 11.4479 | 0.1146 | 24.92 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 31 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | maldv/Qwentile2.5-32B-Instruct | 0.6457 | 0.3233 | 0.5805 | 0.2917 | 0.7864 | 0.8992 | 0.946 | 0.8506 | 0.8059 | 0.5826 | 0.9213 | 0.1146 | 0.5838 | 0.8662 | 13.2438 | 0.91 | 0.9557 | 17.5366 | 0.8856 | 0.5805 | 0.9058 | 0.6897 | 0.8236 | 0.9607 | 0.6368 | 0.7639 | 0.8948 | 0.8093 | 0.8121 | 0.9213 | 0.903 | 0.8802 | 0.8312 | 0.946 | 0.3233 | 0.4116 | 0.8089 | 0.5272 | 0.0321 | 0.3916 | 0.1239 | 0.0975 | 0.8134 | 0.8399 | 12.9473 | 0.8439 | 0.9085 | 11.632 | 0.7631 | 0.706 | 3.1737 | 28.67 | 11.4479 | 0.1146 | 24.92 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 31 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B | 0.4838 | 0.008 | 0.1455 | 0.12 | 0.6601 | 0.8714 | 0.82 | 0.6421 | 0.7427 | 0.3368 | 0.8703 | 0.1045 | 0.298 | 0.7524 | 10.7871 | 0.7002 | 0.8516 | 15.6005 | 0.5858 | 0.1455 | 0.9023 | 0.6207 | 0.7083 | 0.9339 | 0.3785 | 0.6007 | 0.8394 | 0.798 | 0.7471 | 0.8703 | 0.8907 | 0.8715 | 0.778 | 0.82 | 0.008 | 0.0161 | 0.7196 | 0.334 | 0.0058 | 0.0024 | 0 | 0.0064 | 0.5853 | 0.7242 | 8.8307 | 0.6758 | 0.8424 | 8.9859 | 0.6066 | 0.6872 | 3.1771 | 26.7645 | 10.4431 | 0.1045 | 23.3063 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 1 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B | 0.6106 | 0.008 | 0.5759 | 0.2869 | 0.7833 | 0.8965 | 0.936 | 0.821 | 0.8244 | 0.5672 | 0.9135 | 0.1045 | 0.578 | 0.8614 | 13.1877 | 0.9021 | 0.9527 | 17.7238 | 0.877 | 0.5759 | 0.9026 | 0.7356 | 0.8542 | 0.9571 | 0.591 | 0.7628 | 0.8981 | 0.8005 | 0.8334 | 0.9135 | 0.888 | 0.8779 | 0.8298 | 0.936 | 0.008 | 0.0161 | 0.8037 | 0.5324 | 0.0455 | 0.3418 | 0.1504 | 0.105 | 0.7916 | 0.7888 | 11.769 | 0.7714 | 0.8964 | 11.1378 | 0.7332 | 0.6872 | 3.1771 | 26.7645 | 10.4431 | 0.1045 | 23.3063 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 1 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | rombodawg/Rombos-LLM-V2.5-Qwen-32b | 0.6558 | 0.5321 | 0.588 | 0.2738 | 0.7769 | 0.8969 | 0.944 | 0.8476 | 0.8105 | 0.541 | 0.9054 | 0.0973 | 0.5542 | 0.8643 | 13.2226 | 0.9077 | 0.9553 | 17.6746 | 0.8859 | 0.588 | 0.8985 | 0.6753 | 0.8417 | 0.958 | 0.5672 | 0.7535 | 0.8977 | 0.7797 | 0.8579 | 0.9054 | 0.8895 | 0.8772 | 0.8341 | 0.944 | 0.5321 | 0.761 | 0.8003 | 0.5015 | 0.054 | 0.3845 | 0 | 0.1132 | 0.8175 | 0.8291 | 11.0412 | 0.8387 | 0.9044 | 11.1234 | 0.7581 | 0.6931 | 2.7803 | 25.9887 | 9.7254 | 0.0973 | 22.6735 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 52 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | rombodawg/Rombos-LLM-V2.5-Qwen-32b | 0.5456 | 0.5321 | 0.104 | 0.1468 | 0.5748 | 0.874 | 0.794 | 0.8386 | 0.7642 | 0.3892 | 0.8869 | 0.0973 | 0.441 | 0.8486 | 11.2433 | 0.901 | 0.9512 | 15.6734 | 0.8802 | 0.104 | 0.9043 | 0.6494 | 0.7944 | 0.9303 | 0.2681 | 0.5645 | 0.8188 | 0.7961 | 0.7623 | 0.8869 | 0.8954 | 0.8763 | 0.7875 | 0.794 | 0.5321 | 0.761 | 0.5851 | 0.4585 | 0.0281 | 0.0068 | 0.0354 | 0.0048 | 0.6589 | 0.8002 | 8.7465 | 0.8266 | 0.8967 | 9.6053 | 0.7466 | 0.6931 | 2.7803 | 25.9887 | 9.7254 | 0.0973 | 22.6735 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 52 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | huihui-ai/QwQ-32B-Coder-Fusion-9010 | 0.5861 | 0.6787 | 0.2663 | 0.1482 | 0.726 | 0.851 | 0.818 | 0.8413 | 0.7173 | 0.3712 | 0.9083 | 0.1205 | 0.2883 | 0.8508 | 11.8046 | 0.9031 | 0.9537 | 17.0002 | 0.8846 | 0.2663 | 0.8875 | 0.5891 | 0.7181 | 0.9205 | 0.4199 | 0.7055 | 0.7847 | 0.7898 | 0.7047 | 0.9083 | 0.8834 | 0.8608 | 0.745 | 0.818 | 0.6787 | 0.9598 | 0.7465 | 0.4055 | 0 | 0.0058 | 0.0265 | 0.0117 | 0.6968 | 0.7999 | 8.7856 | 0.8252 | 0.8967 | 9.8045 | 0.7525 | 0.7107 | 3.3568 | 30.069 | 12.0633 | 0.1205 | 25.87 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 6 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | huihui-ai/QwQ-32B-Coder-Fusion-9010 | 0.6687 | 0.6787 | 0.5718 | 0.2842 | 0.78 | 0.8863 | 0.928 | 0.8459 | 0.7777 | 0.5614 | 0.9211 | 0.1205 | 0.5616 | 0.8603 | 12.6168 | 0.9071 | 0.9546 | 17.0898 | 0.8845 | 0.5718 | 0.8873 | 0.6552 | 0.7681 | 0.9508 | 0.6308 | 0.7616 | 0.8755 | 0.8037 | 0.7859 | 0.9211 | 0.8806 | 0.8675 | 0.8209 | 0.928 | 0.6787 | 0.9598 | 0.7983 | 0.4917 | 0.0216 | 0.4054 | 0.1416 | 0.083 | 0.7692 | 0.8296 | 11.9736 | 0.8307 | 0.9055 | 11.4388 | 0.7614 | 0.7107 | 3.3568 | 30.069 | 12.0633 | 0.1205 | 25.87 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 6 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🟦 : RL-tuned (Preference optimization) | OpenBuddy/openbuddy-qwq-32b-v24.1-200k | 0.5283 | 0.0863 | 0.238 | 0.1531 | 0.6368 | 0.8645 | 0.864 | 0.8462 | 0.7195 | 0.415 | 0.903 | 0.0851 | 0.387 | 0.8641 | 13.207 | 0.9071 | 0.9555 | 17.3732 | 0.8864 | 0.238 | 0.8955 | 0.6006 | 0.7278 | 0.9374 | 0.4488 | 0.6668 | 0.744 | 0.7803 | 0.7449 | 0.903 | 0.8863 | 0.8652 | 0.7607 | 0.864 | 0.0863 | 0.2048 | 0.6069 | 0.4092 | 0 | 0.001 | 0 | 0.0004 | 0.764 | 0.8155 | 10.073 | 0.8312 | 0.9045 | 10.8566 | 0.76 | 0.6852 | 2.963 | 20.4271 | 8.5192 | 0.0851 | 17.6866 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 2 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🟦 : RL-tuned (Preference optimization) | OpenBuddy/openbuddy-qwq-32b-v24.1-200k | 0.6198 | 0.0863 | 0.5945 | 0.3002 | 0.7728 | 0.8924 | 0.934 | 0.8548 | 0.797 | 0.5719 | 0.9285 | 0.0851 | 0.5986 | 0.8715 | 13.5172 | 0.9124 | 0.9577 | 18.6591 | 0.8899 | 0.5945 | 0.8958 | 0.6609 | 0.8333 | 0.9562 | 0.5515 | 0.7436 | 0.8961 | 0.8043 | 0.7905 | 0.9285 | 0.8991 | 0.8794 | 0.8251 | 0.934 | 0.0863 | 0.2048 | 0.8021 | 0.5658 | 0.0351 | 0.4103 | 0.1593 | 0.084 | 0.8123 | 0.8473 | 13.5696 | 0.8484 | 0.9105 | 11.441 | 0.7686 | 0.6852 | 2.963 | 20.4271 | 8.5192 | 0.0851 | 17.6866 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 2 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | hotmailuser/RombosBeagle-v2beta-MGS-32B | 0.5441 | 0.5301 | 0.1023 | 0.1469 | 0.5726 | 0.8745 | 0.782 | 0.8383 | 0.7643 | 0.3906 | 0.8882 | 0.0957 | 0.4415 | 0.8487 | 11.2853 | 0.8999 | 0.9511 | 15.5862 | 0.8798 | 0.1023 | 0.9048 | 0.6494 | 0.7944 | 0.9312 | 0.2681 | 0.5626 | 0.8176 | 0.7986 | 0.7615 | 0.8882 | 0.895 | 0.8761 | 0.7874 | 0.782 | 0.5301 | 0.755 | 0.5826 | 0.4623 | 0.0281 | 0.0065 | 0.0354 | 0.0071 | 0.6575 | 0.8002 | 8.752 | 0.8266 | 0.8968 | 9.5491 | 0.7469 | 0.6918 | 2.7757 | 25.4437 | 9.5626 | 0.0957 | 22.244 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 1 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | hotmailuser/RombosBeagle-v2beta-MGS-32B | 0.6548 | 0.5301 | 0.5846 | 0.2732 | 0.7756 | 0.8965 | 0.944 | 0.8476 | 0.8103 | 0.5395 | 0.9056 | 0.0957 | 0.5552 | 0.8642 | 13.2052 | 0.9078 | 0.9552 | 17.689 | 0.8858 | 0.5846 | 0.8985 | 0.6753 | 0.8417 | 0.9571 | 0.5657 | 0.7515 | 0.8965 | 0.7809 | 0.8571 | 0.9056 | 0.8902 | 0.8786 | 0.8338 | 0.944 | 0.5301 | 0.755 | 0.7998 | 0.4977 | 0.0511 | 0.3807 | 0 | 0.115 | 0.8194 | 0.8292 | 11.0168 | 0.8389 | 0.9043 | 11.1169 | 0.758 | 0.6918 | 2.7757 | 25.4437 | 9.5626 | 0.0957 | 22.244 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 1 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | Sakalti/SJT-2.4B-Alpha | 0.3788 | 0.0321 | 0.3924 | 0.0674 | 0.3594 | 0.493 | 0.498 | 0.7307 | 0.5155 | 0.236 | 0.7913 | 0.0508 | 0.1912 | 0.8008 | 7.1291 | 0.8195 | 0.9237 | 11.0633 | 0.816 | 0.3924 | 0.7087 | 0.3276 | 0.5069 | 0.4602 | 0.3008 | 0.3474 | 0.5616 | 0.5852 | 0.5961 | 0.7913 | 0.5187 | 0.4502 | 0.3101 | 0.498 | 0.0321 | 0.1245 | 0.3714 | 0.2159 | 0.0014 | 0.0689 | 0.0177 | 0.0032 | 0.2455 | 0.7357 | 5.934 | 0.68 | 0.8566 | 7.3766 | 0.6074 | 0.6382 | 1.3203 | 15.799 | 5.0597 | 0.0508 | 13.0353 | Qwen2ForCausalLM | float16 | apache-2.0 | 2.199 | 0 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | Sakalti/SJT-2.4B-Alpha | 0.2425 | 0.0321 | 0 | 0.0293 | 0.3214 | 0.4097 | 0.03 | 0.6012 | 0.5011 | 0.0471 | 0.6442 | 0.0508 | 0.0582 | 0.7032 | 2.0223 | 0.6183 | 0.8669 | 6.7091 | 0.7416 | 0 | 0.5341 | 0.3391 | 0.4875 | 0.4155 | 0 | 0.3188 | 0.5534 | 0.5082 | 0.6174 | 0.6442 | 0.218 | 0.2095 | 0.2795 | 0.03 | 0.0321 | 0.1245 | 0.324 | 0.0832 | 0 | 0.0026 | 0 | 0 | 0.1439 | 0.6642 | 1.6266 | 0.5125 | 0.7867 | 2.7398 | 0.5325 | 0.6382 | 1.3203 | 15.799 | 5.0597 | 0.0508 | 13.0353 | Qwen2ForCausalLM | float16 | apache-2.0 | 2.199 | 0 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Lamarck-14B-v0.7 | 0.6498 | 0.6285 | 0.5702 | 0.2694 | 0.7388 | 0.883 | 0.898 | 0.8465 | 0.7667 | 0.5354 | 0.9054 | 0.1059 | 0.5178 | 0.8622 | 12.3796 | 0.907 | 0.9535 | 16.7361 | 0.8834 | 0.5702 | 0.9021 | 0.6063 | 0.8 | 0.9508 | 0.5888 | 0.7111 | 0.8607 | 0.7765 | 0.7897 | 0.9054 | 0.8882 | 0.8633 | 0.796 | 0.898 | 0.6285 | 0.9819 | 0.7665 | 0.4996 | 0.0787 | 0.3735 | 0.0442 | 0.0775 | 0.773 | 0.826 | 10.7207 | 0.8375 | 0.9042 | 10.8731 | 0.7581 | 0.7013 | 2.7974 | 28.0711 | 10.6098 | 0.1059 | 24.5474 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 15 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Lamarck-14B-v0.7 | 0.5725 | 0.6285 | 0.2397 | 0.1543 | 0.699 | 0.8587 | 0.816 | 0.838 | 0.7363 | 0.3618 | 0.8595 | 0.1059 | 0.4012 | 0.8453 | 10.4001 | 0.8985 | 0.9533 | 15.7901 | 0.8838 | 0.2397 | 0.8983 | 0.6264 | 0.8097 | 0.9374 | 0.3522 | 0.6653 | 0.7777 | 0.6477 | 0.82 | 0.8595 | 0.8918 | 0.8599 | 0.7403 | 0.816 | 0.6285 | 0.9819 | 0.7327 | 0.3321 | 0.0257 | 0.0139 | 0.0354 | 0.0017 | 0.6949 | 0.7961 | 8.0936 | 0.8187 | 0.8967 | 9.2398 | 0.7508 | 0.7013 | 2.7974 | 28.0711 | 10.6098 | 0.1059 | 24.5474 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 15 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Qwenvergence-14B-v10 | 0.6413 | 0.6546 | 0.5543 | 0.2753 | 0.7218 | 0.8712 | 0.888 | 0.8431 | 0.7593 | 0.5098 | 0.8743 | 0.1032 | 0.4825 | 0.8588 | 12.1229 | 0.9031 | 0.9519 | 16.065 | 0.8818 | 0.5543 | 0.8868 | 0.5805 | 0.775 | 0.9464 | 0.5523 | 0.6888 | 0.8624 | 0.762 | 0.8165 | 0.8743 | 0.8827 | 0.8572 | 0.7805 | 0.888 | 0.6546 | 0.9639 | 0.7547 | 0.4945 | 0.0581 | 0.3931 | 0.0708 | 0.0881 | 0.7662 | 0.8193 | 10.2449 | 0.8312 | 0.9032 | 10.763 | 0.7562 | 0.6988 | 2.6335 | 27.1809 | 10.3292 | 0.1032 | 23.7442 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 4 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | sometimesanotion/Qwenvergence-14B-v10 | 0.5565 | 0.6546 | 0.2264 | 0.1569 | 0.6867 | 0.8399 | 0.822 | 0.832 | 0.7163 | 0.3094 | 0.7738 | 0.1032 | 0.3037 | 0.8397 | 9.7849 | 0.8921 | 0.9501 | 15.0714 | 0.8794 | 0.2264 | 0.8858 | 0.6178 | 0.7736 | 0.9169 | 0.277 | 0.6459 | 0.7621 | 0.6042 | 0.824 | 0.7738 | 0.8837 | 0.8561 | 0.7171 | 0.822 | 0.6546 | 0.9639 | 0.7275 | 0.3475 | 0.0242 | 0.0077 | 0.0619 | 0.0027 | 0.6879 | 0.7906 | 7.6428 | 0.8115 | 0.8951 | 9.3056 | 0.7448 | 0.6988 | 2.6335 | 27.1809 | 10.3292 | 0.1032 | 23.7442 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.766 | 4 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | Sakalti/ultiima-14B-v0.2 | 0.6467 | 0.6365 | 0.575 | 0.2668 | 0.7355 | 0.8799 | 0.892 | 0.8458 | 0.7649 | 0.5192 | 0.8954 | 0.1031 | 0.5032 | 0.8615 | 12.4979 | 0.907 | 0.9529 | 16.4395 | 0.8828 | 0.575 | 0.8988 | 0.6006 | 0.8014 | 0.9508 | 0.5877 | 0.7066 | 0.8574 | 0.7658 | 0.7993 | 0.8954 | 0.8828 | 0.8597 | 0.7901 | 0.892 | 0.6365 | 0.9598 | 0.7645 | 0.4667 | 0.0803 | 0.3744 | 0.0177 | 0.0795 | 0.7823 | 0.8238 | 10.421 | 0.8349 | 0.9041 | 10.7092 | 0.7585 | 0.6957 | 2.7022 | 26.9496 | 10.3204 | 0.1031 | 23.6729 | Qwen2ForCausalLM | float16 | | 14.766 | 2 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | Sakalti/ultiima-14B-v0.2 | 0.5638 | 0.6365 | 0.2048 | 0.1483 | 0.6968 | 0.8557 | 0.814 | 0.8363 | 0.7383 | 0.3298 | 0.8377 | 0.1031 | 0.3881 | 0.8449 | 9.9801 | 0.8975 | 0.9519 | 15.8518 | 0.8821 | 0.2048 | 0.895 | 0.6322 | 0.8139 | 0.933 | 0.2797 | 0.6679 | 0.7457 | 0.6654 | 0.8344 | 0.8377 | 0.8872 | 0.8551 | 0.7391 | 0.814 | 0.6365 | 0.9598 | 0.7257 | 0.3216 | 0.0234 | 0.0131 | 0.0088 | 0.001 | 0.695 | 0.7954 | 7.9523 | 0.8162 | 0.8965 | 9.2722 | 0.7492 | 0.6957 | 2.7022 | 26.9496 | 10.3204 | 0.1031 | 23.6729 | Qwen2ForCausalLM | float16 | | 14.766 | 2 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview | 0.2608 | 0.002 | 0.2205 | 0.126 | 0 | 0.6027 | 0 | 0.7872 | 0.313 | 0.1733 | 0.5396 | 0.1041 | 0.1515 | 0.8177 | 10.7503 | 0.8543 | 0.9332 | 14.4756 | 0.8284 | 0.2205 | 0.1428 | 0.592 | 0.0014 | 0.9178 | 0.1773 | 0 | 0.5357 | 0.036 | 0.3998 | 0.5396 | 0.8933 | 0.8706 | 0.7476 | 0 | 0.002 | 0.0422 | 0 | 0.191 | 0.0151 | 0.0122 | 0.0295 | 0.0043 | 0.5692 | 0.7653 | 7.9661 | 0.766 | 0.8843 | 8.9062 | 0.7 | 0.6928 | 2.4424 | 28.4702 | 10.4272 | 0.1041 | 21.1987 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 75 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🀝 : base merges and moerges | FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview | 0.5968 | 0.002 | 0.5819 | 0.2656 | 0.7465 | 0.8836 | 0.928 | 0.845 | 0.7746 | 0.5402 | 0.893 | 0.1041 | 0.5345 | 0.8605 | 12.2451 | 0.9061 | 0.9536 | 17.1126 | 0.885 | 0.5819 | 0.8988 | 0.6638 | 0.7611 | 0.9482 | 0.5566 | 0.7148 | 0.8726 | 0.7929 | 0.7826 | 0.893 | 0.9045 | 0.8795 | 0.8039 | 0.928 | 0.002 | 0.0422 | 0.7782 | 0.5295 | 0.0148 | 0.366 | 0.1504 | 0.0417 | 0.755 | 0.8214 | 10.3568 | 0.8304 | 0.9042 | 10.9109 | 0.7585 | 0.6928 | 2.4424 | 28.4702 | 10.4272 | 0.1041 | 21.1987 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 75 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | cyberagent/DeepSeek-R1-Distill-Qwen-32B-Japanese | 0.2749 | 0.0402 | 0.25 | 0.1361 | 0 | 0.6748 | 0 | 0.796 | 0.2664 | 0.193 | 0.5597 | 0.1074 | 0.248 | 0.8261 | 11.0019 | 0.8638 | 0.9344 | 14.2124 | 0.8309 | 0.25 | 0.3545 | 0.592 | 0.0014 | 0.9196 | 0.1089 | 0 | 0.3595 | 0.0069 | 0.3722 | 0.5597 | 0.8778 | 0.8498 | 0.7505 | 0 | 0.0402 | 0.1285 | 0 | 0.222 | 0.009 | 0.0257 | 0.0248 | 0 | 0.6211 | 0.7751 | 8.1994 | 0.7824 | 0.8841 | 8.6625 | 0.7068 | 0.6982 | 2.4046 | 29.7102 | 10.7517 | 0.1074 | 21.3666 | Qwen2ForCausalLM | bfloat16 | mit | 32.764 | 178 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | cyberagent/DeepSeek-R1-Distill-Qwen-32B-Japanese | 0.601 | 0.0402 | 0.586 | 0.2803 | 0.7474 | 0.8854 | 0.918 | 0.8457 | 0.767 | 0.5242 | 0.9097 | 0.1074 | 0.5472 | 0.8636 | 12.8788 | 0.9101 | 0.9534 | 16.845 | 0.8841 | 0.586 | 0.895 | 0.6494 | 0.7694 | 0.9562 | 0.5163 | 0.7238 | 0.8595 | 0.8024 | 0.7544 | 0.9097 | 0.9031 | 0.8755 | 0.8051 | 0.918 | 0.0402 | 0.1285 | 0.7709 | 0.5091 | 0.0127 | 0.374 | 0.1593 | 0.0712 | 0.7844 | 0.8247 | 10.5173 | 0.8351 | 0.9014 | 10.6611 | 0.7535 | 0.6982 | 2.4046 | 29.7102 | 10.7517 | 0.1074 | 21.3666 | Qwen2ForCausalLM | bfloat16 | mit | 32.764 | 178 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | cyberagent/DeepSeek-R1-Distill-Qwen-14B-Japanese | 0.253 | 0 | 0.2317 | 0.0406 | 0.3131 | 0.393 | 0.018 | 0.6805 | 0.3714 | 0.1185 | 0.5089 | 0.1072 | 0.0708 | 0.7744 | 6.0921 | 0.7583 | 0.8754 | 7.701 | 0.7673 | 0.2317 | 0.0797 | 0.4655 | 0.3458 | 0.6122 | 0.1289 | 0.1694 | 0.463 | 0.0884 | 0.4942 | 0.5089 | 0.4599 | 0.4776 | 0.4872 | 0.018 | 0 | 0 | 0.4567 | 0.1558 | 0.005 | 0.0031 | 0.0177 | 0 | 0.1774 | 0.7078 | 4.5242 | 0.6137 | 0.8084 | 3.87 | 0.5826 | 0.702 | 1.683 | 30.9219 | 10.7181 | 0.1072 | 22.9303 | Qwen2ForCausalLM | bfloat16 | mit | 14.77 | 49 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| πŸ”Ά : fine-tuned | cyberagent/DeepSeek-R1-Distill-Qwen-14B-Japanese | 0.5644 | 0 | 0.5718 | 0.2271 | 0.6729 | 0.8498 | 0.804 | 0.833 | 0.7556 | 0.4902 | 0.8962 | 0.1072 | 0.4801 | 0.8557 | 11.6901 | 0.902 | 0.9477 | 14.7898 | 0.8744 | 0.5718 | 0.8722 | 0.5833 | 0.7694 | 0.9428 | 0.5136 | 0.6365 | 0.8381 | 0.7797 | 0.8076 | 0.8962 | 0.8637 | 0.8333 | 0.7342 | 0.804 | 0 | 0 | 0.7092 | 0.477 | 0.0161 | 0.3131 | 0.0796 | 0.0522 | 0.6746 | 0.8067 | 8.927 | 0.818 | 0.8948 | 9.5409 | 0.7376 | 0.702 | 1.683 | 30.9219 | 10.7181 | 0.1072 | 22.9303 | Qwen2ForCausalLM | bfloat16 | mit | 14.77 | 49 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
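
Every model is evaluated at both 0-shot and 4-shot, and for many of them the two AVG scores differ sharply (deepseek-ai/DeepSeek-R1-Distill-Qwen-14B, for example, moves from 0.2159 at 0-shot to 0.5683 at 4-shot). A follow-up sketch, under the same hypothetical `leaderboard.csv` export as above, makes the comparison explicit:

```python
# Follow-up sketch: how much does 4-shot prompting help each model's AVG?
# Uses the same hypothetical "leaderboard.csv" export as the earlier sketch.
import pandas as pd

df = pd.read_csv("leaderboard.csv")

# One column per few-shot setting, then the 4-shot minus 0-shot difference.
pivot = df.pivot_table(index="model", columns="num_few_shot", values="AVG")
pivot["gain_from_4_shot"] = pivot[4] - pivot[0]
print(pivot.sort_values("gain_from_4_shot", ascending=False).to_string())
```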