214 results
Dataset schema (column, dtype, and the viewer's length/range/class stats):

| Column | Dtype | Stats |
| --- | --- | --- |
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74 – 52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0 – 6.08k |
| #Params (B) | float64 | -1 – 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 – 187 |
| IFEval Raw | float64 | 0 – 0.9 |
| IFEval | float64 | 0 – 90 |
| BBH Raw | float64 | 0.22 – 0.83 |
| BBH | float64 | 0.25 – 76.7 |
| MATH Lvl 5 Raw | float64 | 0 – 0.71 |
| MATH Lvl 5 | float64 | 0 – 71.5 |
| GPQA Raw | float64 | 0.21 – 0.47 |
| GPQA | float64 | 0 – 29.4 |
| MUSR Raw | float64 | 0.29 – 0.6 |
| MUSR | float64 | 0 – 38.5 |
| MMLU-PRO Raw | float64 | 0.1 – 0.73 |
| MMLU-PRO | float64 | 0 – 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 522 classes |
| Submission Date | string | 262 classes |
| Generation | int64 | 0 – 10 |
| Base Model | string | lengths 4–102 |
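Each record below carries one value per schema column, flattened onto consecutive lines. A minimal sketch of the row structure, abridged to a few columns (the Pythonized field names are an assumption of this sketch; the dataset itself uses the display names such as "Average ⬆️" and "#Params (B)"):

```python
from dataclasses import dataclass

@dataclass
class LeaderboardRow:
    # Abridged, Pythonized view of the schema columns above
    # (field names are hypothetical; the dataset uses display names).
    eval_name: str
    precision: str        # one of the 3 Precision classes, e.g. "float16"
    architecture: str     # one of the 64 Architecture classes
    average: float        # "Average ⬆️": mean of the six normalized scores
    hub_license: str
    params_b: float       # "#Params (B)"
    chat_template: bool
    co2_cost_kg: float    # "CO₂ cost (kg)"

# First record below, abridged to these fields:
row = LeaderboardRow(
    eval_name="Daemontatox_AetherDrake-SFT_float16",
    precision="float16",
    architecture="LlamaForCausalLM",
    average=22.917961,
    hub_license="apache-2.0",
    params_b=8.03,
    chat_template=False,
    co2_cost_kg=2.196694,
)
print(row.architecture)  # LlamaForCausalLM
```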
Daemontatox_AetherDrake-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/AetherDrake-SFT](https://huggingface.co/Daemontatox/AetherDrake-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherDrake-SFT-details)
Daemontatox/AetherDrake-SFT
17a0f90f0c06f2adc885faccd0a6172a7b996126
22.917961
apache-2.0
1
8.03
true
false
false
false
2.196694
0.48128
48.127967
0.487201
27.139252
0.151057
15.10574
0.32047
9.395973
0.408844
9.972135
0.3499
27.766696
false
false
2024-12-24
2024-12-25
1
Daemontatox/AetherDrake-SFT (Merge)
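The Average ⬆️ column appears to be the plain mean of the six normalized benchmark scores, and each normalized score appears to be the raw accuracy rescaled so the task's random-guess floor maps to 0 (0.25 for GPQA's four choices, 0.1 for MMLU-PRO's ten choices, 0 for IFEval and MATH Lvl 5; BBH and MUSR mix per-subtask floors, so the single-floor formula does not apply to them). A cross-check against the Daemontatox/AetherDrake-SFT record above, under that assumption:

```python
def normalize(raw: float, floor: float = 0.0) -> float:
    """Rescale raw accuracy so `floor` (random-guess baseline) maps to 0
    and a perfect score maps to 100. Assumed formula, not an official one."""
    return 100 * (raw - floor) / (1 - floor)

# Normalized benchmark scores from the record above.
scores = {
    "IFEval": 48.127967,
    "BBH": 27.139252,
    "MATH Lvl 5": 15.10574,
    "GPQA": 9.395973,
    "MUSR": 9.972135,
    "MMLU-PRO": 27.766696,
}

average = sum(scores.values()) / len(scores)
print(round(average, 3))                   # 22.918, matching the listed 22.917961

# Raw -> normalized checks (small drift expected from rounding in the dump):
print(round(normalize(0.32047, 0.25), 3))  # 9.396   (listed GPQA: 9.395973)
print(round(normalize(0.3499, 0.10), 3))   # 27.767  (listed MMLU-PRO: 27.766696)
```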
Daemontatox_CogitoZ_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/CogitoZ](https://huggingface.co/Daemontatox/CogitoZ) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoZ-details)
Daemontatox/CogitoZ
7079c4e915e6f549df9f1c3fa3a3260f9a835f48
39.383291
apache-2.0
0
32.764
true
false
false
true
8.863382
0.396724
39.672403
0.673449
53.889571
0.524169
52.416918
0.395134
19.35123
0.47926
19.940885
0.559259
51.028738
false
false
2025-01-03
2025-02-13
1
Daemontatox/CogitoZ (Merge)
Daemontatox_PathfinderAI_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/PathfinderAI](https://huggingface.co/Daemontatox/PathfinderAI) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathfinderAI-details)
Daemontatox/PathfinderAI
14c6a91351006b7be0aff85292733470ff1b546d
38.131314
apache-2.0
0
32.764
true
false
false
false
4.540918
0.374517
37.451739
0.666785
52.646547
0.475831
47.583082
0.394295
19.239374
0.485833
20.829167
0.559342
51.037973
false
false
2024-12-24
2024-12-25
1
Daemontatox/PathfinderAI (Merge)
Daemontatox_SphinX_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/SphinX](https://huggingface.co/Daemontatox/SphinX) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__SphinX-details)
Daemontatox/SphinX
3da400d648b198211c81f61421bdcefac8073506
29.87478
apache-2.0
2
7.616
true
false
false
false
1.304317
0.572504
57.250429
0.544058
34.712451
0.308157
30.81571
0.297819
6.375839
0.4405
12.695833
0.436586
37.398419
false
false
2024-12-21
2024-12-31
1
Daemontatox/SphinX (Merge)
Daemontatox_PathfinderAI_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/PathfinderAI](https://huggingface.co/Daemontatox/PathfinderAI) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathfinderAI-details)
Daemontatox/PathfinderAI
7271fc7d08fca9b12c49b40af6245a982273a5c3
36.548768
apache-2.0
0
32.764
true
false
false
true
9.451441
0.485501
48.550069
0.662734
52.322163
0.484139
48.413897
0.309564
7.941834
0.425594
11.599219
0.554189
50.465426
false
false
2024-12-24
2024-12-30
1
Daemontatox/PathfinderAI (Merge)
Daemontatox_PathFinderAi3.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/PathFinderAi3.0](https://huggingface.co/Daemontatox/PathFinderAi3.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathFinderAi3.0-details)
Daemontatox/PathFinderAi3.0
6c9aa17cee032523ce17de111d6865e33825cf1d
40.458694
apache-2.0
1
32.764
true
false
false
true
8.094724
0.427099
42.709899
0.688422
55.538355
0.504532
50.453172
0.408557
21.14094
0.480688
20.052604
0.575715
52.857196
false
false
2024-12-31
2025-01-21
1
Daemontatox/PathFinderAI3.0
Daemontatox_Sphinx2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/Sphinx2.0](https://huggingface.co/Daemontatox/Sphinx2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Sphinx2.0-details)
Daemontatox/Sphinx2.0
16abdfe2c214dc1da6bfe654b3d6716fcc8450e2
37.694185
apache-2.0
0
14.77
true
false
false
true
3.59265
0.712313
71.231333
0.647284
49.396752
0.401813
40.181269
0.293624
5.816555
0.426031
13.053906
0.518368
46.485298
false
false
2024-12-30
2024-12-30
1
Daemontatox/Sphinx2.0 (Merge)
Daemontatox_Llama3.3-70B-CogniLink_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/Llama3.3-70B-CogniLink](https://huggingface.co/Daemontatox/Llama3.3-70B-CogniLink) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Llama3.3-70B-CogniLink-details)
Daemontatox/Llama3.3-70B-CogniLink
69f134f69472a84d104d3ef0c0b1dd200b9a599d
42.774714
apache-2.0
1
70.554
true
false
false
true
32.378236
0.693104
69.31043
0.666833
52.124663
0.413897
41.389728
0.44547
26.06264
0.487698
21.395573
0.517287
46.365248
false
false
2025-01-10
2025-03-02
1
Daemontatox/Llama3.3-70B-CogniLink (Merge)
Daemontatox_mini-Cogito-R1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/mini-Cogito-R1](https://huggingface.co/Daemontatox/mini-Cogito-R1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__mini-Cogito-R1-details)
Daemontatox/mini-Cogito-R1
7d86cfe7522a080853a6c25f7115fa5106c9d671
11.629718
apache-2.0
4
1.777
true
false
false
false
0.610988
0.229837
22.983683
0.328049
6.038995
0.274924
27.492447
0.286913
4.9217
0.344698
2.98724
0.148188
5.354241
false
false
2025-02-22
2025-02-22
1
Daemontatox/mini-Cogito-R1 (Merge)
Daemontatox_AetherUncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/AetherUncensored](https://huggingface.co/Daemontatox/AetherUncensored) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherUncensored-details)
Daemontatox/AetherUncensored
e498d645faab591062c6919a98b35656e2d0c783
18.374864
0
8.03
false
false
false
false
1.478506
0.404193
40.41931
0.446313
21.678618
0.145015
14.501511
0.288591
5.145414
0.374677
9.501302
0.271027
19.003029
false
false
2025-01-09
0
Removed
Daemontatox_AetherTOT_bfloat16
bfloat16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
[Daemontatox/AetherTOT](https://huggingface.co/Daemontatox/AetherTOT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details)
Daemontatox/AetherTOT
71d99f8fb69276422daae61222e57087000c05b0
22.874708
apache-2.0
0
10.67
true
false
false
false
0.708698
0.43829
43.82904
0.503431
29.031857
0.14426
14.425982
0.323826
9.8434
0.405188
9.248438
0.377826
30.869533
false
false
2024-12-27
2024-12-28
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_CogitoDistil_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/CogitoDistil](https://huggingface.co/Daemontatox/CogitoDistil) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoDistil-details)
Daemontatox/CogitoDistil
f9a5302a0c4b464c44d79f745b8498ab51dd97de
17.180474
0
7.616
false
false
false
true
1.629079
0.277648
27.764775
0.367677
11.948759
0.392749
39.274924
0.259228
1.230425
0.37549
4.802865
0.26255
18.061096
false
false
2025-01-22
0
Removed
Daemontatox_DocumentCogito_float16
float16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
[Daemontatox/DocumentCogito](https://huggingface.co/Daemontatox/DocumentCogito) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__DocumentCogito-details)
Daemontatox/DocumentCogito
9bdbfd8f330754c4103822ce180e0e3e3ce0973e
29.108156
apache-2.0
1
10.67
true
false
false
true
0.711757
0.777035
77.703493
0.518673
31.184823
0.219789
21.978852
0.293624
5.816555
0.391052
7.548177
0.373753
30.417036
false
false
2025-01-16
2025-03-09
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_ReasonTest_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/ReasonTest](https://huggingface.co/Daemontatox/ReasonTest) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__ReasonTest-details)
Daemontatox/ReasonTest
8e81cfddd97a13d81d6207eb72be8b730a7ca12f
25.858233
0
3.808
false
false
false
false
1.340629
0.407965
40.796531
0.543526
35.375037
0.213746
21.374622
0.318792
9.17226
0.431542
12.076042
0.427194
36.354905
false
false
2024-12-31
0
Removed
Daemontatox_TinySphinx_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/TinySphinx](https://huggingface.co/Daemontatox/TinySphinx) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__TinySphinx-details)
Daemontatox/TinySphinx
62172ccb670864070581498fb12e7d2594ac3a77
8.167167
0
0.247
false
false
false
false
1.007256
0.25669
25.669003
0.330984
6.546576
0.043051
4.305136
0.27349
3.131991
0.33276
1.595052
0.169797
7.755245
false
false
2024-12-31
0
Removed
Daemontatox_CogitoZ14_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/CogitoZ14](https://huggingface.co/Daemontatox/CogitoZ14) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoZ14-details)
Daemontatox/CogitoZ14
df5320d7ff115f1e39e42506ed86a340eb2d12e0
34.38343
0
14.77
false
false
false
true
5.19345
0.663703
66.370342
0.629751
46.479352
0.422205
42.220544
0.316275
8.836689
0.405875
9.067708
0.399934
33.325946
false
false
2025-01-07
0
Removed
Daemontatox_NemoR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Daemontatox/NemoR](https://huggingface.co/Daemontatox/NemoR) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__NemoR-details)
Daemontatox/NemoR
688f1a4c3c69fe9c6440cad7919ab602ae61fa39
18.073998
0
6.124
false
false
false
false
2.261325
0.228738
22.873753
0.519407
31.60552
0.083082
8.308157
0.327181
10.290828
0.390802
9.916927
0.329039
25.448803
false
false
2024-12-31
0
Removed
Daemontatox_RA2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/RA2.0](https://huggingface.co/Daemontatox/RA2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA2.0-details)
Daemontatox/RA2.0
e1505dd5f9f2c8549cc852a1aca3ec545638e813
23.232563
0
7.616
false
false
false
false
1.327376
0.378389
37.838934
0.488869
28.471838
0.383686
38.36858
0.305369
7.38255
0.409125
9.373958
0.261636
17.959515
false
false
2025-01-01
0
Removed
Daemontatox_AetherSett_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/AetherSett](https://huggingface.co/Daemontatox/AetherSett) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherSett-details)
Daemontatox/AetherSett
d8d86c6dc1b693192931b02e39290eca331ae84e
31.420123
apache-2.0
1
7.616
true
false
false
false
1.964645
0.536959
53.69586
0.545162
34.744146
0.397281
39.728097
0.307886
7.718121
0.460312
16.205729
0.427859
36.428783
false
false
2024-12-30
2024-12-30
3
Qwen/Qwen2.5-7B
Daemontatox_DocumentCogito_bfloat16
bfloat16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
[Daemontatox/DocumentCogito](https://huggingface.co/Daemontatox/DocumentCogito) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__DocumentCogito-details)
Daemontatox/DocumentCogito
23dcfc6bf91d84db1c977b151fd0923270d3e3ef
24.220439
apache-2.0
1
10.67
true
false
false
false
1.413317
0.506434
50.643404
0.511156
29.793609
0.163142
16.314199
0.316275
8.836689
0.397313
8.597396
0.380236
31.137337
false
false
2025-01-16
2025-01-16
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_Llama_cot_float16
float16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
[Daemontatox/Llama_cot](https://huggingface.co/Daemontatox/Llama_cot) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Llama_cot-details)
Daemontatox/Llama_cot
e0b1e5ec44b5dac34aa3bf99e0faf7c6c3f1390f
27.115742
0
10.67
false
false
false
true
0.750703
0.754878
75.487817
0.483837
26.866583
0.202417
20.241692
0.291107
5.480984
0.38724
6.638281
0.351812
27.979093
false
false
2025-03-09
0
Removed
Daemontatox_TinySphinx2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/TinySphinx2.0](https://huggingface.co/Daemontatox/TinySphinx2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__TinySphinx2.0-details)
Daemontatox/TinySphinx2.0
accc28aa00084fe89801baa0885c291d18a031ec
7.583927
0
0.247
false
false
false
false
1.004172
0.253517
25.351733
0.316841
5.004029
0.032477
3.247734
0.268456
2.46085
0.33825
1.314583
0.173122
8.124631
false
false
2024-12-31
0
Removed
Daemontatox_MawaredT1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/MawaredT1](https://huggingface.co/Daemontatox/MawaredT1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__MawaredT1-details)
Daemontatox/MawaredT1
84a1d35d91b862a5cfc65988d4a0f65033b34c47
29.231298
apache-2.0
1
7.616
true
false
false
false
1.276958
0.41988
41.988036
0.521482
31.900788
0.302115
30.21148
0.334732
11.297539
0.470208
18.676042
0.471825
41.313904
false
false
2025-01-02
2025-01-02
2
arcee-ai/Meraj-Mini (Merge)
Daemontatox_PathFinderAI2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/PathFinderAI2.0](https://huggingface.co/Daemontatox/PathFinderAI2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathFinderAI2.0-details)
Daemontatox/PathFinderAI2.0
bf8cfd82d4ceceb133058a78e1fe48436b50568a
36.256652
apache-2.0
0
32.764
true
false
false
true
14.003082
0.454102
45.410178
0.665823
52.956513
0.507553
50.755287
0.302013
6.935123
0.421563
10.961979
0.554688
50.520833
false
false
2024-12-30
2025-01-21
4
Qwen/Qwen2.5-32B
Daemontatox_AetherTOT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MllamaForConditionalGeneration
[Daemontatox/AetherTOT](https://huggingface.co/Daemontatox/AetherTOT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details)
Daemontatox/AetherTOT
71d99f8fb69276422daae61222e57087000c05b0
23.178825
apache-2.0
0
10.67
true
false
false
false
1.397847
0.439764
43.976427
0.506606
29.436391
0.148792
14.879154
0.323826
9.8434
0.407854
9.781771
0.380402
31.155807
false
false
2024-12-27
2024-12-28
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_Cogito-MIS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Daemontatox/Cogito-MIS](https://huggingface.co/Daemontatox/Cogito-MIS) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Cogito-MIS-details)
Daemontatox/Cogito-MIS
c1d59d3bc93d7ae4816800e37333f375e1debabf
11.081962
0
23.572
false
false
false
true
1.765364
0.181452
18.145188
0.505998
29.07597
0.086103
8.610272
0.256711
0.894855
0.37676
4.928385
0.143534
4.837101
false
false
2025-02-18
0
Removed
Daemontatox_Phi-4-COT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/Phi-4-COT](https://huggingface.co/Daemontatox/Phi-4-COT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Phi-4-COT-details)
Daemontatox/Phi-4-COT
bfc745d1a347b74843671eb50687c2e88c07ec7d
26.128818
0
14.66
false
false
false
false
1.715153
0.179303
17.930314
0.617293
45.34299
0.22432
22.432024
0.33557
11.409396
0.453
15.158333
0.500499
44.499852
false
false
2025-01-11
0
Removed
Daemontatox_Zirel_1.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/Zirel_1.5](https://huggingface.co/Daemontatox/Zirel_1.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Zirel_1.5-details)
Daemontatox/Zirel_1.5
53af159f98d8b428e719287f759500f95b601ee2
14.243506
apache-2.0
0
1.544
true
false
false
true
0.579531
0.416758
41.675754
0.398467
15.082126
0.113293
11.329305
0.260067
1.342282
0.365813
3.326562
0.214345
12.705009
false
false
2025-03-04
2025-03-04
3
Qwen/Qwen2.5-Coder-1.5B-Instruct (Merge)
Daemontatox_mini_Pathfinder_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/mini_Pathfinder](https://huggingface.co/Daemontatox/mini_Pathfinder) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__mini_Pathfinder-details)
Daemontatox/mini_Pathfinder
20d12c01e831675a563c978900bcf291def5f7dd
19.872595
0
7.616
false
false
false
true
1.548263
0.296158
29.615753
0.395569
16.030028
0.475076
47.507553
0.258389
1.118568
0.378094
4.861719
0.280918
20.10195
false
false
2025-01-20
0
Removed
Daemontatox_Mini_QwQ_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/Mini_QwQ](https://huggingface.co/Daemontatox/Mini_QwQ) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Mini_QwQ-details)
Daemontatox/Mini_QwQ
e96df7ba6e989ee286da5d0b05a84525fdb56c53
30.832499
0
7.616
false
false
false
false
1.317404
0.449706
44.970567
0.554899
36.210285
0.419184
41.918429
0.303691
7.158837
0.46825
17.264583
0.437251
37.472296
false
false
2025-01-16
0
Removed
Daemontatox_RA_Reasoner_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/RA_Reasoner](https://huggingface.co/Daemontatox/RA_Reasoner) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA_Reasoner-details)
Daemontatox/RA_Reasoner
e799c6877cb70b6e78c1e337eaa58383040c8fa9
29.208003
apache-2.0
2
10.306
true
false
false
false
1.558147
0.559215
55.92151
0.605369
43.073008
0.212236
21.223565
0.331376
10.850112
0.396354
7.510938
0.43002
36.668883
false
false
2024-12-20
2024-12-25
2
tiiuae/Falcon3-10B-Base
Daemontatox_RA_Reasoner2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Daemontatox/RA_Reasoner2.0](https://huggingface.co/Daemontatox/RA_Reasoner2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA_Reasoner2.0-details)
Daemontatox/RA_Reasoner2.0
2a7477f34b171d2ae090e57abdbd997546dee242
29.039667
apache-2.0
0
10.306
true
false
false
false
1.573513
0.536634
53.663391
0.606247
43.070069
0.231118
23.111782
0.324664
9.955257
0.388354
7.177604
0.435339
37.2599
false
false
2024-12-29
2024-12-29
3
tiiuae/Falcon3-10B-Base
Daemontatox_Research_PathfinderAI_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/Research_PathfinderAI](https://huggingface.co/Daemontatox/Research_PathfinderAI) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Research_PathfinderAI-details)
Daemontatox/Research_PathfinderAI
eae32cc9dffa3a2493fd793f7b847e7bb3376853
9.365879
0
1.777
false
false
false
true
0.618841
0.345692
34.569165
0.287226
1.426346
0.16994
16.993958
0.240772
0
0.339396
1.757812
0.113032
1.447991
false
false
2025-02-21
0
Removed
Daemontatox_Zirel-7B-Math_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Daemontatox/Zirel-7B-Math](https://huggingface.co/Daemontatox/Zirel-7B-Math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Zirel-7B-Math-details)
Daemontatox/Zirel-7B-Math
104d5e9f5df50c0782ff1a830f7ec3c4943210f3
30.976625
apache-2.0
0
7.616
true
false
false
true
0.538753
0.663879
66.387851
0.54477
34.939441
0.197885
19.78852
0.326342
10.178971
0.478917
18.597917
0.423703
35.967051
false
false
2025-02-28
2025-02-28
3
Qwen/Qwen2.5-7B
Daemontatox_PixelParse_AI_bfloat16
bfloat16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
[Daemontatox/PixelParse_AI](https://huggingface.co/Daemontatox/PixelParse_AI) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PixelParse_AI-details)
Daemontatox/PixelParse_AI
cc94604b91fc38513ca61f11dd9e1de1c3cc3b3d
22.925061
apache-2.0
0
10.67
true
false
false
false
1.400219
0.43829
43.82904
0.503431
29.031857
0.147281
14.728097
0.323826
9.8434
0.405188
9.248438
0.377826
30.869533
false
false
2024-12-27
2024-12-29
2
meta-llama/Llama-3.2-11B-Vision-Instruct
netcat420_MFANN-SFT_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[netcat420/MFANN-SFT](https://huggingface.co/netcat420/MFANN-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-SFT-details)
netcat420/MFANN-SFT
247f2ce5841d38cef59b73a7f8af857627d254bf
17.932006
2
8.03
false
false
false
false
1.311601
0.368223
36.822298
0.485189
26.208533
0.059668
5.966767
0.316275
8.836689
0.372542
3.801042
0.33361
25.956708
false
false
2024-12-16
2024-12-20
1
netcat420/MFANN-SFT (Merge)
NTQAI_NxMobileLM-1.5B-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[NTQAI/NxMobileLM-1.5B-SFT](https://huggingface.co/NTQAI/NxMobileLM-1.5B-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/NTQAI__NxMobileLM-1.5B-SFT-details)
NTQAI/NxMobileLM-1.5B-SFT
c5095c4969a48999c99f0e34ba3db929a0b59b8b
18.734829
mit
3
1.544
true
false
false
true
1.678315
0.639224
63.922393
0.395718
16.162543
0.084592
8.459215
0.259228
1.230425
0.355521
2.440104
0.281749
20.194297
false
false
2025-01-15
2025-01-15
1
NTQAI/NxMobileLM-1.5B-SFT (Merge)
VAGOsolutions_SauerkrautLM-v2-14b-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-v2-14b-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-v2-14b-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-v2-14b-SFT
606ddc7819d4a5d9cd8618d5ede57e2bdd99a1ed
36.227856
apache-2.0
8
14.77
true
false
false
true
3.037849
0.694853
69.485299
0.621036
45.824351
0.32855
32.854985
0.33557
11.409396
0.417875
11.067708
0.520529
46.725399
false
false
2024-10-25
2024-11-04
1
VAGOsolutions/SauerkrautLM-v2-14b-SFT (Merge)
Ayush-Singh_Llama1B-sft-2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Ayush-Singh/Llama1B-sft-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ayush-Singh/Llama1B-sft-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ayush-Singh__Llama1B-sft-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ayush-Singh/Llama1B-sft-2
8979241089bc73efdb2b89c47fcadc90586d7688
3.169323
0
1.236
false
false
false
false
0.76697
0.137438
13.743755
0.283428
1.237571
0
0
0.245805
0
0.355208
2.734375
0.111702
1.300236
false
false
2025-01-28
2025-02-05
0
Ayush-Singh/Llama1B-sft-2
BlackBeenie_Llama-3.1-8B-OpenO1-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-OpenO1-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1
35e7781b9dff5aea29576709201d641e5f44440d
21.40041
apache-2.0
1
8.03
true
false
false
true
1.462856
0.512404
51.240376
0.478745
26.03429
0.152568
15.256798
0.268456
2.46085
0.361813
5.726563
0.349152
27.683585
false
false
2024-12-28
2024-12-29
1
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 (Merge)
NbAiLab_nb-llama-3.1-8B-sft_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/NbAiLab/nb-llama-3.1-8B-sft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NbAiLab/nb-llama-3.1-8B-sft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NbAiLab__nb-llama-3.1-8B-sft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NbAiLab/nb-llama-3.1-8B-sft
4afbe8f228a7c10155e6687bd337499726db0604
8.180261
llama3.1
0
8.03
true
false
false
true
1.47617
0.361578
36.157839
0.328151
5.952498
0.021903
2.190332
0.254195
0.559284
0.328729
1.757812
0.122174
2.4638
false
false
2024-11-25
2024-12-11
0
NbAiLab/nb-llama-3.1-8B-sft
vihangd_smart-dan-sft-v0.1_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vihangd/smart-dan-sft-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vihangd/smart-dan-sft-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vihangd__smart-dan-sft-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vihangd/smart-dan-sft-v0.1
924b4a09153d4061fa9d58f03b10cd7cde7e3084
3.871213
apache-2.0
0
0.379
true
false
false
false
0.722049
0.157646
15.764616
0.306177
3.125599
0.009819
0.981873
0.255034
0.671141
0.350188
1.106771
0.114195
1.577275
false
false
2024-08-09
2024-08-20
0
vihangd/smart-dan-sft-v0.1
DebateLabKIT_Llama-3.1-Argunaut-1-8B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DebateLabKIT__Llama-3.1-Argunaut-1-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT
e9d7396bc0fa3d1ff4c1f4b1a0d81a1d1a7e977c
24.113556
llama3.1
6
8.03
true
false
false
true
1.433957
0.551921
55.192112
0.482383
27.187827
0.145015
14.501511
0.283557
4.474273
0.450302
15.854427
0.347241
27.471188
false
false
2024-12-31
2025-01-02
1
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT (Merge)
allenai_Llama-3.1-Tulu-3-8B-SFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
allenai/Llama-3.1-Tulu-3-8B-SFT
4ddd761e6750e04ea3d468175f78463628bba860
22.596941
llama3.1
29
8.03
true
false
false
true
1.366493
0.74034
74.034008
0.387186
13.931208
0.117825
11.782477
0.277685
3.691275
0.426771
12.013021
0.281167
20.129654
false
true
2024-11-18
2024-11-22
1
allenai/Llama-3.1-Tulu-3-8B-SFT (Merge)
realtreetune_rho-1b-sft-MATH_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/realtreetune/rho-1b-sft-MATH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">realtreetune/rho-1b-sft-MATH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/realtreetune__rho-1b-sft-MATH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
realtreetune/rho-1b-sft-MATH
b5f93df6af679a860caac9a9598e0f70c326b4fb
5.569175
0
1.1
false
false
false
false
0.556268
0.212102
21.210167
0.314415
4.197623
0.034743
3.47432
0.252517
0.33557
0.345844
2.897135
0.111702
1.300236
false
false
2024-06-06
2024-10-05
1
realtreetune/rho-1b-sft-MATH (Merge)
CultriX_Qwen2.5-14B-Wernicke-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SFT
3b68dfba2cf79e4a15e8f4271f7d4b62d2ab9f26
33.549512
apache-2.0
2
14.77
true
false
false
true
2.786025
0.493744
49.374438
0.646059
49.330572
0.359517
35.951662
0.354027
13.870246
0.39
7.55
0.506981
45.220154
true
false
2024-11-16
2024-11-17
1
CultriX/Qwen2.5-14B-Wernicke-SFT (Merge)
OpenAssistant_oasst-sft-1-pythia-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/OpenAssistant/oasst-sft-1-pythia-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenAssistant/oasst-sft-1-pythia-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/OpenAssistant__oasst-sft-1-pythia-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
OpenAssistant/oasst-sft-1-pythia-12b
293df535fe7711a5726987fc2f17dfc87de452a1
3.68183
apache-2.0
278
12
true
false
false
false
1.776114
0.105539
10.553886
0.314663
4.778509
0.015106
1.510574
0.25755
1.006711
0.332698
2.98724
0.111287
1.254063
false
true
2023-03-09
2024-06-12
0
OpenAssistant/oasst-sft-1-pythia-12b
allenai_Llama-3.1-Tulu-3-70B-SFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
allenai/Llama-3.1-Tulu-3-70B-SFT
f58ab66db3a1c5dd805c6d3420b2b4f5aef30041
38.848492
llama3.1
6
70.554
true
false
false
true
54.676654
0.805062
80.506168
0.595144
42.023984
0.331571
33.1571
0.344799
12.639821
0.502615
24.49349
0.462434
40.27039
false
true
2024-11-18
2024-11-27
1
allenai/Llama-3.1-Tulu-3-70B-SFT (Merge)
kz919_QwQ-0.5B-Distilled-SFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/kz919/QwQ-0.5B-Distilled-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kz919/QwQ-0.5B-Distilled-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kz919__QwQ-0.5B-Distilled-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kz919/QwQ-0.5B-Distilled-SFT
06b5127157cad87614a851f7b7b2ec2a9b8bd49d
9.089107
apache-2.0
23
0.494
true
false
false
true
1.019595
0.307673
30.767253
0.325629
7.277629
0.074018
7.401813
0.260906
1.454139
0.340854
1.106771
0.158743
6.527039
false
false
2025-01-05
2025-01-10
1
kz919/QwQ-0.5B-Distilled-SFT (Merge)
prithivMLmods_Llama-3.1-8B-Open-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.1-8B-Open-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.1-8B-Open-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.1-8B-Open-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-3.1-8B-Open-SFT
e5d7fa281735f7fcc09fdb5810a2118789040d67
21.043704
creativeml-openrail-m
15
8.03
true
false
false
false
1.455588
0.412262
41.226169
0.496798
28.179928
0.121601
12.160121
0.309564
7.941834
0.390365
8.728906
0.352227
28.025266
false
false
2024-12-18
2025-01-12
1
prithivMLmods/Llama-3.1-8B-Open-SFT (Merge)
sthenno_tempesthenno-sft-0309-ckpt10_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-sft-0309-ckpt10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-sft-0309-ckpt10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-sft-0309-ckpt10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sthenno/tempesthenno-sft-0309-ckpt10
e13c4281c3cccf9fded2ec8c3b2ef6d24c906403
42.192397
apache-2.0
1
14.766
true
false
false
true
1.550085
0.774362
77.436203
0.655165
50.600903
0.472054
47.205438
0.371644
16.219239
0.436417
14.385417
0.525765
47.307181
false
false
2025-03-08
2025-03-08
1
sthenno/tempesthenno-sft-0309-ckpt10 (Merge)
thinkcoder_llama3-8b-instruct-lora-8-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/thinkcoder/llama3-8b-instruct-lora-8-sft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thinkcoder/llama3-8b-instruct-lora-8-sft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thinkcoder__llama3-8b-instruct-lora-8-sft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thinkcoder/llama3-8b-instruct-lora-8-sft
b76d81a09b15d92f92a8a22711983775ac999383
22.363644
0
8.03
false
false
false
true
0.71498
0.648042
64.804164
0.486501
27.203773
0.101964
10.196375
0.266779
2.237136
0.323458
2.232292
0.347573
27.508126
false
false
2025-03-10
2025-03-10
0
thinkcoder/llama3-8b-instruct-lora-8-sft
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.60325
0
2.506
false
false
false
true
1.921618
0.369247
36.924693
0.387878
14.117171
0.067976
6.797583
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
AALF_FuseChat-Llama-3.1-8B-SFT-preview_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/AALF/FuseChat-Llama-3.1-8B-SFT-preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AALF/FuseChat-Llama-3.1-8B-SFT-preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AALF__FuseChat-Llama-3.1-8B-SFT-preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
AALF/FuseChat-Llama-3.1-8B-SFT-preview
601f2b8c448acc5686656d3979ed732ce050b827
29.225292
1
8.03
false
false
false
true
1.368615
0.72805
72.805046
0.52403
32.536782
0.225076
22.507553
0.30453
7.270694
0.402
9.75
0.374335
30.481678
false
false
2024-11-20
2024-11-21
0
AALF/FuseChat-Llama-3.1-8B-SFT-preview
JayHyeon_Qwen2.5-0.5B-SFT-DPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-DPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1
0193e5d0320c7810011a6d9574e8657706eac706
6.245598
mit
0
0.494
true
false
false
true
0.961643
0.202459
20.245947
0.326814
6.136574
0.036254
3.625378
0.272651
3.020134
0.320917
0.78125
0.132979
3.664303
false
false
2024-12-28
2024-12-28
0
JayHyeon/Qwen2.5-0.5B-SFT-DPO-1epoch_v1
openbmb_MiniCPM-S-1B-sft-llama-format_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/openbmb/MiniCPM-S-1B-sft-llama-format" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openbmb/MiniCPM-S-1B-sft-llama-format</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openbmb__MiniCPM-S-1B-sft-llama-format-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openbmb/MiniCPM-S-1B-sft-llama-format
7de07f8895c168a7ee01f624f50c44f6966c9735
8.996066
apache-2.0
4
1
true
false
false
true
1.080074
0.332877
33.287677
0.304931
3.898455
0.030967
3.096677
0.270973
2.796421
0.331677
1.359635
0.185838
9.53753
false
false
2024-06-14
2024-11-19
0
openbmb/MiniCPM-S-1B-sft-llama-format
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.748862
0
8.03
false
false
false
true
1.507226
0.381712
38.171164
0.508777
30.88426
0.114048
11.404834
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
JayHyeon_Qwen2.5-0.5B-SFT-MDPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-MDPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-MDPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-MDPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-MDPO-1epoch_v1
9d73894bc39e9d994cdd154ef006eeb9a1b06b1b
6.644464
mit
0
0.494
true
false
false
true
0.960421
0.196414
19.64144
0.329258
6.524722
0.046828
4.682779
0.276007
3.467562
0.326156
1.802865
0.133727
3.747414
false
false
2024-12-27
2024-12-27
0
JayHyeon/Qwen2.5-0.5B-SFT-MDPO-1epoch_v1
princeton-nlp_Llama-3-Base-8B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT
b622b7d814aa03aa722328bf88feaf1ad480b7fb
15.964206
2
8.03
true
false
false
true
2.62091
0.279596
27.959592
0.464304
24.345967
0.04003
4.003021
0.297819
6.375839
0.411792
9.840625
0.309342
23.260195
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-RDPO
2a63a6d9e1978c99444e440371268f7c2b7e0375
16.490934
0
7.242
true
false
false
true
1.32501
0.460647
46.064664
0.443953
22.98201
0.021903
2.190332
0.277685
3.691275
0.357938
4.275521
0.277676
19.7418
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RDPO
ontocord_wide_3b_sft_stage1.2-ss1-expert_how-to_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_how-to-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to
2a9729fe267aa6a070236ebde081d11b02d4b42b
4.21527
apache-2.0
0
3.759
true
false
false
true
0.535195
0.124548
12.454842
0.30474
3.814087
0.01435
1.435045
0.259228
1.230425
0.365813
4.659896
0.115276
1.697326
false
false
2025-03-06
2025-03-06
0
ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to
princeton-nlp_Llama-3-Base-8B-SFT-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-IPO
85055cc4b9c707e0bd1239d20d1f62927a7a54c3
18.722473
0
8.03
true
false
false
true
1.864382
0.448656
44.865623
0.469007
25.705433
0.039275
3.927492
0.297819
6.375839
0.391948
7.960156
0.311503
23.500296
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-IPO
princeton-nlp_Llama-3-Base-8B-SFT-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-ORPO
54d58402e0168faff6503e41621ad6c8274a310a
19.268326
0
8.03
true
false
false
true
1.813126
0.451654
45.165383
0.473406
26.485894
0.046828
4.682779
0.313758
8.501119
0.370677
7.634635
0.308261
23.140145
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-ORPO
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-CPO
7f67394668b94a9ddfb64daff8976b48b135d96c
17.39897
1
7.242
true
false
false
true
1.619538
0.465493
46.549267
0.438215
21.857696
0.027946
2.794562
0.291946
5.592841
0.407083
9.252083
0.265126
18.34737
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-CPO
princeton-nlp_Llama-3-Base-8B-SFT-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-CPO
536ce7e7beb35175c48538fe46e7e9e100f228c9
15.953789
0
8.03
true
false
false
true
1.935692
0.370346
37.034624
0.459488
25.474649
0.054381
5.438066
0.274329
3.243848
0.360854
2.573438
0.297623
21.958112
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-CPO
princeton-nlp_Llama-3-Base-8B-SFT-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-DPO
3f5ec47c9beffb37cfbdcd837e76a336a9b1e651
18.376219
0
8.03
true
false
false
true
1.85268
0.411113
41.111251
0.466585
26.001874
0.041541
4.154079
0.310403
8.053691
0.38674
7.842448
0.307846
23.093972
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-DPO
princeton-nlp_Llama-3-Base-8B-SFT-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-RRHF
aea8c04b3940cebd1f8296a2c76914f0ce70c276
16.282724
0
8.03
true
false
false
true
1.902937
0.335725
33.572477
0.452036
23.659142
0.045317
4.531722
0.305369
7.38255
0.372229
7.561979
0.288896
20.988475
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-RRHF
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-DPO
17134fd80cfbf3980353967a30dc6f450f18f78f
16.311854
0
7.242
true
false
false
true
1.335239
0.440338
44.03383
0.435011
20.79098
0.021148
2.114804
0.272651
3.020134
0.412229
9.628646
0.264545
18.282728
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-DPO
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-RRHF
0d5861072e9d01f420451bf6a5b108bc8d3a76bc
16.182025
0
7.242
true
false
false
true
1.338002
0.440663
44.0663
0.428059
19.598831
0.024924
2.492447
0.290268
5.369128
0.418677
10.034635
0.239777
15.530807
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RRHF
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-SimPO
9d9e8b8de4f673d45bc826efc4a1444f9d480222
17.032015
0
7.242
true
false
false
true
1.271413
0.470064
47.006387
0.439805
22.332886
0.01435
1.435045
0.283557
4.474273
0.397063
8.032813
0.270196
18.910683
false
false
2024-05-17
2024-09-21
0
princeton-nlp/Mistral-7B-Base-SFT-SimPO
JayHyeon_Qwen2.5-0.5B-SFT-7e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-7e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-7e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-7e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-7e-5
49a3b8378a402807b855c0119e034098abe0d46f
6.761335
mit
0
0.63
true
false
false
false
1.681885
0.209254
20.925367
0.315818
6.357842
0.030211
3.021148
0.256711
0.894855
0.336698
2.453906
0.162234
6.914894
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1
c469240bdc78d707215b4e58d12a72c7b75abfb3
8.319017
mit
0
0.494
true
false
false
true
1.001503
0.260586
26.058636
0.330803
6.622365
0.049849
4.984894
0.280201
4.026846
0.328823
1.269531
0.162566
6.951832
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1
77f99c1b2a2d32c84d0cd986eb952927c3b77497
7.89622
mit
0
0.494
true
false
false
true
0.995978
0.252918
25.291781
0.326195
6.129987
0.056647
5.664653
0.268456
2.46085
0.330125
1.432292
0.15758
6.397754
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B
4c348a8dfc1be0b4985e0ed2882329515a60c19d
21.772674
llama3.2
1
3.213
true
false
false
true
1.419798
0.629157
62.91573
0.45815
23.34124
0.129909
12.990937
0.272651
3.020134
0.365875
4.867708
0.311503
23.500296
false
false
2024-10-14
2024-10-14
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B (Merge)
princeton-nlp_Llama-3-Base-8B-SFT-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-SimPO
0a6e518b13b67abe8433bce3f7beee9beb74a794
19.858509
0
8.03
false
false
false
true
1.72313
0.46854
46.854014
0.474125
26.39595
0.055136
5.513595
0.288591
5.145414
0.412688
11.852604
0.310505
23.38948
false
false
2024-05-24
2024-09-28
0
princeton-nlp/Llama-3-Base-8B-SFT-SimPO
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-KTO
02148bb9241b0f4bb0c75e93893eed005abe25e8
19.012992
0
7.242
true
false
false
true
1.332033
0.478482
47.848154
0.447643
23.107642
0.039275
3.927492
0.290268
5.369128
0.436781
13.03099
0.287151
20.794548
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-KTO
princeton-nlp_Llama-3-Base-8B-SFT-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-KTO
49a8c2e5ccc7a28ed7bbedf093e352015fc1eb9b
18.644616
0
8.03
true
false
false
true
1.723704
0.452253
45.225335
0.469285
25.55523
0.05287
5.287009
0.305369
7.38255
0.384198
5.591406
0.305436
22.826167
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-KTO
princeton-nlp_Llama-3-Base-8B-SFT-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-RDPO
b41a964c2135ba34dcc6fa7edf76b6b9ea656949
19.142302
0
8.03
true
false
false
true
1.804871
0.448007
44.800684
0.466201
25.526521
0.057402
5.740181
0.306208
7.494407
0.40274
8.909115
0.301446
22.382905
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-RDPO
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-IPO
eea781724e4d2ab8bdda7c13526f042de4cfae41
17.273368
0
7.242
true
false
false
true
1.334669
0.482953
48.295301
0.445802
23.703491
0.028701
2.870091
0.280201
4.026846
0.377625
4.836458
0.279172
19.908023
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-IPO
JayHyeon_Qwen2.5-0.5B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT
804606871a044917289c9ea22d335a80a0708cb6
6.611059
mit
0
0.63
true
false
false
false
1.727674
0.196365
19.636453
0.312075
4.434475
0.02719
2.719033
0.278523
3.803132
0.339427
1.595052
0.167304
7.478206
false
false
2024-12-27
2024-12-27
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5
7a00c50afd13d3020b200733e87309ea81126501
6.664257
mit
0
0.63
true
false
false
false
1.716113
0.198588
19.858753
0.313986
4.213684
0.037764
3.776435
0.268456
2.46085
0.346031
1.920573
0.169797
7.755245
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5
59b7674144f30acea5e4470e29cc4d59b48d5e8e
6.96911
mit
0
0.63
true
false
false
false
1.684916
0.206756
20.675585
0.320397
4.981848
0.037009
3.700906
0.269295
2.572707
0.348667
2.35
0.167803
7.533614
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-5e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-5e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-5e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-5e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-5e-5
60e9b653fa459a1c86b5b6753fbf53c65349c6c4
6.609557
mit
0
0.63
true
false
false
false
1.713076
0.200986
20.098561
0.310938
4.746973
0.033988
3.398792
0.267617
2.348993
0.338094
1.595052
0.167221
7.468972
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
migtissera_Tess-3-7B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-3-7B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-3-7B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-3-7B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-3-7B-SFT
404de3b56564dbd43cd64d97f8574b43189462f3
17.209456
apache-2.0
4
7.248
true
false
false
true
1.29434
0.394626
39.462626
0.460735
24.123847
0.04003
4.003021
0.270973
2.796421
0.411271
10.275521
0.303358
22.595301
false
false
2024-07-09
2024-07-20
1
mistralai/Mistral-7B-v0.3
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1
2736ff4f329f204d91dd47b8bc951945b7ccc572
8.150785
mit
0
0.494
true
false
false
true
1.004167
0.246873
24.687274
0.326031
6.126073
0.064955
6.495468
0.272651
3.020134
0.343365
2.18724
0.157497
6.38852
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
1286f51489b06fe67fa36d57aa87331fa37e698b
22.70174
llama3.2
0
3.213
true
false
false
true
1.427389
0.693054
69.305443
0.455617
23.808307
0.121601
12.160121
0.274329
3.243848
0.370031
4.053906
0.312749
23.638815
false
false
2024-10-12
2024-10-12
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B (Merge)
ontocord_wide_3b_sft_stage1.1-ss1-with_math.no_issue_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_math.no_issue-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue
411920369fdde84e168c9821ffb3a9cc1a260d0c
4.7675
0
3.759
false
false
false
true
0.270395
0.129819
12.981888
0.30519
3.133656
0.015861
1.586103
0.260067
1.342282
0.39276
7.928385
0.114694
1.632683
false
false
2025-03-06
2025-03-06
0
ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue
ontocord_wide_3b_sft_stage1.2-ss1-expert_news_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_news" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ontocord/wide_3b_sft_stage1.2-ss1-expert_news</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_news-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ontocord/wide_3b_sft_stage1.2-ss1-expert_news
303215f83d50a86121de57540d1285f592bc37ff
4.449975
0
3.759
false
false
false
false
0.263964
0.165814
16.581448
0.292588
1.943798
0.016616
1.661631
0.267617
2.348993
0.362094
2.928385
0.11112
1.235594
false
false
2025-03-05
2025-03-05
0
ontocord/wide_3b_sft_stage1.2-ss1-expert_news
JayHyeon_Qwen2.5-0.5B-SFT-1e-4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4
2303839a4f4f3a1ada55113c451177ad481eb647
5.941931
mit
0
0.63
true
false
false
false
1.714999
0.20196
20.195969
0.301709
4.331493
0.018882
1.888218
0.250839
0.111857
0.344635
2.246094
0.161902
6.877955
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415
467eff1ac1c3395c130929bbe1f34a8194715e7c
8.889815
apache-2.0
0
7.723
true
false
false
true
3.255423
0.289338
28.933785
0.380418
12.789212
0.011329
1.132931
0.246644
0
0.386063
6.024479
0.140126
4.458481
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205
467eff1ac1c3395c130929bbe1f34a8194715e7c
12.932104
apache-2.0
0
7.723
true
false
false
true
3.177996
0.319938
31.993777
0.395862
16.710725
0.008308
0.830816
0.276007
3.467562
0.427177
12.097135
0.212434
12.492612
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522
467eff1ac1c3395c130929bbe1f34a8194715e7c
13.424509
apache-2.0
0
7.723
true
false
false
true
3.229396
0.376441
37.644118
0.382837
14.138282
0.009063
0.906344
0.265101
2.013423
0.440417
14.11875
0.205535
11.726138
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
JayHyeon_Qwen2.5-0.5B-SFT-2e-4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-4
e6754bdc0fbeb7fc8d0df3c3677c0f70f9c4d3a8
5.555176
mit
0
0.63
true
false
false
false
1.707137
0.203434
20.343356
0.293555
3.114588
0.024169
2.416918
0.25755
1.006711
0.343427
1.861719
0.14129
4.587766
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
ruizhe1217_sft-s1-qwen-0.5b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ruizhe1217/sft-s1-qwen-0.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ruizhe1217/sft-s1-qwen-0.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ruizhe1217__sft-s1-qwen-0.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ruizhe1217/sft-s1-qwen-0.5b
2f8e051a801011cc906efe56c535aab5608aa341
9.240286
apache-2.0
0
0.494
true
false
false
false
0.535855
0.274875
27.487511
0.330054
8.276264
0.061934
6.193353
0.270973
2.796421
0.319583
0.78125
0.189162
9.906915
false
false
2025-02-26
2025-02-27
1
Qwen/Qwen2.5-0.5B
ontocord_wide_3b_sft_stage1.2-ss1-expert_math_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ontocord/wide_3b_sft_stage1.2-ss1-expert_math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ontocord/wide_3b_sft_stage1.2-ss1-expert_math
92eb1ef051529df71a66f1c7841781dcf9cbd4e7
5.136739
0
3.759
false
false
false
false
0.545443
0.191519
19.15185
0.305958
3.166491
0.027946
2.794562
0.259228
1.230425
0.370031
3.453906
0.109209
1.023197
false
false
2025-03-06
2025-03-06
0
ontocord/wide_3b_sft_stage1.2-ss1-expert_math
JayHyeon_Qwen2.5-0.5B-SFT-5e-5-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-5e-5-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-5e-5-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-5e-5-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-5e-5-2ep
74c7d21925c30d90602995e9b33b6662fe547109
6.850187
0
0.63
false
false
false
false
1.615044
0.217472
21.747186
0.317988
5.832626
0.037764
3.776435
0.260067
1.342282
0.336792
1.432292
0.162733
6.970301
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
SkyOrbis_SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000
4b2f6c40cc0b83c77d40805f23f300d90055641a
24.131333
0
7.616
false
false
false
false
1.286765
0.381887
38.188673
0.507796
31.327612
0.186556
18.655589
0.327181
10.290828
0.443604
13.950521
0.391373
32.374778
false
false
2025-01-31
2025-02-01
1
SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000 (Merge)
ontocord_wide_3b_sft_stage1.2-ss1-expert_software_float16
float16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_software" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ontocord/wide_3b_sft_stage1.2-ss1-expert_software</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_software-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ontocord/wide_3b_sft_stage1.2-ss1-expert_software
d3e034d69b18ca2ed506ff262c63ec8e1cf000bc
4.290233
0
3.759
false
false
false
false
0.277443
0.173383
17.338329
0.297996
2.499488
0.015861
1.586103
0.258389
1.118568
0.356854
1.640104
0.114029
1.558806
false
false
2025-03-05
2025-03-05
0
ontocord/wide_3b_sft_stage1.2-ss1-expert_software
JayHyeon_Qwen2.5-0.5B-SFT-1e-4-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep
b183a552d7e26053a6fb0ee01191835d7735b80d
6.345325
0
0.63
false
false
false
false
1.580355
0.21405
21.404983
0.317223
5.650884
0.026435
2.643505
0.246644
0
0.347271
2.408854
0.153674
5.963726
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep
27b7325a3e7b629ded08040ba017a6c63e3be68a
7.003092
0
0.63
false
false
false
false
1.694762
0.197064
19.706379
0.32247
5.619297
0.05287
5.287009
0.269295
2.572707
0.33676
1.595052
0.165143
7.238106
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B