Schema (column | dtype | observed string lengths, numeric range, or number of distinct values):

  eval_name            | string  | length 12–111
  Precision            | string  | 3 values
  Type                 | string  | 7 values
  T                    | string  | 7 values
  Weight type          | string  | 2 values
  Architecture         | string  | 64 values
  Model                | string  | length 355–689
  fullname             | string  | length 4–102
  Model sha            | string  | length 0–40
  Average ⬆️           | float64 | 0.74–52.1
  Hub License          | string  | 27 values
  Hub ❤️               | int64   | 0–6.09k
  #Params (B)          | float64 | -1–141
  Available on the hub | bool    | 2 classes
  MoE                  | bool    | 2 classes
  Flagged              | bool    | 2 classes
  Chat Template        | bool    | 2 classes
  CO₂ cost (kg)        | float64 | 0.04–187
  IFEval Raw           | float64 | 0–0.9
  IFEval               | float64 | 0–90
  BBH Raw              | float64 | 0.22–0.83
  BBH                  | float64 | 0.25–76.7
  MATH Lvl 5 Raw       | float64 | 0–0.71
  MATH Lvl 5           | float64 | 0–71.5
  GPQA Raw             | float64 | 0.21–0.47
  GPQA                 | float64 | 0–29.4
  MUSR Raw             | float64 | 0.29–0.6
  MUSR                 | float64 | 0–38.7
  MMLU-PRO Raw         | float64 | 0.1–0.73
  MMLU-PRO             | float64 | 0–70
  Merged               | bool    | 2 classes
  Official Providers   | bool    | 2 classes
  Upload To Hub Date   | string  | 525 values
  Submission Date      | string  | 263 values
  Generation           | int64   | 0–10
  Base Model           | string  | length 4–102
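The schema above describes one leaderboard row per model. A minimal stdlib-only sketch of how such rows can be represented and queried once parsed; the three records and their values are transcribed from this chunk, and any dataset name you would use to fetch the full table from the Hub is an assumption, not confirmed here:

```python
# Represent each leaderboard row as a dict keyed by the schema's column names.
# Values below are transcribed verbatim from records in this chunk.
rows = [
    {"fullname": "viettelsecurity-ai/security-llama3.2-3b",
     "Precision": "float16", "#Params (B)": 3.213, "Average ⬆️": 19.977384},
    {"fullname": "wanlige/li-14b-v0.4",
     "Precision": "bfloat16", "#Params (B)": 14.77, "Average ⬆️": 43.659962},
    {"fullname": "vonjack/MobileLLM-125M-HF",
     "Precision": "float16", "#Params (B)": 0.125, "Average ⬆️": 5.565352},
]

# Typical query: best-scoring model under 4B parameters.
small = [r for r in rows if r["#Params (B)"] < 4.0]
best = max(small, key=lambda r: r["Average ⬆️"])
print(best["fullname"])  # viettelsecurity-ai/security-llama3.2-3b
```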
viettelsecurity-ai_security-llama3.2-3b_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: viettelsecurity-ai/security-llama3.2-3b (https://huggingface.co/viettelsecurity-ai/security-llama3.2-3b; eval details: https://huggingface.co/datasets/open-llm-leaderboard/viettelsecurity-ai__security-llama3.2-3b-details)
  Model sha: a33cd2c208d3cefef12601f7dc9a290a218fafa3 | Hub License: (none) | Hub ❤️: 0 | #Params (B): 3.213 | CO₂ cost (kg): 0.606916
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 19.977384 | IFEval: 59.088884 (raw 0.590889) | BBH: 20.597406 (raw 0.440058) | MATH Lvl 5: 12.613293 (raw 0.126133) | GPQA: 3.243848 (raw 0.274329) | MUSR: 3.904948 (raw 0.337906) | MMLU-PRO: 20.415928 (raw 0.283743)
  Upload To Hub Date: 2025-03-03 | Submission Date: 2025-03-04 | Generation: 1 | Base Model: viettelsecurity-ai/security-llama3.2-3b (Merge)
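The "Average ⬆️" column is the arithmetic mean of the six scaled benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO); a quick check against the record above:

```python
# Scaled scores for viettelsecurity-ai/security-llama3.2-3b, transcribed from
# the record above. Their mean reproduces the reported Average ⬆️ of 19.977384.
scores = [59.088884, 20.597406, 12.613293, 3.243848, 3.904948, 20.415928]
average = sum(scores) / len(scores)
print(f"{average:.5f}")  # 19.97738
```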
vihangd_smart-dan-sft-v0.1_4bit
  Precision: 4bit | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vihangd/smart-dan-sft-v0.1 (https://huggingface.co/vihangd/smart-dan-sft-v0.1; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vihangd__smart-dan-sft-v0.1-details)
  Model sha: 924b4a09153d4061fa9d58f03b10cd7cde7e3084 | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 0.379 | CO₂ cost (kg): 0.722049
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 3.871213 | IFEval: 15.764616 (raw 0.157646) | BBH: 3.125599 (raw 0.306177) | MATH Lvl 5: 0.981873 (raw 0.009819) | GPQA: 0.671141 (raw 0.255034) | MUSR: 1.106771 (raw 0.350188) | MMLU-PRO: 1.577275 (raw 0.114195)
  Upload To Hub Date: 2024-08-09 | Submission Date: 2024-08-20 | Generation: 0 | Base Model: vihangd/smart-dan-sft-v0.1
voidful_smol-360m-ft_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: voidful/smol-360m-ft (https://huggingface.co/voidful/smol-360m-ft; eval details: https://huggingface.co/datasets/open-llm-leaderboard/voidful__smol-360m-ft-details)
  Model sha: 3889a38fc79d2400997e01bf1d00c8059d72fead | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 0.362 | CO₂ cost (kg): 0.763459
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 4.78993 | IFEval: 20.13103 (raw 0.20131) | BBH: 3.022706 (raw 0.301195) | MATH Lvl 5: 0.830816 (raw 0.008308) | GPQA: 0 (raw 0.245805) | MUSR: 3.78724 (raw 0.371365) | MMLU-PRO: 0.96779 (raw 0.10871)
  Upload To Hub Date: 2024-11-24 | Submission Date: 2024-11-28 | Generation: 1 | Base Model: voidful/smol-360m-ft (Merge)
vonjack_MobileLLM-125M-HF_float16
  Precision: float16 | Type: 🟢 pretrained | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vonjack/MobileLLM-125M-HF (https://huggingface.co/vonjack/MobileLLM-125M-HF; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__MobileLLM-125M-HF-details)
  Model sha: 7664f5e1b91faa04fac545f64db84c26316c7e63 | Hub License: cc-by-nc-4.0 | Hub ❤️: 0 | #Params (B): 0.125 | CO₂ cost (kg): 0.343623
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 5.565352 | IFEval: 21.072754 (raw 0.210728) | BBH: 3.146584 (raw 0.30273) | MATH Lvl 5: 0.906344 (raw 0.009063) | GPQA: 1.342282 (raw 0.260067) | MUSR: 5.106771 (raw 0.378187) | MMLU-PRO: 1.817376 (raw 0.116356)
  Upload To Hub Date: 2024-11-15 | Submission Date: 2024-11-15 | Generation: 0 | Base Model: vonjack/MobileLLM-125M-HF
vonjack_Phi-3-mini-4k-instruct-LLaMAfied_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vonjack/Phi-3-mini-4k-instruct-LLaMAfied (https://huggingface.co/vonjack/Phi-3-mini-4k-instruct-LLaMAfied; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3-mini-4k-instruct-LLaMAfied-details)
  Model sha: 96a48b8ea6f661f71ade001a0a2232b66ac38481 | Hub License: mit | Hub ❤️: 11 | #Params (B): 3.821 | CO₂ cost (kg): 0.90223
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 26.96808 | IFEval: 57.874883 (raw 0.578749) | BBH: 40.201852 (raw 0.574068) | MATH Lvl 5: 13.821752 (raw 0.138218) | GPQA: 10.738255 (raw 0.330537) | MUSR: 7.110938 (raw 0.392354) | MMLU-PRO: 32.060801 (raw 0.388547)
  Upload To Hub Date: 2024-04-24 | Submission Date: 2025-01-03 | Generation: 0 | Base Model: vonjack/Phi-3-mini-4k-instruct-LLaMAfied
vonjack_Phi-3.5-mini-instruct-hermes-fc-json_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Adapter | Architecture: ?
  Model: vonjack/Phi-3.5-mini-instruct-hermes-fc-json (https://huggingface.co/vonjack/Phi-3.5-mini-instruct-hermes-fc-json; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3.5-mini-instruct-hermes-fc-json-details)
  Model sha: 4cacfb35723647d408f0414886d0dfe67404a14f | Hub License: apache-2.0 | Hub ❤️: 1 | #Params (B): 4.132 | CO₂ cost (kg): 2.570377
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 4.642406 | IFEval: 14.158433 (raw 0.141584) | BBH: 2.390836 (raw 0.297476) | MATH Lvl 5: 0.755287 (raw 0.007553) | GPQA: 0.559284 (raw 0.254195) | MUSR: 8.45026 (raw 0.404135) | MMLU-PRO: 1.540337 (raw 0.113863)
  Upload To Hub Date: 2024-11-05 | Submission Date: 2024-11-05 | Generation: 1 | Base Model: vonjack/Phi-3.5-mini-instruct-hermes-fc-json (Merge)
vonjack_Qwen2.5-Coder-0.5B-Merged_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: vonjack/Qwen2.5-Coder-0.5B-Merged (https://huggingface.co/vonjack/Qwen2.5-Coder-0.5B-Merged; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Qwen2.5-Coder-0.5B-Merged-details)
  Model sha: 38e4789c0fc5fad359de2f7bafdb65c3ae26b95b | Hub License: (none) | Hub ❤️: 0 | #Params (B): 0.63 | CO₂ cost (kg): 0.993557
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 6.979693 | IFEval: 30.997088 (raw 0.309971) | BBH: 3.588738 (raw 0.307602) | MATH Lvl 5: 3.776435 (raw 0.037764) | GPQA: 0.447427 (raw 0.253356) | MUSR: 0.826302 (raw 0.330344) | MMLU-PRO: 2.242169 (raw 0.12018)
  Upload To Hub Date: 2024-11-19 | Submission Date: 2024-11-19 | Generation: 1 | Base Model: vonjack/Qwen2.5-Coder-0.5B-Merged (Merge)
vonjack_SmolLM2-1.7B-Merged_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vonjack/SmolLM2-1.7B-Merged (https://huggingface.co/vonjack/SmolLM2-1.7B-Merged; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-1.7B-Merged-details)
  Model sha: 232d54a335220b0d83d6036f6d8df3971d3e79bb | Hub License: (none) | Hub ❤️: 0 | #Params (B): 1.711 | CO₂ cost (kg): 0.622655
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 12.23423 | IFEval: 36.979658 (raw 0.369797) | BBH: 10.76653 (raw 0.358655) | MATH Lvl 5: 6.268882 (raw 0.062689) | GPQA: 3.914989 (raw 0.279362) | MUSR: 3.832292 (raw 0.340792) | MMLU-PRO: 11.643026 (raw 0.204787)
  Upload To Hub Date: 2024-11-18 | Submission Date: 2024-11-18 | Generation: 1 | Base Model: vonjack/SmolLM2-1.7B-Merged (Merge)
vonjack_SmolLM2-135M-Merged_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vonjack/SmolLM2-135M-Merged (https://huggingface.co/vonjack/SmolLM2-135M-Merged; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-135M-Merged-details)
  Model sha: a1700ca913a87ad713edfe57a2030a9d7c088970 | Hub License: (none) | Hub ❤️: 0 | #Params (B): 0.135 | CO₂ cost (kg): 0.691021
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 5.87243 | IFEval: 24.829674 (raw 0.248297) | BBH: 4.587041 (raw 0.309993) | MATH Lvl 5: 1.132931 (raw 0.011329) | GPQA: 0 (raw 0.238255) | MUSR: 3.440104 (raw 0.366187) | MMLU-PRO: 1.244829 (raw 0.111203)
  Upload To Hub Date: 2024-11-15 | Submission Date: 2024-11-15 | Generation: 1 | Base Model: vonjack/SmolLM2-135M-Merged (Merge)
vonjack_SmolLM2-360M-Merged_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: vonjack/SmolLM2-360M-Merged (https://huggingface.co/vonjack/SmolLM2-360M-Merged; eval details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-360M-Merged-details)
  Model sha: 32bceedf56b29a4a9fdd459a36fbc7fae5e274c8 | Hub License: (none) | Hub ❤️: 0 | #Params (B): 0.362 | CO₂ cost (kg): 0.771484
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 7.294377 | IFEval: 32.058715 (raw 0.320587) | BBH: 4.741734 (raw 0.315485) | MATH Lvl 5: 1.73716 (raw 0.017372) | GPQA: 0.782998 (raw 0.255872) | MUSR: 3.357813 (raw 0.352729) | MMLU-PRO: 1.08784 (raw 0.109791)
  Upload To Hub Date: 2024-11-15 | Submission Date: 2024-11-15 | Generation: 1 | Base Model: vonjack/SmolLM2-360M-Merged (Merge)
w4r10ck_SOLAR-10.7B-Instruct-v1.0-uncensored_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored (https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored; eval details: https://huggingface.co/datasets/open-llm-leaderboard/w4r10ck__SOLAR-10.7B-Instruct-v1.0-uncensored-details)
  Model sha: baa7b3899e85af4b2f02b01fd93f203872140d27 | Hub License: apache-2.0 | Hub ❤️: 36 | #Params (B): 10.732 | CO₂ cost (kg): 1.603942
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 21.621994 | IFEval: 38.84061 (raw 0.388406) | BBH: 33.858639 (raw 0.530153) | MATH Lvl 5: 6.570997 (raw 0.06571) | GPQA: 5.928412 (raw 0.294463) | MUSR: 18.49349 (raw 0.463948) | MMLU-PRO: 26.03982 (raw 0.334358)
  Upload To Hub Date: 2023-12-14 | Submission Date: 2024-10-11 | Generation: 0 | Base Model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
wanlige_li-14b-v0.4_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: wanlige/li-14b-v0.4 (https://huggingface.co/wanlige/li-14b-v0.4; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wanlige__li-14b-v0.4-details)
  Model sha: 28003038d56fc3a65f3d807e8c4a527b437075dc | Hub License: (none) | Hub ❤️: 15 | #Params (B): 14.77 | CO₂ cost (kg): 4.999545
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 43.659962 | IFEval: 81.327988 (raw 0.81328) | BBH: 50.384177 (raw 0.654446) | MATH Lvl 5: 55.740181 (raw 0.557402) | GPQA: 11.856823 (raw 0.338926) | MUSR: 16.35 (raw 0.446) | MMLU-PRO: 46.300606 (raw 0.516705)
  Upload To Hub Date: 2025-02-22 | Submission Date: 2025-02-26 | Generation: 1 | Base Model: wanlige/li-14b-v0.4 (Merge)
wanlige_li-14b-v0.4-slerp_float16
  Precision: float16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: wanlige/li-14b-v0.4-slerp (https://huggingface.co/wanlige/li-14b-v0.4-slerp; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wanlige__li-14b-v0.4-slerp-details)
  Model sha: 7ce44a61559fd66cb2eeace825f22321fc9ce269 | Hub License: (none) | Hub ❤️: 6 | #Params (B): 14.766 | CO₂ cost (kg): 1.988583
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 37.792625 | IFEval: 46.059677 (raw 0.460597) | BBH: 51.046628 (raw 0.658718) | MATH Lvl 5: 41.918429 (raw 0.419184) | GPQA: 20.022371 (raw 0.400168) | MUSR: 19.127083 (raw 0.47675) | MMLU-PRO: 48.58156 (raw 0.537234)
  Upload To Hub Date: 2025-02-24 | Submission Date: 2025-02-26 | Generation: 1 | Base Model: wanlige/li-14b-v0.4-slerp (Merge)
wanlige_li-14b-v0.4-slerp0.1_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: wanlige/li-14b-v0.4-slerp0.1 (https://huggingface.co/wanlige/li-14b-v0.4-slerp0.1; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wanlige__li-14b-v0.4-slerp0.1-details)
  Model sha: 7ce44a61559fd66cb2eeace825f22321fc9ce269 | Hub License: (none) | Hub ❤️: 6 | #Params (B): 14.766 | CO₂ cost (kg): 1.594935
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 42.906109 | IFEval: 79.227228 (raw 0.792272) | BBH: 50.881273 (raw 0.657174) | MATH Lvl 5: 53.323263 (raw 0.533233) | GPQA: 14.541387 (raw 0.35906) | MUSR: 11.75 (raw 0.420667) | MMLU-PRO: 47.713505 (raw 0.529422)
  Upload To Hub Date: 2025-02-24 | Submission Date: 2025-02-26 | Generation: 1 | Base Model: wanlige/li-14b-v0.4-slerp0.1 (Merge)
wannaphong_KhanomTanLLM-Instruct_bfloat16
  Precision: bfloat16 | Type: 💬 chat models (RLHF, DPO, IFT, ...) | Weight type: Original | Architecture: LlamaForCausalLM
  Model: wannaphong/KhanomTanLLM-Instruct (https://huggingface.co/wannaphong/KhanomTanLLM-Instruct; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wannaphong__KhanomTanLLM-Instruct-details)
  Model sha: 351239c92c0ff3304d1dd98fdf4ac054a8c1acc3 | Hub License: apache-2.0 | Hub ❤️: 3 | #Params (B): 3.447 | CO₂ cost (kg): 0.803461
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 4.819284 | IFEval: 16.211763 (raw 0.162118) | BBH: 3.944866 (raw 0.309312) | MATH Lvl 5: 1.359517 (raw 0.013595) | GPQA: 1.789709 (raw 0.263423) | MUSR: 4.291146 (raw 0.370062) | MMLU-PRO: 1.318706 (raw 0.111868)
  Upload To Hub Date: 2024-08-24 | Submission Date: 2024-08-29 | Generation: 0 | Base Model: wannaphong/KhanomTanLLM-Instruct
waqasali1707_Beast-Soul-new_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: MistralForCausalLM
  Model: waqasali1707/Beast-Soul-new (https://huggingface.co/waqasali1707/Beast-Soul-new; eval details: https://huggingface.co/datasets/open-llm-leaderboard/waqasali1707__Beast-Soul-new-details)
  Model sha: a23d68c4556d91a129de3f8fd8b9e0ff0890f4cc | Hub License: (none) | Hub ❤️: 0 | #Params (B): 7.242 | CO₂ cost (kg): 1.273776
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 22.108388 | IFEval: 50.298652 (raw 0.502987) | BBH: 33.044262 (raw 0.522495) | MATH Lvl 5: 7.024169 (raw 0.070242) | GPQA: 4.362416 (raw 0.282718) | MUSR: 14.503646 (raw 0.448563) | MMLU-PRO: 23.417184 (raw 0.310755)
  Upload To Hub Date: 2024-08-07 | Submission Date: 2024-08-07 | Generation: 1 | Base Model: waqasali1707/Beast-Soul-new (Merge)
wave-on-discord_qwent-7b_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: wave-on-discord/qwent-7b (https://huggingface.co/wave-on-discord/qwent-7b; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wave-on-discord__qwent-7b-details)
  Model sha: 40000e76d2a4d0ad054aff9fe873c5beb0e4925e | Hub License: (none) | Hub ❤️: 0 | #Params (B): 7.616 | CO₂ cost (kg): 2.646992
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 8.797033 | IFEval: 20.148539 (raw 0.201485) | BBH: 18.066398 (raw 0.42281) | MATH Lvl 5: 0.377644 (raw 0.003776) | GPQA: 2.013423 (raw 0.265101) | MUSR: 5.473698 (raw 0.381656) | MMLU-PRO: 6.702497 (raw 0.160322)
  Upload To Hub Date: 2024-09-30 | Submission Date: 2024-09-30 | Generation: 1 | Base Model: wave-on-discord/qwent-7b (Merge)
weathermanj_Menda-3B-500_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: weathermanj/Menda-3B-500 (https://huggingface.co/weathermanj/Menda-3B-500; eval details: https://huggingface.co/datasets/open-llm-leaderboard/weathermanj__Menda-3B-500-details)
  Model sha: aff308a2ed453aa67e059bdf16a9eba2c72f2497 | Hub License: other | Hub ❤️: 0 | #Params (B): 3.086 | CO₂ cost (kg): 0.729166
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 27.91006 | IFEval: 63.530211 (raw 0.635302) | BBH: 26.596425 (raw 0.476631) | MATH Lvl 5: 37.23565 (raw 0.372356) | GPQA: 5.033557 (raw 0.287752) | MUSR: 7.565625 (raw 0.396792) | MMLU-PRO: 27.498892 (raw 0.34749)
  Upload To Hub Date: 2025-03-10 | Submission Date: 2025-03-10 | Generation: 0 | Base Model: weathermanj/Menda-3B-500
weathermanj_Menda-3b-750_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: weathermanj/Menda-3b-750 (https://huggingface.co/weathermanj/Menda-3b-750; eval details: https://huggingface.co/datasets/open-llm-leaderboard/weathermanj__Menda-3b-750-details)
  Model sha: 0aeddae1b5f658ff21023e134438c030e90955de | Hub License: other | Hub ❤️: 1 | #Params (B): 3.086 | CO₂ cost (kg): 0.740785
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 27.833449 | IFEval: 63.350355 (raw 0.633504) | BBH: 26.375984 (raw 0.473683) | MATH Lvl 5: 37.160121 (raw 0.371601) | GPQA: 5.033557 (raw 0.287752) | MUSR: 7.240104 (raw 0.394187) | MMLU-PRO: 27.840573 (raw 0.350565)
  Upload To Hub Date: 2025-03-09 | Submission Date: 2025-03-09 | Generation: 0 | Base Model: weathermanj/Menda-3b-750
weathermanj_Menda-3b-Optim-100_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: weathermanj/Menda-3b-Optim-100 (https://huggingface.co/weathermanj/Menda-3b-Optim-100; eval details: https://huggingface.co/datasets/open-llm-leaderboard/weathermanj__Menda-3b-Optim-100-details)
  Model sha: fdf027643cafe4c8b88368928eab13a366a4c546 | Hub License: other | Hub ❤️: 0 | #Params (B): 3.086 | CO₂ cost (kg): 0.729069
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 27.957516 | IFEval: 63.982345 (raw 0.639823) | BBH: 26.024032 (raw 0.47348) | MATH Lvl 5: 37.160121 (raw 0.371601) | GPQA: 5.257271 (raw 0.28943) | MUSR: 7.979427 (raw 0.399302) | MMLU-PRO: 27.341903 (raw 0.346077)
  Upload To Hub Date: 2025-03-10 | Submission Date: 2025-03-10 | Generation: 0 | Base Model: weathermanj/Menda-3b-Optim-100
weathermanj_Menda-3b-Optim-200_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: weathermanj/Menda-3b-Optim-200 (https://huggingface.co/weathermanj/Menda-3b-Optim-200; eval details: https://huggingface.co/datasets/open-llm-leaderboard/weathermanj__Menda-3b-Optim-200-details)
  Model sha: 3328a697d8dc4d1421119475f5c56bd3ede751d4 | Hub License: other | Hub ❤️: 0 | #Params (B): 3.086 | CO₂ cost (kg): 0.694797
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 27.967747 | IFEval: 63.747523 (raw 0.637475) | BBH: 26.072131 (raw 0.474606) | MATH Lvl 5: 37.311178 (raw 0.373112) | GPQA: 4.362416 (raw 0.282718) | MUSR: 8.71276 (raw 0.403302) | MMLU-PRO: 27.600473 (raw 0.348404)
  Upload To Hub Date: 2025-03-10 | Submission Date: 2025-03-10 | Generation: 0 | Base Model: weathermanj/Menda-3b-Optim-200
win10_ArliAI-RPMax-v1.3-merge-13.3B_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: LlamaForCausalLM
  Model: win10/ArliAI-RPMax-v1.3-merge-13.3B (https://huggingface.co/win10/ArliAI-RPMax-v1.3-merge-13.3B; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__ArliAI-RPMax-v1.3-merge-13.3B-details)
  Model sha: 4d3ed351827f1afc1652e13aafeb1eae79b8f562 | Hub License: (none) | Hub ❤️: 0 | #Params (B): 13.265 | CO₂ cost (kg): 2.90261
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 16.53163 | IFEval: 30.382607 (raw 0.303826) | BBH: 23.0298 (raw 0.458139) | MATH Lvl 5: 3.927492 (raw 0.039275) | GPQA: 3.243848 (raw 0.274329) | MUSR: 14.163802 (raw 0.43251) | MMLU-PRO: 24.442228 (raw 0.31998)
  Upload To Hub Date: 2024-11-16 | Submission Date: 2024-11-17 | Generation: 1 | Base Model: win10/ArliAI-RPMax-v1.3-merge-13.3B (Merge)
win10_Breeze-13B-32k-Instruct-v1_0_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: MistralForCausalLM
  Model: win10/Breeze-13B-32k-Instruct-v1_0 (https://huggingface.co/win10/Breeze-13B-32k-Instruct-v1_0; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__Breeze-13B-32k-Instruct-v1_0-details)
  Model sha: 220c957cf5d9c534a4ef75c11a18221c461de40a | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 12.726 | CO₂ cost (kg): 2.897622
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: true | Official Providers: false
  Average ⬆️: 15.461558 | IFEval: 35.843118 (raw 0.358431) | BBH: 25.258699 (raw 0.461123) | MATH Lvl 5: 1.283988 (raw 0.01284) | GPQA: 1.901566 (raw 0.264262) | MUSR: 11.058073 (raw 0.420198) | MMLU-PRO: 17.423907 (raw 0.256815)
  Upload To Hub Date: 2024-06-26 | Submission Date: 2024-06-26 | Generation: 0 | Base Model: win10/Breeze-13B-32k-Instruct-v1_0
win10_EVA-Norns-Qwen2.5-v0.1_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: win10/EVA-Norns-Qwen2.5-v0.1 (https://huggingface.co/win10/EVA-Norns-Qwen2.5-v0.1; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__EVA-Norns-Qwen2.5-v0.1-details)
  Model sha: 90c3ca66e700b4a7d2c509634f9b9748a2e4c3ab | Hub License: (none) | Hub ❤️: 1 | #Params (B): 7.616 | CO₂ cost (kg): 1.313322
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 26.432797 | IFEval: 62.196306 (raw 0.621963) | BBH: 30.060942 (raw 0.507241) | MATH Lvl 5: 26.132931 (raw 0.261329) | GPQA: 4.697987 (raw 0.285235) | MUSR: 8.563802 (raw 0.40451) | MMLU-PRO: 26.944814 (raw 0.342503)
  Upload To Hub Date: 2024-11-17 | Submission Date: 2024-11-18 | Generation: 1 | Base Model: win10/EVA-Norns-Qwen2.5-v0.1 (Merge)
win10_Llama-3.2-3B-Instruct-24-9-29_float16
  Precision: float16 | Type: 💬 chat models (RLHF, DPO, IFT, ...) | Weight type: Original | Architecture: LlamaForCausalLM
  Model: win10/Llama-3.2-3B-Instruct-24-9-29 (https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details)
  Model sha: 4defb10e2415111abb873d695dd40c387c1d6d57 | Hub License: llama3.2 | Hub ❤️: 0 | #Params (B): 3.213 | CO₂ cost (kg): 1.427211
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 24.004698 | IFEval: 73.322119 (raw 0.733221) | BBH: 24.196426 (raw 0.461423) | MATH Lvl 5: 17.069486 (raw 0.170695) | GPQA: 3.243848 (raw 0.274329) | MUSR: 1.440104 (raw 0.355521) | MMLU-PRO: 24.756206 (raw 0.322806)
  Upload To Hub Date: 2024-09-29 | Submission Date: 2024-10-11 | Generation: 2 | Base Model: meta-llama/Llama-3.2-3B-Instruct
win10_Norns-Qwen2.5-12B_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: win10/Norns-Qwen2.5-12B (https://huggingface.co/win10/Norns-Qwen2.5-12B; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-12B-details)
  Model sha: 464793295c8633a95e6faedad24dfa8f0fd35663 | Hub License: (none) | Hub ❤️: 1 | #Params (B): 12.277 | CO₂ cost (kg): 3.245944
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 17.708128 | IFEval: 48.969734 (raw 0.489697) | BBH: 23.769257 (raw 0.461892) | MATH Lvl 5: 8.383686 (raw 0.083837) | GPQA: 4.474273 (raw 0.283557) | MUSR: 2.202865 (raw 0.35549) | MMLU-PRO: 18.448951 (raw 0.266041)
  Upload To Hub Date: 2024-11-17 | Submission Date: 2024-11-17 | Generation: 1 | Base Model: win10/Norns-Qwen2.5-12B (Merge)
win10_Norns-Qwen2.5-7B_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: win10/Norns-Qwen2.5-7B (https://huggingface.co/win10/Norns-Qwen2.5-7B; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-7B-details)
  Model sha: 148d9156f734a8050812892879cf13d1ca01f137 | Hub License: (none) | Hub ❤️: 0 | #Params (B): 7.616 | CO₂ cost (kg): 1.299827
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 26.38079 | IFEval: 61.222113 (raw 0.612221) | BBH: 30.250415 (raw 0.507289) | MATH Lvl 5: 26.283988 (raw 0.26284) | GPQA: 4.58613 (raw 0.284396) | MUSR: 9.126563 (raw 0.408479) | MMLU-PRO: 26.815529 (raw 0.34134)
  Upload To Hub Date: 2024-11-17 | Submission Date: 2024-11-18 | Generation: 1 | Base Model: win10/Norns-Qwen2.5-7B (Merge)
win10_Qwen2.5-2B-Instruct_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: win10/Qwen2.5-2B-Instruct (https://huggingface.co/win10/Qwen2.5-2B-Instruct; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__Qwen2.5-2B-Instruct-details)
  Model sha: 6cc7fca3447d50772978d2d7dec255abdc73d54b | Hub License: (none) | Hub ❤️: 1 | #Params (B): 2.9 | CO₂ cost (kg): 2.052183
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 10.570677 | IFEval: 22.728915 (raw 0.227289) | BBH: 12.071946 (raw 0.370591) | MATH Lvl 5: 2.265861 (raw 0.022659) | GPQA: 2.348993 (raw 0.267617) | MUSR: 13.630469 (raw 0.437844) | MMLU-PRO: 10.377881 (raw 0.193401)
  Upload To Hub Date: 2024-10-11 | Submission Date: 2024-12-20 | Generation: 1 | Base Model: win10/Qwen2.5-2B-Instruct (Merge)
win10_llama3-13.45b-Instruct_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: win10/llama3-13.45b-Instruct (https://huggingface.co/win10/llama3-13.45b-Instruct; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details)
  Model sha: 94cc0f415e355c6d3d47168a6ff5239ca586904a | Hub License: llama3 | Hub ❤️: 1 | #Params (B): 13.265 | CO₂ cost (kg): 4.273069
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: true | Official Providers: false
  Average ⬆️: 17.340222 | IFEval: 41.443481 (raw 0.414435) | BBH: 26.67569 (raw 0.486542) | MATH Lvl 5: 2.416918 (raw 0.024169) | GPQA: 1.118568 (raw 0.258389) | MUSR: 6.328385 (raw 0.38476) | MMLU-PRO: 26.058289 (raw 0.334525)
  Upload To Hub Date: 2024-06-09 | Submission Date: 2024-06-26 | Generation: 1 | Base Model: win10/llama3-13.45b-Instruct (Merge)
win10_miscii-14b-1M-0128_bfloat16
  Precision: bfloat16 | Type: 🤝 base merges and moerges | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: win10/miscii-14b-1M-0128 (https://huggingface.co/win10/miscii-14b-1M-0128; eval details: https://huggingface.co/datasets/open-llm-leaderboard/win10__miscii-14b-1M-0128-details)
  Model sha: dfe9d4fbb26228489f18691f045ac9ef309dc3bd | Hub License: apache-2.0 | Hub ❤️: 2 | #Params (B): 14.766 | CO₂ cost (kg): 3.776966
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: false | Merged: true | Official Providers: false
  Average ⬆️: 35.339596 | IFEval: 41.80818 (raw 0.418082) | BBH: 37.274344 (raw 0.574199) | MATH Lvl 5: 47.734139 (raw 0.477341) | GPQA: 17.673378 (raw 0.38255) | MUSR: 28.754688 (raw 0.543104) | MMLU-PRO: 38.792849 (raw 0.449136)
  Upload To Hub Date: 2025-01-28 | Submission Date: 2025-01-30 | Generation: 1 | Base Model: win10/miscii-14b-1M-0128 (Merge)
winglian_Llama-3-8b-64k-PoSE_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: winglian/Llama-3-8b-64k-PoSE (https://huggingface.co/winglian/Llama-3-8b-64k-PoSE; eval details: https://huggingface.co/datasets/open-llm-leaderboard/winglian__Llama-3-8b-64k-PoSE-details)
  Model sha: 5481d9b74a3ec5a95789673e194c8ff86e2bc2bc | Hub License: (none) | Hub ❤️: 75 | #Params (B): 8.03 | CO₂ cost (kg): 1.822042
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 11.143207 | IFEval: 28.569086 (raw 0.285691) | BBH: 13.307317 (raw 0.370218) | MATH Lvl 5: 4.154079 (raw 0.041541) | GPQA: 1.454139 (raw 0.260906) | MUSR: 3.077344 (raw 0.339552) | MMLU-PRO: 16.297281 (raw 0.246676)
  Upload To Hub Date: 2024-04-24 | Submission Date: 2024-06-26 | Generation: 0 | Base Model: winglian/Llama-3-8b-64k-PoSE
winglian_llama-3-8b-256k-PoSE_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: winglian/llama-3-8b-256k-PoSE (https://huggingface.co/winglian/llama-3-8b-256k-PoSE; eval details: https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details)
  Model sha: 93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b | Hub License: (none) | Hub ❤️: 42 | #Params (B): 8.03 | CO₂ cost (kg): 2.101446
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 6.633244 | IFEval: 29.091145 (raw 0.290911) | BBH: 5.502849 (raw 0.315658) | MATH Lvl 5: 1.963746 (raw 0.019637) | GPQA: 1.006711 (raw 0.25755) | MUSR: 0.94401 (raw 0.331552) | MMLU-PRO: 1.291002 (raw 0.111619)
  Upload To Hub Date: 2024-04-26 | Submission Date: 2024-06-26 | Generation: 0 | Base Model: winglian/llama-3-8b-256k-PoSE
wzhouad_gemma-2-9b-it-WPO-HB_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Gemma2ForCausalLM
  Model: wzhouad/gemma-2-9b-it-WPO-HB (https://huggingface.co/wzhouad/gemma-2-9b-it-WPO-HB; eval details: https://huggingface.co/datasets/open-llm-leaderboard/wzhouad__gemma-2-9b-it-WPO-HB-details)
  Model sha: 5934cb2faf589341e96e2e79aec82b2d4b7be252 | Hub License: (none) | Hub ❤️: 34 | #Params (B): 9.242 | CO₂ cost (kg): 5.179895
  Available on the hub: false | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 24.977569 | IFEval: 54.370293 (raw 0.543703) | BBH: 36.661696 (raw 0.562862) | MATH Lvl 5: 15.332326 (raw 0.153323) | GPQA: 13.310962 (raw 0.349832) | MUSR: 3.965625 (raw 0.367458) | MMLU-PRO: 26.224512 (raw 0.336021)
  Upload To Hub Date: 2024-08-08 | Submission Date: 2025-01-07 | Generation: 2 | Base Model: google/gemma-2-9b
x0000001_Deepseek-Lumen-R1-Qwen2.5-14B_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: Qwen2ForCausalLM
  Model: x0000001/Deepseek-Lumen-R1-Qwen2.5-14B (https://huggingface.co/x0000001/Deepseek-Lumen-R1-Qwen2.5-14B; eval details: https://huggingface.co/datasets/open-llm-leaderboard/x0000001__Deepseek-Lumen-R1-Qwen2.5-14B-details)
  Model sha: a5dfd03848e2d1accf4e3de52fa565d27f4bcf99 | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 14.77 | CO₂ cost (kg): 4.474911
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: true | Official Providers: false
  Average ⬆️: 26.028525 | IFEval: 44.361073 (raw 0.443611) | BBH: 22.72526 (raw 0.456905) | MATH Lvl 5: 27.794562 (raw 0.277946) | GPQA: 4.697987 (raw 0.285235) | MUSR: 19.046094 (raw 0.473969) | MMLU-PRO: 37.546173 (raw 0.437916)
  Upload To Hub Date: 2025-01-28 | Submission Date: 2025-01-29 | Generation: 1 | Base Model: x0000001/Deepseek-Lumen-R1-Qwen2.5-14B (Merge)
xMaulana_FinMatcha-3B-Instruct_float16
  Precision: float16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: xMaulana/FinMatcha-3B-Instruct (https://huggingface.co/xMaulana/FinMatcha-3B-Instruct; eval details: https://huggingface.co/datasets/open-llm-leaderboard/xMaulana__FinMatcha-3B-Instruct-details)
  Model sha: be2c0c04fc4dc3fb93631e3c663721da92fea8fc | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 3.213 | CO₂ cost (kg): 7.313148
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: true | Merged: false | Official Providers: false
  Average ⬆️: 24.142124 | IFEval: 75.48283 (raw 0.754828) | BBH: 23.191023 (raw 0.453555) | MATH Lvl 5: 14.350453 (raw 0.143505) | GPQA: 2.572707 (raw 0.269295) | MUSR: 5.016667 (raw 0.363333) | MMLU-PRO: 24.239066 (raw 0.318152)
  Upload To Hub Date: 2024-09-29 | Submission Date: 2024-10-22 | Generation: 1 | Base Model: xMaulana/FinMatcha-3B-Instruct (Merge)
xinchen9_Llama3.1_8B_Instruct_CoT_bfloat16
  Precision: bfloat16 | Type: 🔶 fine-tuned on domain-specific datasets | Weight type: Original | Architecture: LlamaForCausalLM
  Model: xinchen9/Llama3.1_8B_Instruct_CoT (https://huggingface.co/xinchen9/Llama3.1_8B_Instruct_CoT; eval details: https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_8B_Instruct_CoT-details)
  Model sha: cab1b33ddff08de11c5daea8ae079d126d503d8b | Hub License: apache-2.0 | Hub ❤️: 0 | #Params (B): 8.03 | CO₂ cost (kg): 2.8042
  Available on the hub: true | MoE: false | Flagged: false | Chat Template: false | Merged: false | Official Providers: false
  Average ⬆️: 16.316625 | IFEval: 29.735657 (raw 0.297357) | BBH: 21.142866 (raw 0.439821) | MATH Lvl 5: 6.042296 (raw 0.060423) | GPQA: 6.935123 (raw 0.302013) | MUSR: 13.166146 (raw 0.437062) | MMLU-PRO: 20.87766 (raw 0.287899)
  Upload To Hub Date: 2024-09-16 | Submission Date: 2024-09-19 | Generation: 0 | Base Model: xinchen9/Llama3.1_8B_Instruct_CoT
xinchen9_Llama3.1_CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xinchen9/Llama3.1_CoT
3cb467f51a59ff163bb942fcde3ef60573c12b79
13.741515
apache-2.0
0
8.03
true
false
false
true
1.900198
0.224616
22.461624
0.434101
19.899124
0.03852
3.851964
0.288591
5.145414
0.430458
11.773958
0.273853
19.317007
false
false
2024-09-04
2024-09-06
0
xinchen9/Llama3.1_CoT
xinchen9_Llama3.1_CoT_V1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xinchen9/Llama3.1_CoT_V1
c5ed4b8bfc364ebae1843af14799818551f5251f
14.734826
apache-2.0
0
8.03
true
false
false
false
2.821183
0.245299
24.529914
0.4376
20.166003
0.033233
3.323263
0.279362
3.914989
0.457219
16.41901
0.280502
20.055777
false
false
2024-09-06
2024-09-07
0
xinchen9/Llama3.1_CoT_V1
xinchen9_Mistral-7B-CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/xinchen9/Mistral-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Mistral-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Mistral-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xinchen9/Mistral-7B-CoT
9a3c8103dac20d5497d1b8fc041bb5125ff4dc00
11.265676
apache-2.0
0
7.242
true
false
false
false
2.515922
0.278347
27.834701
0.387268
14.806193
0.024924
2.492447
0.249161
0
0.399427
8.195052
0.228391
14.265662
false
false
2024-09-09
2024-09-23
0
xinchen9/Mistral-7B-CoT
xinchen9_llama3-b8-ft-dis_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xinchen9/llama3-b8-ft-dis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/llama3-b8-ft-dis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xinchen9/llama3-b8-ft-dis
e4da730f28f79543262de37908943c35f8df81fe
13.973492
apache-2.0
0
8.03
true
false
false
false
2.124654
0.154599
15.459869
0.462579
24.727457
0.039275
3.927492
0.312919
8.389262
0.365375
6.405208
0.324385
24.931664
false
false
2024-06-28
2024-07-11
0
xinchen9/llama3-b8-ft-dis
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table
c083d6796f54f66b4cec2261657a02801c761093
22.823849
0
8.03
false
false
false
true
1.248461
0.637475
63.747523
0.491227
27.422821
0.092145
9.214502
0.259228
1.230425
0.382
5.483333
0.3686
29.844489
false
false
2024-09-30
2024-10-01
0
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table
5416d34b5243559914a377ee9d95ce4830bf8dba
24.502405
0
8.03
false
false
false
true
1.500527
0.727451
72.745094
0.505686
29.398353
0.084592
8.459215
0.260067
1.342282
0.381906
5.104948
0.369681
29.964539
false
false
2024-09-30
2024-10-01
0
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table
235204157d7fac0d64fa609d5aee3cebb49ccd11
22.639173
0
8.03
false
false
false
true
1.343483
0.656859
65.685936
0.495183
27.6952
0.089124
8.912387
0.259228
1.230425
0.359396
2.291146
0.37018
30.019947
false
false
2024-09-30
2024-09-30
0
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table
9db00cbbba84453b18956fcc76f264f94a205955
23.073735
0
8.03
false
false
false
true
1.438457
0.66208
66.207995
0.500449
28.508587
0.086103
8.610272
0.259228
1.230425
0.380542
5.001042
0.359957
28.884087
false
false
2024-09-30
2024-09-30
0
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001
1062757826de031a4ae82277e6e737e19e82e514
22.424535
0
8.03
false
false
false
true
1.230005
0.604228
60.422789
0.493606
27.613714
0.099698
9.969789
0.259228
1.230425
0.379333
5.216667
0.370844
30.093824
false
false
2024-09-30
2024-10-01
0
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002
e5d2f179b4a7bd851dcf2b7db6358b13001bf1af
24.203176
0
8.03
false
false
false
true
1.682937
0.713188
71.318768
0.499638
28.574879
0.085347
8.534743
0.258389
1.118568
0.387208
6.067708
0.366439
29.604388
false
false
2024-09-30
2024-10-01
0
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001
0e319ad47ed2b2636b72d07ee9b32657e1e50412
21.791089
0
8.03
false
false
false
true
1.359683
0.594711
59.471092
0.489922
26.943904
0.107251
10.725076
0.259228
1.230425
0.358094
2.328385
0.370429
30.047651
false
false
2024-09-30
2024-09-30
0
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002
0877f2458ea667edcf9213383df41294c788190f
23.121576
0
8.03
false
false
false
true
1.538239
0.645319
64.531887
0.495108
28.046978
0.093656
9.365559
0.260067
1.342282
0.393875
7.334375
0.352975
28.108378
false
false
2024-09-30
2024-10-01
0
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table
d2b87100e5ba3215fddbd308bb17b7bf12fe6c9e
21.357659
0
8.03
false
false
false
true
1.972861
0.575602
57.560163
0.490121
26.866404
0.099698
9.969789
0.259228
1.230425
0.365969
2.979427
0.365858
29.539746
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table
19a48ccf5ea463afbbbc61d650b8fb63ff2d94c7
24.132871
0
8.03
false
false
false
true
1.180306
0.703446
70.344575
0.509187
29.731239
0.096677
9.667674
0.259228
1.230425
0.373906
3.904948
0.369265
29.918366
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table
0fe230b3432fb2b0f89942d7926291a4dbeb2820
22.083581
0
8.03
false
false
false
true
1.331043
0.602379
60.237946
0.496953
27.892403
0.10423
10.422961
0.259228
1.230425
0.367365
3.18724
0.365775
29.530511
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table
d1e19da1029f2d4d45de015754bc52dcb1ea5570
23.235948
0
8.03
false
false
false
true
1.176838
0.66203
66.203008
0.499994
28.439824
0.093656
9.365559
0.259228
1.230425
0.381812
5.126562
0.361453
29.05031
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001
a478aa202c59773eba615ae37feb4cc750757695
20.905341
0
8.03
false
false
false
true
1.172886
0.533636
53.363631
0.491487
27.145374
0.098187
9.818731
0.259228
1.230425
0.377969
4.71276
0.36245
29.161126
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002
8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9
23.550849
0
8.03
false
false
false
true
1.738948
0.685161
68.516093
0.507516
29.74055
0.071752
7.175227
0.258389
1.118568
0.383177
5.630469
0.362118
29.124187
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001
86673872245ad902f8d466bdc20edae9c115b965
20.774868
0
8.03
false
false
false
true
1.350188
0.548224
54.822427
0.488717
26.839803
0.089124
8.912387
0.260906
1.454139
0.363271
2.942187
0.367104
29.678265
false
false
2024-09-28
2024-09-29
0
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001
xukp20_llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table
abb3afe2b0398b24ed823b0124c8a72d354487bd
23.684961
0
8.03
false
false
false
true
2.067883
0.690007
69.000696
0.497846
28.119887
0.104985
10.498489
0.259228
1.230425
0.367333
3.083333
0.371592
30.176936
false
false
2024-09-22
2024-09-23
0
xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table
xwen-team_Xwen-7B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/xwen-team/Xwen-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xwen-team/Xwen-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xwen-team__Xwen-7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xwen-team/Xwen-7B-Chat
d7318b170105d022ab3c5a5d56e385a838f1fae9
31.576752
apache-2.0
32
7.616
true
false
false
true
2.055635
0.68641
68.640984
0.506763
30.921633
0.450906
45.090634
0.260906
1.454139
0.391427
6.795052
0.429023
36.558067
false
false
2025-01-31
2025-02-13
1
Qwen/Qwen2.5-7B
xxx777xxxASD_L3.1-ClaudeMaid-4x8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/xxx777xxxASD/L3.1-ClaudeMaid-4x8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xxx777xxxASD/L3.1-ClaudeMaid-4x8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xxx777xxxASD__L3.1-ClaudeMaid-4x8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
xxx777xxxASD/L3.1-ClaudeMaid-4x8B
2a98d9cb91c7aa775acbf5bfe7bb91beb2faf682
26.404881
llama3.1
7
24.942
true
true
false
true
4.752369
0.669649
66.964875
0.507085
29.437348
0.141239
14.123867
0.291107
5.480984
0.428937
13.750521
0.358045
28.67169
false
false
2024-07-27
2024-07-28
0
xxx777xxxASD/L3.1-ClaudeMaid-4x8B
yam-peleg_Hebrew-Gemma-11B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Gemma-11B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Gemma-11B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Gemma-11B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yam-peleg/Hebrew-Gemma-11B-Instruct
a40259d1efbcac4829ed44d3b589716f615ed362
14.058232
other
23
10.475
true
false
false
true
3.874534
0.302077
30.207738
0.403578
16.862741
0.06571
6.570997
0.276007
3.467562
0.408854
9.973438
0.255402
17.266918
false
false
2024-03-06
2024-07-31
0
yam-peleg/Hebrew-Gemma-11B-Instruct
yam-peleg_Hebrew-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yam-peleg/Hebrew-Mistral-7B
3d32134b5959492fd7efbbf16395352594bc89f7
13.302117
apache-2.0
69
7.504
true
false
false
false
2.094213
0.232834
23.283443
0.433404
20.17694
0.049849
4.984894
0.279362
3.914989
0.397656
7.673698
0.278009
19.778738
false
false
2024-04-26
2024-07-11
0
yam-peleg/Hebrew-Mistral-7B
yam-peleg_Hebrew-Mistral-7B-200K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yam-peleg/Hebrew-Mistral-7B-200K
7b51c7b31e3d9e29ea964c579a45233cfad255fe
10.644291
apache-2.0
15
7.504
true
false
false
false
0.735312
0.185573
18.557317
0.414927
17.493603
0.023414
2.34139
0.276007
3.467562
0.376479
4.526563
0.257314
17.479314
false
false
2024-05-05
2024-07-11
0
yam-peleg/Hebrew-Mistral-7B-200K
yam-peleg_Hebrew-Mistral-7B-200K_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yam-peleg/Hebrew-Mistral-7B-200K
7b51c7b31e3d9e29ea964c579a45233cfad255fe
8.386669
apache-2.0
15
7.504
true
false
false
true
2.669663
0.17698
17.698041
0.34105
7.671324
0.030967
3.096677
0.253356
0.447427
0.374
4.416667
0.252909
16.989879
false
false
2024-05-05
2024-08-06
0
yam-peleg/Hebrew-Mistral-7B-200K
yanng1242_Marcoro14-7B-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/yanng1242/Marcoro14-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yanng1242/Marcoro14-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yanng1242__Marcoro14-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yanng1242/Marcoro14-7B-slerp
187c9df776cb1191a30a2f09737160316f56e875
21.933478
apache-2.0
0
7.242
true
false
false
false
0.846103
0.405992
40.599166
0.525166
32.975282
0.074773
7.477341
0.314597
8.612975
0.468625
17.844792
0.316822
24.091312
true
false
2025-01-19
2025-01-19
0
yanng1242/Marcoro14-7B-slerp
yasserrmd_Coder-GRPO-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/yasserrmd/Coder-GRPO-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yasserrmd/Coder-GRPO-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yasserrmd__Coder-GRPO-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yasserrmd/Coder-GRPO-3B
1a6217ef44d7eeefdaa10290457502d68233a989
25.914051
apache-2.0
1
3.086
true
false
false
true
0.751004
0.620764
62.076402
0.446912
22.912311
0.320242
32.024169
0.277685
3.691275
0.411458
10.365625
0.319731
24.414524
false
false
2025-02-08
2025-03-05
1
yasserrmd/Coder-GRPO-3B (Merge)
yasserrmd_Text2SQL-1.5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/yasserrmd/Text2SQL-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yasserrmd/Text2SQL-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yasserrmd__Text2SQL-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yasserrmd/Text2SQL-1.5B
aeef22ad5852dcf530b5e012a935a948d46d0e96
13.233972
apache-2.0
5
1.544
true
false
false
false
0.565518
0.285741
28.574072
0.385772
13.67572
0.067976
6.797583
0.287752
5.033557
0.39424
10.179948
0.236287
15.142952
false
false
2025-03-06
2025-03-06
3
Qwen/Qwen2.5-Coder-1.5B-Instruct (Merge)
ycros_BagelMIsteryTour-v2-8x7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ycros/BagelMIsteryTour-v2-8x7B
98a8b319707be3dab1659594da69a37ed8f8c148
24.258614
cc-by-nc-4.0
16
46.703
true
false
false
true
3.649132
0.599432
59.943173
0.515924
31.699287
0.07855
7.854985
0.30453
7.270694
0.420292
11.303125
0.347324
27.480423
true
false
2024-01-19
2024-06-28
1
ycros/BagelMIsteryTour-v2-8x7B (Merge)
ycros_BagelMIsteryTour-v2-8x7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ycros/BagelMIsteryTour-v2-8x7B
98a8b319707be3dab1659594da69a37ed8f8c148
24.825507
cc-by-nc-4.0
16
46.703
true
false
false
true
7.238673
0.62621
62.620957
0.514194
31.366123
0.093656
9.365559
0.307886
7.718121
0.41375
10.31875
0.348072
27.563534
true
false
2024-01-19
2024-08-04
1
ycros/BagelMIsteryTour-v2-8x7B (Merge)
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table
97b2d0e790a6fcdf39c34a2043f0818368c7dcb3
23.616565
0
8.03
false
false
false
true
1.236506
0.670898
67.089766
0.498661
28.170107
0.111782
11.178248
0.259228
1.230425
0.372698
3.853906
0.371592
30.176936
false
false
2024-09-29
2024-09-30
0
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table
e8786291c206d5cd1b01d29466e3b397278f4e2b
24.978481
0
8.03
false
false
false
true
1.281326
0.733271
73.327105
0.508036
29.308128
0.103474
10.347432
0.260067
1.342282
0.380604
5.008854
0.374834
30.537086
false
false
2024-09-29
2024-09-30
0
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table
0d9cb29aa87b0c17ed011ffbc83803f3f6dd18e7
23.45764
0
8.03
false
false
false
true
1.359109
0.678466
67.846647
0.494121
27.469588
0.112538
11.253776
0.259228
1.230425
0.364667
2.75
0.371759
30.195405
false
false
2024-09-29
2024-09-29
0
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table
7a326a956e6169b287a04ef93cdc0342a0f3311a
24.089793
0
8.03
false
false
false
true
1.296368
0.713188
71.318768
0.502536
28.604424
0.098943
9.89426
0.259228
1.230425
0.371333
3.683333
0.368268
29.80755
false
false
2024-09-29
2024-09-29
0
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001
e5c8baadbf6ce17b344596ad42bd3546f66e253e
23.246035
0
8.03
false
false
false
true
1.16447
0.649565
64.956538
0.497946
28.099199
0.101208
10.120846
0.259228
1.230425
0.377969
4.846094
0.372008
30.223109
false
false
2024-09-29
2024-09-30
0
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002
064e237b850151938caf171a4c8c7e34c93e580e
24.470597
0
8.03
false
false
false
true
1.212043
0.719607
71.960731
0.504515
28.785911
0.087613
8.761329
0.260067
1.342282
0.383146
5.593229
0.373421
30.380098
false
false
2024-09-29
2024-09-30
0
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001
b685b90063258e05f8b4930fdbce2e565f13f620
22.724716
0
8.03
false
false
false
true
1.298184
0.65044
65.043972
0.495788
27.825253
0.093656
9.365559
0.259228
1.230425
0.366031
2.853906
0.370263
30.029181
false
false
2024-09-29
2024-09-29
0
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002
5ab3f2cfc96bdda3b5a629ab4a81adf7394ba90a
23.749108
0
8.03
false
false
false
true
1.21538
0.701597
70.159732
0.499155
28.120615
0.086858
8.685801
0.259228
1.230425
0.377906
4.638281
0.366938
29.659796
false
false
2024-09-29
2024-09-29
0
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002
yifAI_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yifAI__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002
7a046b74179225d6055dd8aa601b5234f817b1e5
22.738075
0
8.03
false
false
false
true
1.344031
0.648966
64.896586
0.491452
27.281064
0.075529
7.55287
0.261745
1.565996
0.389875
7.134375
0.351978
27.997562
false
false
2024-09-30
0
Removed
ylalain_ECE-PRYMMAL-YL-1B-SLERP-V8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ylalain__ECE-PRYMMAL-YL-1B-SLERP-V8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8
2c00dbc74e55d42fbc8b08f474fb9568f820edb9
9.679668
apache-2.0
0
1.357
true
false
false
false
1.096855
0.150527
15.052727
0.397557
15.175392
0.004532
0.453172
0.28943
5.257271
0.387458
6.765625
0.238364
15.373818
false
false
2024-11-13
2024-11-13
0
ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8
ymcki_Llama-3.1-8B-GRPO-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/Llama-3.1-8B-GRPO-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/Llama-3.1-8B-GRPO-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__Llama-3.1-8B-GRPO-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/Llama-3.1-8B-GRPO-Instruct
ae73ec53fb75499a33a506b354b55b29d02392b9
28.168376
llama3.1
0
8.03
true
false
false
true
0.656791
0.744537
74.453672
0.513159
30.353177
0.202417
20.241692
0.294463
5.928412
0.381656
7.607031
0.373836
30.426271
false
false
2025-02-20
2025-02-20
2
meta-llama/Meta-Llama-3.1-8B
ymcki_Llama-3.1-8B-SFT-GRPO-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/Llama-3.1-8B-SFT-GRPO-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/Llama-3.1-8B-SFT-GRPO-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__Llama-3.1-8B-SFT-GRPO-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/Llama-3.1-8B-SFT-GRPO-Instruct
9d8bfc910b2be95b38a3738a938c7abf575892ac
7.659155
llama3.1
0
8.03
true
false
false
true
0.752124
0.3354
33.540007
0.312626
4.467783
0.04003
4.003021
0.253356
0.447427
0.352604
2.408854
0.109791
1.08784
false
false
2025-03-12
2025-03-12
2
meta-llama/Meta-Llama-3.1-8B
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18
aed2a9061ffa21beaec0d617a9605e160136aab4
15.288363
gemma
0
2.614
true
false
false
true
7.227142
0.463095
46.309459
0.40529
16.301992
0.043051
4.305136
0.288591
5.145414
0.375427
4.728385
0.234458
14.93979
false
false
2024-10-30
2024-11-16
3
google/gemma-2-2b
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18-merge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge
b72be0a7879f0d82cb2024cfc1d02c370ce3efe8
16.505538
gemma
0
2.614
true
false
false
true
2.786552
0.521821
52.182099
0.414689
17.348337
0.054381
5.438066
0.283557
4.474273
0.351396
3.357813
0.246094
16.232639
false
false
2024-10-30
2024-11-16
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-17_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-17
e6f82b93dae0b8207aa3252ab4157182e2610787
15.644976
gemma
1
2.614
true
false
false
true
1.988912
0.508157
50.815724
0.407627
16.234749
0.03852
3.851964
0.271812
2.908277
0.370062
3.891146
0.245512
16.167996
false
false
2024-10-16
2024-10-18
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-17-18-24_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-18-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24
38f56fcb99bd64278a1d90dd23aea527036329a0
14.447761
gemma
0
2.614
true
false
false
true
1.409717
0.505484
50.548434
0.381236
13.114728
0.02568
2.567976
0.28104
4.138702
0.350156
2.069531
0.228225
14.247193
false
false
2024-11-06
2024-11-06
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO
531b2e2043285cb40cd0433f5ad43441f8ac6b6c
14.844142
gemma
1
2.614
true
false
false
true
11.290834
0.474785
47.478468
0.389798
14.389413
0.061934
6.193353
0.274329
3.243848
0.37676
4.528385
0.219082
13.231383
false
false
2024-10-18
2024-10-27
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca
5503b5e892be463fa4b1d265b8ba9ba4304af012
12.530432
0
2.614
false
false
false
true
2.369333
0.306473
30.647349
0.40716
16.922412
0.032477
3.247734
0.269295
2.572707
0.396917
7.914583
0.2249
13.877807
false
false
2024-10-27
0
Removed
ymcki_gemma-2-2b-jpn-it-abliterated-18_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-18
c50b85f9b60b444f85fe230b8d77fcbc7b18ef91
16.245944
gemma
1
2.614
true
false
false
true
1.906289
0.517525
51.752461
0.413219
17.143415
0.044562
4.456193
0.27349
3.131991
0.374156
4.269531
0.250499
16.722074
false
false
2024-10-15
2024-10-18
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-18-ORPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO
b9f41f53827b8a5a600546b41f63023bf84617a3
15.132294
gemma
0
2.614
true
false
false
true
3.220754
0.474235
47.423503
0.403894
16.538079
0.046828
4.682779
0.261745
1.565996
0.395333
7.416667
0.218501
13.166741
false
false
2024-10-22
2024-10-22
3
google/gemma-2-2b
ymcki_gemma-2-2b-jpn-it-abliterated-24_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/gemma-2-2b-jpn-it-abliterated-24
06c129ba5261ee88e32035c88f90ca11d835175d
16.334187
gemma
0
2.614
true
false
false
true
1.620885
0.497866
49.786566
0.41096
16.77259
0.043807
4.380665
0.277685
3.691275
0.39149
7.002865
0.24734
16.371158
false
false
2024-10-24
2024-10-25
3
google/gemma-2-2b
yuchenxie_ArlowGPT-3B-Multilingual_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yuchenxie/ArlowGPT-3B-Multilingual" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuchenxie/ArlowGPT-3B-Multilingual</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuchenxie__ArlowGPT-3B-Multilingual-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yuchenxie/ArlowGPT-3B-Multilingual
336f9084b4718be34ec7348e8082670539aebb4c
20.501175
mit
1
3.213
true
false
false
true
1.223644
0.639549
63.954862
0.43014
19.50317
0.112538
11.253776
0.280201
4.026846
0.372667
4.083333
0.281666
20.185062
false
false
2024-11-03
2025-01-12
1
yuchenxie/ArlowGPT-3B-Multilingual (Merge)
yuchenxie_ArlowGPT-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yuchenxie/ArlowGPT-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuchenxie/ArlowGPT-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuchenxie__ArlowGPT-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yuchenxie/ArlowGPT-8B
f7d0149059f1324a7725676b6ab67df59cd4c599
28.973572
mit
3
8.03
true
false
false
true
1.43356
0.784654
78.465361
0.508016
29.842909
0.203927
20.392749
0.293624
5.816555
0.388229
8.361979
0.378657
30.961879
false
false
2024-10-05
2025-01-12
1
yuchenxie/ArlowGPT-8B (Merge)
yuvraj17_Llama3-8B-SuperNova-Spectrum-Hermes-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-Hermes-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO
0da9f780f7dd94ed1e10c8d3e082472ff2922177
18.088168
apache-2.0
0
8.03
true
false
false
true
1.944059
0.46909
46.908979
0.439987
21.238563
0.056647
5.664653
0.302013
6.935123
0.401219
9.61901
0.263464
18.162677
false
false
2024-09-24
2024-09-30
0
yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO
yuvraj17_Llama3-8B-SuperNova-Spectrum-dare_ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-dare_ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties
998d15b32900bc230727c8a7984e005f611723e9
19.172565
apache-2.0
0
8.03
true
false
false
false
1.828288
0.401271
40.127085
0.461579
23.492188
0.084592
8.459215
0.275168
3.355705
0.421094
11.003385
0.35738
28.597813
true
false
2024-09-22
2024-09-23
1
yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge)
yuvraj17_Llama3-8B-abliterated-Spectrum-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-abliterated-Spectrum-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-abliterated-Spectrum-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-abliterated-Spectrum-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
yuvraj17/Llama3-8B-abliterated-Spectrum-slerp
28789950975ecf5aac846c3f2c0a5d6841651ee6
17.725316
apache-2.0
0
8.03
true
false
false
false
1.65332
0.288488
28.848788
0.497791
28.54693
0.060423
6.042296
0.301174
6.823266
0.399823
11.011198
0.325715
25.079418
true
false
2024-09-22
2024-09-23
1
yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge)
zake7749_gemma-2-2b-it-chinese-kyara-dpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zake7749/gemma-2-2b-it-chinese-kyara-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zake7749/gemma-2-2b-it-chinese-kyara-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zake7749__gemma-2-2b-it-chinese-kyara-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zake7749/gemma-2-2b-it-chinese-kyara-dpo
bbc011dae0416c1664a0287f3a7a0f9563deac91
19.624112
gemma
11
2.614
true
false
false
false
2.558618
0.538208
53.820751
0.425746
19.061804
0.083837
8.383686
0.266779
2.237136
0.457563
16.761979
0.257314
17.479314
false
false
2024-08-18
2024-10-17
1
zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge)
zake7749_gemma-2-9b-it-chinese-kyara_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zake7749/gemma-2-9b-it-chinese-kyara" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zake7749/gemma-2-9b-it-chinese-kyara</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zake7749__gemma-2-9b-it-chinese-kyara-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zake7749/gemma-2-9b-it-chinese-kyara
6f440abe1e2fde914e6607e2b6c5b04cc69c51f4
21.383182
gemma
0
9.242
true
false
false
true
2.06779
0.17643
17.642965
0.595369
41.100636
0.104985
10.498489
0.338087
11.744966
0.424198
11.991406
0.417886
35.320626
false
false
2025-02-17
2025-02-24
1
zake7749/gemma-2-9b-it-chinese-kyara (Merge)
zelk12_Gemma-2-TM-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Gemma-2-TM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Gemma-2-TM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Gemma-2-TM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Gemma-2-TM-9B
42366d605e6bdad354a5632547e37d34d300ff7a
33.525544
0
10.159
false
false
false
true
3.935785
0.804462
80.446216
0.598659
42.049491
0.202417
20.241692
0.346477
12.863535
0.41524
11.238281
0.408826
34.314051
false
false
2024-11-06
2024-11-06
1
zelk12/Gemma-2-TM-9B (Merge)
zelk12_MT-Gen1-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MT-Gen1-gemma-2-9B
b78f8883614cbbdf182ebb4acf8a8c124bc782ae
34.514166
0
10.159
false
false
false
true
6.725493
0.788625
78.862529
0.61
44.011247
0.222054
22.205438
0.346477
12.863535
0.421688
11.577604
0.438082
37.564642
false
false
2024-10-23
2024-10-23
1
zelk12/MT-Gen1-gemma-2-9B (Merge)
zelk12_MT-Gen2-GI-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-GI-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-GI-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-GI-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MT-Gen2-GI-gemma-2-9B
e970fbcbf974f4626dcc6db7d2b02d4f24c72744
34.763481
1
10.159
false
false
false
true
3.737012
0.791398
79.139794
0.609556
44.002591
0.220544
22.054381
0.350671
13.422819
0.428323
12.673698
0.435588
37.287603
false
false
2024-11-10
2024-11-28
1
zelk12/MT-Gen2-GI-gemma-2-9B (Merge)
zelk12_MT-Gen2-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MT-Gen2-gemma-2-9B
c723f8b9b7334fddd1eb8b6e5230b76fb18139a5
34.81519
1
10.159
false
false
false
true
3.978895
0.790749
79.074855
0.610049
44.107782
0.219033
21.903323
0.346477
12.863535
0.432292
13.303125
0.438747
37.63852
false
false
2024-11-10
2024-11-10
1
zelk12/MT-Gen2-gemma-2-9B (Merge)
zelk12_MT-Gen3-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MT-Gen3-gemma-2-9B
84627594655776ce67f1e01233113b658333fa71
34.862851
2
10.159
false
false
false
true
3.626496
0.802014
80.201421
0.609711
43.950648
0.229607
22.960725
0.348993
13.199105
0.421688
11.577604
0.435588
37.287603
false
false
2024-11-28
2024-11-30
1
zelk12/MT-Gen3-gemma-2-9B (Merge)