Column schema (name: type, observed length/value range or number of distinct values):

eval_name: string, length 12 to 111
Precision: string, 3 distinct values
Type: string, 7 distinct values
T: string, 7 distinct values
Weight type: string, 2 distinct values
Architecture: string, 64 distinct values
Model: string, length 355 to 689
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, 0.74 to 52.1
Hub License: string, 27 distinct values
Hub ❤️: int64, 0 to 6.09k
#Params (B): float64, -1 to 141
Available on the hub: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 2 classes
Chat Template: bool, 2 classes
CO₂ cost (kg): float64, 0.04 to 187
IFEval Raw: float64, 0 to 0.9
IFEval: float64, 0 to 90
BBH Raw: float64, 0.22 to 0.83
BBH: float64, 0.25 to 76.7
MATH Lvl 5 Raw: float64, 0 to 0.71
MATH Lvl 5: float64, 0 to 71.5
GPQA Raw: float64, 0.21 to 0.47
GPQA: float64, 0 to 29.4
MUSR Raw: float64, 0.29 to 0.6
MUSR: float64, 0 to 38.7
MMLU-PRO Raw: float64, 0.1 to 0.73
MMLU-PRO: float64, 0 to 70
Merged: bool, 2 classes
Official Providers: bool, 2 classes
Upload To Hub Date: string, 525 distinct values
Submission Date: string, 263 distinct values
Generation: int64, 0 to 10
Base Model: string, length 4 to 102
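The records that follow use the schema above; each benchmark appears twice, with the "Raw" column holding the 0-to-1 accuracy and the unsuffixed column holding the leaderboard's rescaled 0-to-100 score. As a minimal, hedged sketch of how a table with this schema can be loaded and queried with the Hugging Face `datasets` library and pandas (the repo id `open-llm-leaderboard/contents` is an assumption; substitute the dataset this dump was actually taken from):

```python
# Minimal sketch (not the leaderboard's own tooling): load a dataset with the
# column schema listed above and rank models by the aggregate score.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")  # assumed repo id
df = ds.to_pandas()

# Example query: unflagged models from official providers, best average first.
top = (
    df[(df["Official Providers"]) & (~df["Flagged"])]
    .sort_values("Average ⬆️", ascending=False)
    .loc[:, ["fullname", "#Params (B)", "Average ⬆️", "IFEval", "BBH", "MMLU-PRO"]]
    .head(10)
)
print(top.to_string(index=False))
```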
theprint_ReWiz-Llama-3.1-8B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
theprint/ReWiz-Llama-3.1-8B-v2 📑
theprint/ReWiz-Llama-3.1-8B-v2
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
15.893328
apache-2.0
1
9.3
true
false
false
false
4.70618
0.237905
23.790542
0.463243
23.773287
0.057402
5.740181
0.302852
7.04698
0.381375
9.338542
0.331034
25.670434
false
false
2024-11-02
2024-11-03
2
meta-llama/Meta-Llama-3.1-8B
theprint_ReWiz-Llama-3.2-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
theprint/ReWiz-Llama-3.2-3B 📑
theprint/ReWiz-Llama-3.2-3B
e6aed95ad8f104f105b8423cd5f87c75705a828c
18.186254
apache-2.0
3
3.213
true
false
false
false
1.963895
0.464893
46.489315
0.434326
19.293728
0.109517
10.951662
0.283557
4.474273
0.361375
6.938542
0.28873
20.970006
false
false
2024-10-18
2024-10-28
1
theprint/ReWiz-Llama-3.2-3B (Merge)
theprint_ReWiz-Nemo-12B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
theprint/ReWiz-Nemo-12B-Instruct 📑
theprint/ReWiz-Nemo-12B-Instruct
6f8ea24f8d19b48850d68bef1b5c50837d37761b
16.173142
apache-2.0
2
12.248
true
false
false
false
2.350105
0.106238
10.623811
0.509241
29.926389
0.10423
10.422961
0.323826
9.8434
0.409563
10.228646
0.333943
25.993647
false
false
2024-10-31
2024-11-02
1
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
theprint_ReWiz-Qwen-2.5-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
theprint/ReWiz-Qwen-2.5-14B 📑
theprint/ReWiz-Qwen-2.5-14B
e5524628f15c30d7542427c53a565e6e2d3ff760
30.031734
apache-2.0
5
16.743
true
false
false
false
11.856533
0.278546
27.854648
0.617949
44.861873
0.292296
29.229607
0.380034
17.337808
0.453896
15.436979
0.509225
45.469489
false
false
2024-11-05
2024-11-10
2
Qwen/Qwen2.5-14B
theprint_ReWiz-Worldbuilder-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
theprint/ReWiz-Worldbuilder-7B 📑
theprint/ReWiz-Worldbuilder-7B
e88c715097d824f115f59a97e612d662ffb1031f
15.7907
0
7.248
false
false
false
false
1.221734
0.25102
25.101952
0.463616
25.076347
0.037009
3.700906
0.269295
2.572707
0.45725
16.389583
0.297124
21.902704
false
false
2024-10-28
2024-10-28
1
theprint/ReWiz-Worldbuilder-7B (Merge)
theprint_RuDolph-Hermes-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
theprint/RuDolph-Hermes-7B 📑
theprint/RuDolph-Hermes-7B
e07aea56963bbfe5c6753d1056566a56acc30d4a
19.037013
0
7.242
false
false
false
false
1.004134
0.360429
36.042922
0.505293
30.709648
0.05136
5.135952
0.312081
8.277405
0.422615
11.026823
0.307264
23.029329
false
false
2024-11-10
2024-11-10
1
theprint/RuDolph-Hermes-7B (Merge)
theprint_WorldBuilder-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
theprint/WorldBuilder-12B 📑
theprint/WorldBuilder-12B
20cfd0e98fb2628b00867147b2c6f423d27f3561
14.516407
apache-2.0
0
13.933
true
false
false
false
5.66255
0.137438
13.743755
0.50101
29.277996
0.044562
4.456193
0.29698
6.263982
0.406646
8.997396
0.319232
24.359116
false
false
2024-10-27
2024-11-18
1
unsloth/mistral-nemo-base-2407-bnb-4bit
theprint_phi-3-mini-4k-python_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
theprint/phi-3-mini-4k-python 📑
theprint/phi-3-mini-4k-python
81453e5718775630581ab9950e6c0ccf0d7a4177
17.728138
apache-2.0
0
4.132
true
false
false
false
2.751102
0.240878
24.087754
0.493759
28.446016
0.104985
10.498489
0.291107
5.480984
0.392167
9.220833
0.357713
28.634752
false
false
2024-06-03
2024-09-13
1
unsloth/Phi-3-mini-4k-instruct-bnb-4bit
thinkcoder_llama3-8b-instruct-lora-8-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
thinkcoder/llama3-8b-instruct-lora-8-sft 📑
thinkcoder/llama3-8b-instruct-lora-8-sft
b76d81a09b15d92f92a8a22711983775ac999383
22.363644
0
8.03
false
false
false
true
0.71498
0.648042
64.804164
0.486501
27.203773
0.101964
10.196375
0.266779
2.237136
0.323458
2.232292
0.347573
27.508126
false
false
2025-03-10
2025-03-10
0
thinkcoder/llama3-8b-instruct-lora-8-sft
thirdeyeai_elevate360m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
thirdeyeai/elevate360m 📑
thirdeyeai/elevate360m
f4321ba8704e732769d328952d217bdb564e1824
1.918188
0
0.362
false
false
false
false
0.737565
0.044489
4.448862
0.296258
2.339847
0.015861
1.586103
0.240772
0
0.346219
2.277344
0.107713
0.856974
false
false
2025-01-28
2025-01-29
0
thirdeyeai/elevate360m
thomas-yanxin_XinYuan-Qwen2-1_5B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
thomas-yanxin/XinYuan-Qwen2-1_5B 📑
thomas-yanxin/XinYuan-Qwen2-1_5B
a01b362887832bea08d686737861ac3d5b437a32
11.515091
other
1
1.777
true
false
false
true
2.704728
0.298556
29.855561
0.363549
12.12558
0.067221
6.722054
0.270134
2.684564
0.363396
2.624479
0.235705
15.07831
false
false
2024-08-25
2024-09-04
1
Removed
thomas-yanxin_XinYuan-Qwen2-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
thomas-yanxin/XinYuan-Qwen2-7B 📑
thomas-yanxin/XinYuan-Qwen2-7B
c62d83eee2f4812ac17fc17d307f4aa1a77c5359
22.431712
other
1
7.616
true
false
false
true
4.905126
0.44376
44.376033
0.493663
28.401489
0.14577
14.577039
0.291107
5.480984
0.405812
9.259896
0.392453
32.494829
false
false
2024-08-21
2024-09-03
0
thomas-yanxin/XinYuan-Qwen2-7B
thomas-yanxin_XinYuan-Qwen2-7B-0917_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
thomas-yanxin/XinYuan-Qwen2-7B-0917 📑
thomas-yanxin/XinYuan-Qwen2-7B-0917
6cee1b155fca9ae1f558f434953dfdadb9596af0
24.546893
other
4
7.616
true
false
false
true
2.971129
0.37192
37.191984
0.516922
32.619938
0.197885
19.78852
0.309564
7.941834
0.440104
13.679688
0.424535
36.059397
false
false
2024-09-17
2024-09-17
0
thomas-yanxin/XinYuan-Qwen2-7B-0917
thomas-yanxin_XinYuan-Qwen2.5-7B-0917_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
thomas-yanxin/XinYuan-Qwen2.5-7B-0917 📑
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
bbbeafd1003c4d5e13f09b7223671957384b961a
21.397595
other
4
7.616
true
false
false
true
1.94245
0.357706
35.770644
0.518411
33.439669
0.193353
19.335347
0.28104
4.138702
0.367552
3.677344
0.388215
32.023862
false
false
2024-09-17
2024-09-24
0
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
tianyil1_MistralForCausalLM_Cal_DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
MistralForCausalLM
tianyil1/MistralForCausalLM_Cal_DPO 📑
tianyil1/MistralForCausalLM_Cal_DPO
642d91baa32a7806a11bc66e0d65870fbcd15e6c
18.088644
apache-2.0
1
7.242
true
false
false
true
1.358254
0.532762
53.276196
0.438142
21.783608
0.028701
2.870091
0.276007
3.467562
0.397656
7.540365
0.276346
19.594046
false
false
2025-01-25
2025-01-25
2
mistralai/Mistral-7B-v0.1
tiiuae_Falcon3-10B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
tiiuae/Falcon3-10B-Base 📑
tiiuae/Falcon3-10B-Base
0b20cceec08ec598ed2de7a6dfbeb208f1eae656
27.617851
other
36
10.306
true
false
false
false
1.620779
0.364775
36.477546
0.595004
41.375462
0.249245
24.924471
0.345638
12.751678
0.439792
14.173958
0.424036
36.003989
false
true
2024-12-03
2024-12-12
0
tiiuae/Falcon3-10B-Base
tiiuae_Falcon3-10B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
tiiuae/Falcon3-10B-Instruct 📑
tiiuae/Falcon3-10B-Instruct
9be8471432d7c4f35f72505fa2ca4101f0a2ed6d
35.475411
other
97
10.306
true
false
false
true
1.680822
0.781656
78.165601
0.617047
44.82154
0.276435
27.643505
0.328859
10.514541
0.432323
13.607031
0.442902
38.100251
false
true
2024-12-14
2024-12-16
1
tiiuae/Falcon3-10B-Base
tiiuae_Falcon3-1B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
tiiuae/Falcon3-1B-Base 📑
tiiuae/Falcon3-1B-Base
cc56a5a7c3923821312ad14f52c5a7c3fa835cbc
9.888096
other
23
1.669
true
false
false
false
0.802739
0.242801
24.280132
0.357115
11.343173
0.033233
3.323263
0.279362
3.914989
0.41474
9.709115
0.160821
6.757905
false
true
2024-12-13
2024-12-16
0
tiiuae/Falcon3-1B-Base
tiiuae_Falcon3-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
tiiuae/Falcon3-1B-Instruct 📑
tiiuae/Falcon3-1B-Instruct
27dd70ccb22fd3cc71c5adbc95eb670455afff3d
16.164597
other
34
1.669
true
false
false
true
0.794041
0.555668
55.566785
0.374454
12.961374
0.063444
6.344411
0.266779
2.237136
0.418896
10.561979
0.183843
9.315898
false
true
2024-12-14
2024-12-16
1
tiiuae/Falcon3-1B-Base
tiiuae_Falcon3-3B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
tiiuae/Falcon3-3B-Base 📑
tiiuae/Falcon3-3B-Base
3d49753006a0fa5384031a737c60fbcd0f60b7f2
15.738743
other
16
3.228
true
false
false
false
0.962433
0.276499
27.649858
0.442137
21.584784
0.117825
11.782477
0.29698
6.263982
0.37499
6.273698
0.287899
20.87766
false
true
2024-12-13
2024-12-13
0
tiiuae/Falcon3-3B-Base
tiiuae_Falcon3-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
tiiuae/Falcon3-3B-Instruct 📑
tiiuae/Falcon3-3B-Instruct
552213004cecf9bb6ce332f46da0d4324c8347f1
26.602345
other
25
3.228
true
false
false
true
0.960927
0.697676
69.76755
0.475443
26.287229
0.25
25
0.288591
5.145414
0.413594
11.132552
0.300532
22.281324
false
true
2024-12-14
2024-12-16
0
tiiuae/Falcon3-3B-Instruct
tiiuae_Falcon3-7B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
tiiuae/Falcon3-7B-Base 📑
tiiuae/Falcon3-7B-Base
a1cf49eb7a53210fc2ee82f3876bbc7efb2244fd
24.745725
other
26
7.456
true
false
false
false
1.218744
0.341595
34.159475
0.509888
31.559919
0.194109
19.410876
0.346477
12.863535
0.470208
18.142708
0.391041
32.33784
false
true
2024-11-21
2024-12-12
0
tiiuae/Falcon3-7B-Base
tiiuae_Falcon3-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
tiiuae/Falcon3-7B-Instruct 📑
tiiuae/Falcon3-7B-Instruct
7aae4f3953f3dbfaa81aeecbb404a6bbba0e0c06
36.404685
other
64
7.456
true
false
false
true
1.237521
0.761248
76.124793
0.563244
37.915812
0.40861
40.861027
0.310403
8.053691
0.482677
21.167969
0.408743
34.304817
false
true
2024-11-29
2024-12-16
1
tiiuae/Falcon3-7B-Base
tiiuae_Falcon3-Mamba-7B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconMambaForCausalLM
tiiuae/Falcon3-Mamba-7B-Base 📑
tiiuae/Falcon3-Mamba-7B-Base
f08d14145ce86c32dd04f18bacb3f12b247042e2
18.138792
other
21
7.273
true
false
false
false
1.672636
0.289113
28.911289
0.469928
25.534049
0.194109
19.410876
0.309564
7.941834
0.343146
4.393229
0.303773
22.641475
false
true
2024-12-11
2024-12-12
0
tiiuae/Falcon3-Mamba-7B-Base
tiiuae_Falcon3-Mamba-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconMambaForCausalLM
tiiuae/Falcon3-Mamba-7B-Instruct 📑
tiiuae/Falcon3-Mamba-7B-Instruct
382561849d1509b5f1a4d7a38bb286b3c4f46fbd
28.109655
other
27
7.273
true
false
false
true
1.656995
0.71651
71.650997
0.467896
25.203505
0.300604
30.060423
0.303691
7.158837
0.386865
8.258073
0.336935
26.326093
false
true
2024-12-13
2024-12-13
1
tiiuae/Falcon3-Mamba-7B-Instruct (Merge)
tiiuae_falcon-11B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
tiiuae/falcon-11B 📑
tiiuae/falcon-11B
066e3bf4e2d9aaeefa129af0a6d39727d27816b3
13.851903
unknown
212
11.103
true
false
false
false
2.165742
0.326132
32.613244
0.439164
21.937999
0.027946
2.794562
0.270973
2.796421
0.398646
7.530729
0.238946
15.43846
false
true
2024-05-09
2024-06-09
0
tiiuae/falcon-11B
tiiuae_falcon-40b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
tiiuae/falcon-40b 📑
tiiuae/falcon-40b
4a70170c215b36a3cce4b4253f6d0612bb7d4146
11.401304
apache-2.0
2,422
40
true
false
false
false
43.587168
0.249645
24.964539
0.401853
16.583305
0.018127
1.812689
0.27349
3.131991
0.363146
5.193229
0.250499
16.722074
false
true
2023-05-24
2024-06-09
0
tiiuae/falcon-40b
tiiuae_falcon-40b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
tiiuae/falcon-40b-instruct 📑
tiiuae/falcon-40b-instruct
ecb78d97ac356d098e79f0db222c9ce7c5d9ee5f
10.484507
apache-2.0
1,176
40
true
false
false
false
39.466491
0.245449
24.544874
0.405387
17.220114
0.019637
1.963746
0.25
0
0.376229
5.161979
0.226147
14.016327
false
true
2023-05-25
2024-06-09
0
tiiuae/falcon-40b-instruct
tiiuae_falcon-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
tiiuae/falcon-7b 📑
tiiuae/falcon-7b
898df1396f35e447d5fe44e0a3ccaaaa69f30d36
5.173445
apache-2.0
1,083
7
true
false
false
false
1.571682
0.182051
18.20514
0.328524
5.963937
0.009819
0.981873
0.244966
0
0.377844
4.497135
0.112533
1.392583
false
true
2023-04-24
2024-06-09
0
tiiuae/falcon-7b
tiiuae_falcon-7b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
tiiuae/falcon-7b-instruct 📑
tiiuae/falcon-7b-instruct
cf4b3c42ce2fdfe24f753f0f0d179202fea59c99
5.116574
apache-2.0
950
7
true
false
false
false
1.532429
0.196889
19.68887
0.320342
4.823178
0.012085
1.208459
0.247483
0
0.363365
3.253906
0.115525
1.72503
false
true
2023-04-25
2024-06-09
0
tiiuae/falcon-7b-instruct
tiiuae_falcon-mamba-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconMambaForCausalLM
tiiuae/falcon-mamba-7b 📑
tiiuae/falcon-mamba-7b
5337fd73f19847e111ba2291f3f0e1617b90c37d
15.179238
other
233
7
true
false
false
false
7.220816
0.333576
33.357602
0.428485
19.876878
0.044562
4.456193
0.310403
8.053691
0.421031
10.86224
0.230219
14.468824
false
true
2024-07-17
2024-07-23
0
tiiuae/falcon-mamba-7b
tinycompany_BiBo-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
tinycompany/BiBo-v0.3 📑
tinycompany/BiBo-v0.3
82bb8be05e99b1ae1dbed7303975f6e638791ae2
19.543531
0
2.943
false
false
false
true
0.397796
0.518399
51.839896
0.464161
24.342662
0.087613
8.761329
0.267617
2.348993
0.39499
7.807031
0.299451
22.161274
false
false
2025-03-12
2025-03-13
0
tinycompany/BiBo-v0.3
tinycompany_BiBo-v0.7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
tinycompany/BiBo-v0.7 📑
tinycompany/BiBo-v0.7
a0be861c117d1a7b9f712fe700a167ae3b265235
15.965357
mit
0
2.943
true
false
false
true
0.790256
0.373818
37.381814
0.431082
19.674729
0.082326
8.232628
0.276846
3.579418
0.404417
8.585417
0.265043
18.338135
false
false
2025-03-12
2025-03-12
0
tinycompany/BiBo-v0.7
tinycompany_ShawtyIsBad-bgem3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/ShawtyIsBad-bgem3 📑
tinycompany/ShawtyIsBad-bgem3
2b5b5543215711ec51012c0c4ad4bf87b434f3ec
12.610398
apache-2.0
0
1.436
true
false
false
false
0.611049
0.260811
26.081131
0.385297
13.857812
0.048338
4.833837
0.305369
7.38255
0.369469
5.916927
0.258311
17.59013
false
false
2025-03-07
2025-03-08
0
tinycompany/ShawtyIsBad-bgem3
tinycompany_ShawtyIsBad-e5-large_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/ShawtyIsBad-e5-large 📑
tinycompany/ShawtyIsBad-e5-large
7dd62553c6f019786669ecd84bee87adf8eb3fb5
12.31634
mit
0
1.436
true
false
false
false
0.618452
0.246823
24.682287
0.387348
14.177224
0.045317
4.531722
0.302013
6.935123
0.372042
6.138542
0.256898
17.433141
false
false
2025-03-08
2025-03-08
0
tinycompany/ShawtyIsBad-e5-large
tinycompany_ShawtyIsBad-ib_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/ShawtyIsBad-ib 📑
tinycompany/ShawtyIsBad-ib
f7a222c25ba1f1c86c6aedb662738bf3edd15d32
12.35456
apache-2.0
0
1.436
true
false
false
false
0.610491
0.256515
25.651494
0.388046
14.23669
0.049094
4.909366
0.298658
6.487696
0.364104
5.279687
0.258062
17.562426
false
false
2025-03-07
2025-03-08
0
tinycompany/ShawtyIsBad-ib
tinycompany_ShawtyIsBad-nomic-moe_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/ShawtyIsBad-nomic-moe 📑
tinycompany/ShawtyIsBad-nomic-moe
366fb7526bf48bd32eb455f72903b3f3f179abb3
12.741407
mit
0
1.436
true
true
false
false
0.605777
0.260761
26.076145
0.387802
14.318943
0.043051
4.305136
0.307047
7.606264
0.374708
6.671875
0.257231
17.47008
false
false
2025-03-09
2025-03-09
0
tinycompany/ShawtyIsBad-nomic-moe
tinycompany_ShawtyIsBad-nomic1.5_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/ShawtyIsBad-nomic1.5 📑
tinycompany/ShawtyIsBad-nomic1.5
961a85e0f31f8711c3150b98e9052b8ee565ddba
12.504999
mit
0
1.436
true
false
false
false
0.610669
0.254392
25.439168
0.38736
14.217972
0.043051
4.305136
0.311242
8.165548
0.362833
5.4875
0.256732
17.414672
false
false
2025-03-08
2025-03-08
0
tinycompany/ShawtyIsBad-nomic1.5
tinycompany_SigmaBoi-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-base 📑
tinycompany/SigmaBoi-base
e3ce078082f17116b345712904539cb04935284a
15.250808
mit
0
2.943
true
false
false
false
0.761949
0.2447
24.469962
0.431436
20.769957
0.077795
7.779456
0.293624
5.816555
0.434271
12.483854
0.281666
20.185062
false
false
2025-03-09
2025-03-09
0
tinycompany/SigmaBoi-base
tinycompany_SigmaBoi-bge-m3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-bge-m3 📑
tinycompany/SigmaBoi-bge-m3
f7612761837b259d29d3a80bf4ae209c140ee413
15.455458
mit
0
2.943
true
false
false
false
0.765997
0.245024
24.502431
0.435092
21.206315
0.076284
7.628399
0.294463
5.928412
0.438302
13.254427
0.281915
20.212766
false
false
2025-03-08
2025-03-08
0
tinycompany/SigmaBoi-bge-m3
tinycompany_SigmaBoi-bgem3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-bgem3 📑
tinycompany/SigmaBoi-bgem3
f7612761837b259d29d3a80bf4ae209c140ee413
15.455458
mit
0
2.943
true
false
false
false
0.775091
0.245024
24.502431
0.435092
21.206315
0.076284
7.628399
0.294463
5.928412
0.438302
13.254427
0.281915
20.212766
false
false
2025-03-08
2025-03-08
0
tinycompany/SigmaBoi-bgem3
tinycompany_SigmaBoi-ib_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-ib 📑
tinycompany/SigmaBoi-ib
90316c1235d0bfe04addc42a6b8e2995c443eed1
14.96821
mit
0
2.943
true
false
false
false
0.77167
0.247747
24.774709
0.434362
21.076579
0.074018
7.401813
0.287752
5.033557
0.428969
11.254427
0.282414
20.268174
false
false
2025-03-08
2025-03-08
0
tinycompany/SigmaBoi-ib
tinycompany_SigmaBoi-nomic-moe_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-nomic-moe 📑
tinycompany/SigmaBoi-nomic-moe
39ef773a558b70578273ab1bdf47427916da5b02
15.186432
mit
0
2.943
true
true
false
false
1.567581
0.247422
24.742239
0.433418
20.968642
0.071752
7.175227
0.292785
5.704698
0.431635
12.121094
0.28366
20.406693
false
false
2025-03-09
2025-03-09
0
tinycompany/SigmaBoi-nomic-moe
tinycompany_SigmaBoi-nomic1.5_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-nomic1.5 📑
tinycompany/SigmaBoi-nomic1.5
df78255741edd3dc3ec68131e28a7e95527a1279
15.473024
mit
0
2.943
true
false
false
false
0.784382
0.2447
24.469962
0.437053
21.437845
0.083082
8.308157
0.296141
6.152125
0.431604
12.017188
0.284076
20.452866
false
false
2025-03-08
2025-03-08
0
tinycompany/SigmaBoi-nomic1.5
tinycompany_SigmaBoi-nomic1.5-fp32_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
tinycompany/SigmaBoi-nomic1.5-fp32 📑
tinycompany/SigmaBoi-nomic1.5-fp32
18c9210fc34610f0bb37c59cdd69f3164856e46d
15.498419
mit
0
2.943
true
false
false
false
0.788203
0.246223
24.622335
0.437053
21.437845
0.083082
8.308157
0.296141
6.152125
0.431604
12.017188
0.284076
20.452866
false
false
2025-03-08
2025-03-08
0
tinycompany/SigmaBoi-nomic1.5-fp32
tinycompany_Tamed-Shawty_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
tinycompany/Tamed-Shawty 📑
tinycompany/Tamed-Shawty
fed82f9f13b28c191a302006dbf1b392873a9fe7
13.533997
mit
0
1.562
true
false
false
true
0.614502
0.383086
38.308577
0.383706
14.653984
0.071752
7.175227
0.262584
1.677852
0.350094
1.595052
0.26014
17.793292
false
false
2025-03-10
2025-03-10
0
tinycompany/Tamed-Shawty
tklohj_WindyFloLLM_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
tklohj/WindyFloLLM 📑
tklohj/WindyFloLLM
21f4241ab3f091d1d309e9076a8d8e3f014908a8
14.243655
0
13.016
false
false
false
false
2.197024
0.266856
26.685639
0.463662
24.398763
0.015861
1.586103
0.275168
3.355705
0.425313
11.864063
0.258145
17.571661
false
false
2024-06-30
2024-07-10
1
tklohj/WindyFloLLM (Merge)
togethercomputer_GPT-JT-6B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
togethercomputer/GPT-JT-6B-v1 📑
togethercomputer/GPT-JT-6B-v1
f34aa35f906895602c1f86f5685e598afdea8051
6.877707
apache-2.0
300
6
true
false
false
false
75.917621
0.206106
20.610646
0.330266
7.318524
0.010574
1.057402
0.260906
1.454139
0.373656
3.873698
0.162566
6.951832
false
true
2022-11-24
2024-06-12
0
togethercomputer/GPT-JT-6B-v1
togethercomputer_GPT-NeoXT-Chat-Base-20B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
togethercomputer/GPT-NeoXT-Chat-Base-20B 📑
togethercomputer/GPT-NeoXT-Chat-Base-20B
d386708e84d862a65f7d2b4989f64750cb657227
5.140295
apache-2.0
696
20
true
false
false
false
5.967176
0.182976
18.297562
0.332097
6.830795
0.023414
2.34139
0.25
0
0.346063
1.757812
0.114528
1.614214
false
true
2023-03-03
2024-06-12
0
togethercomputer/GPT-NeoXT-Chat-Base-20B
togethercomputer_LLaMA-2-7B-32K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
togethercomputer/LLaMA-2-7B-32K 📑
togethercomputer/LLaMA-2-7B-32K
46c24bb5aef59722fa7aa6d75e832afd1d64b980
6.837716
llama2
538
7
true
false
false
false
1.169146
0.186497
18.649738
0.339952
8.089984
0.01435
1.435045
0.25
0
0.375365
4.320573
0.176779
8.530954
false
true
2023-07-26
2024-06-12
0
togethercomputer/LLaMA-2-7B-32K
togethercomputer_Llama-2-7B-32K-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
togethercomputer/Llama-2-7B-32K-Instruct 📑
togethercomputer/Llama-2-7B-32K-Instruct
d27380af003252f5eb0d218e104938b4e673e3f3
8.258542
llama2
158
7
true
false
false
false
1.179819
0.213
21.300039
0.344347
8.56347
0.015861
1.586103
0.251678
0.223714
0.405594
9.199219
0.178108
8.678709
false
true
2023-08-08
2024-06-12
0
togethercomputer/Llama-2-7B-32K-Instruct
togethercomputer_RedPajama-INCITE-7B-Base_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-7B-Base 📑
togethercomputer/RedPajama-INCITE-7B-Base
78f7e482443971f4873ba3239f0ac810a367833b
5.561814
apache-2.0
93
7
true
false
false
false
2.441214
0.20823
20.822972
0.319489
5.087242
0.015861
1.586103
0.255034
0.671141
0.362
3.016667
0.119681
2.186761
false
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Base
togethercomputer_RedPajama-INCITE-7B-Chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-7B-Chat 📑
togethercomputer/RedPajama-INCITE-7B-Chat
47b94a739e2f3164b438501c8684acc5d5acc146
4.050901
apache-2.0
92
7
true
false
false
false
2.438672
0.155798
15.579773
0.317545
4.502174
0.006798
0.679758
0.252517
0.33557
0.34476
1.861719
0.112118
1.34641
false
true
2023-05-04
2024-06-13
0
togethercomputer/RedPajama-INCITE-7B-Chat
togethercomputer_RedPajama-INCITE-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-7B-Instruct 📑
togethercomputer/RedPajama-INCITE-7B-Instruct
7f36397b9985a3f981cdb618f8fec1c565ca5927
6.456725
apache-2.0
103
7
true
false
false
false
2.362237
0.205507
20.550694
0.337744
7.905416
0.021148
2.114804
0.250839
0.111857
0.36851
5.030469
0.127244
3.027113
false
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Instruct
togethercomputer_RedPajama-INCITE-Base-3B-v1_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-Base-3B-v1 📑
togethercomputer/RedPajama-INCITE-Base-3B-v1
094fbdd0c911feb485ce55de1952ab2e75277e1e
5.52109
apache-2.0
90
3
true
false
false
false
1.552204
0.229363
22.936254
0.30604
3.518608
0.01435
1.435045
0.243289
0
0.373875
4.001042
0.11112
1.235594
false
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-Base-3B-v1
togethercomputer_RedPajama-INCITE-Chat-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-Chat-3B-v1 📑
togethercomputer/RedPajama-INCITE-Chat-3B-v1
f0e0995eba801096ed04cb87931d96a8316871af
4.848824
apache-2.0
152
3
true
false
false
false
1.549818
0.165215
16.521496
0.321669
5.164728
0.009063
0.906344
0.244128
0
0.368448
5.089323
0.112699
1.411052
false
true
2023-05-05
2024-06-13
0
togethercomputer/RedPajama-INCITE-Chat-3B-v1
togethercomputer_RedPajama-INCITE-Instruct-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
togethercomputer/RedPajama-INCITE-Instruct-3B-v1 📑
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
0c66778ee09a036886741707733620b91057909a
5.777232
apache-2.0
93
3
true
false
false
false
1.521342
0.212426
21.242636
0.314602
4.510786
0.01284
1.283988
0.247483
0
0.388604
6.408854
0.110954
1.217125
false
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
tokyotech-llm_Llama-3-Swallow-8B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1 📑
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
1fae784584dd03680b72dd4de7eefbc5b7cabcd5
22.34515
llama3
19
8.03
true
false
false
true
1.71622
0.550772
55.077195
0.500939
29.267966
0.074773
7.477341
0.28943
5.257271
0.435698
13.795573
0.30876
23.195553
false
false
2024-06-26
2024-09-12
0
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
tomasmcm_sky-t1-coder-32b-flash_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
tomasmcm/sky-t1-coder-32b-flash 📑
tomasmcm/sky-t1-coder-32b-flash
d336471a461bb2093e85df3898aeb6db3ae0857f
44.868559
apache-2.0
0
32.764
true
false
false
true
50.047177
0.778009
77.800902
0.682244
55.465944
0.542296
54.229607
0.368289
15.771812
0.423271
12.808854
0.578208
53.134235
true
false
2025-02-22
2025-02-22
1
tomasmcm/sky-t1-coder-32b-flash (Merge)
trthminh1112_autotrain-llama32-1b-finetune_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
trthminh1112/autotrain-llama32-1b-finetune 📑
trthminh1112/autotrain-llama32-1b-finetune
ceffe98f8b1f7e38628ad8b82536c43373472197
4.586088
other
0
1.1
true
false
false
false
0.174416
0.176855
17.685519
0.299563
2.852987
0.015106
1.510574
0.256711
0.894855
0.351271
3.475521
0.109874
1.097074
false
false
2024-11-20
2025-03-03
1
TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
tugstugi_Qwen2.5-7B-Instruct-QwQ-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1 📑
tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1
65a426bd2c9519fd56c06bd3ecae8685d780bfe6
28.427901
apache-2.0
2
7.616
true
false
false
true
1.626438
0.60173
60.173008
0.510106
30.498894
0.38142
38.141994
0.268456
2.46085
0.379427
5.061719
0.408078
34.23094
false
false
2025-01-10
2025-01-19
1
tugstugi/Qwen2.5-7B-Instruct-QwQ-v0.1 (Merge)
universalml_NepaliGPT-2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
universalml/NepaliGPT-2.0 📑
universalml/NepaliGPT-2.0
cf7dfdac366e392d97a2092afd3d660719b027c4
12.586354
mit
2
8.03
true
false
false
false
0.420364
0.036495
3.649539
0.466048
24.21669
0.004532
0.453172
0.28104
4.138702
0.465677
17.509635
0.329953
25.550384
false
false
2024-08-19
2025-03-13
1
universalml/NepaliGPT-2.0 (Merge)
unsloth_Llama-3.2-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
unsloth/Llama-3.2-1B-Instruct 📑
unsloth/Llama-3.2-1B-Instruct
eb49081324edb2ff14f848ce16393c067c6f4976
14.532451
llama3.2
67
1.236
true
false
false
true
0.722639
0.580997
58.099731
0.34847
8.31685
0.082326
8.232628
0.267617
2.348993
0.319615
1.951823
0.174202
8.244681
false
false
2024-09-25
2025-01-23
1
meta-llama/Llama-3.2-1B-Instruct
unsloth_Llama-3.2-1B-Instruct-no-system-message_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
unsloth/Llama-3.2-1B-Instruct-no-system-message 📑
unsloth/Llama-3.2-1B-Instruct-no-system-message
99fb160e6da969b35bfc81cce4026e7c383f0bf8
14.363515
0
1.236
false
false
false
true
0.713874
0.564985
56.498535
0.354374
9.386372
0.075529
7.55287
0.272651
3.020134
0.334063
2.291146
0.166888
7.432033
false
false
2025-01-24
0
Removed
unsloth_Phi-3-mini-4k-instruct_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/Phi-3-mini-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/Phi-3-mini-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__Phi-3-mini-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/Phi-3-mini-4k-instruct
636c707430a5509c80b1aa51d05c127ed339a975
27.34202
mit
43
3.821
true
false
false
true
0.939066
0.544028
54.402762
0.550024
36.732473
0.163897
16.389728
0.322987
9.731544
0.428417
13.11875
0.403092
33.676862
false
false
2024-04-29
2024-11-25
0
unsloth/Phi-3-mini-4k-instruct
unsloth_phi-4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4
682399cd249206f583fc19473d5a28af0a9bcea7
40.728304
mit
80
14.66
true
false
false
true
1.886539
0.688208
68.82084
0.688587
55.253145
0.5
50
0.336409
11.521253
0.411427
10.128385
0.537816
48.646203
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
unsloth_phi-4-bnb-4bit_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4-bnb-4bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4-bnb-4bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-bnb-4bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4-bnb-4bit
85ca2925f3cc4f3c42de4168e9ba0695be5d5845
39.060495
mit
14
8.058
true
false
false
true
3.047743
0.672971
67.297105
0.676985
53.535199
0.460725
46.072508
0.338087
11.744966
0.400729
8.424479
0.525598
47.288712
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
unsloth_phi-4-unsloth-bnb-4bit_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4-unsloth-bnb-4bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4-unsloth-bnb-4bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-unsloth-bnb-4bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4-unsloth-bnb-4bit
227e8cbc0de0cd783703a3a2f217159a86041a5f
39.216451
mit
49
8.483
true
false
false
true
3.037536
0.679391
67.939068
0.679109
53.840081
0.456193
45.619335
0.336409
11.521253
0.403396
8.757813
0.52859
47.621158
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
upstage_SOLAR-10.7B-Instruct-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-Instruct-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-Instruct-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-Instruct-v1.0
c08c25ed66414a878fe0401a3596d536c083606c
20.572364
cc-by-nc-4.0
623
10.732
true
false
false
true
1.565552
0.473661
47.3661
0.516249
31.872402
0.056647
5.664653
0.308725
7.829978
0.389938
6.942188
0.31383
23.758865
false
true
2023-12-12
2024-06-12
1
upstage/SOLAR-10.7B-Instruct-v1.0 (Merge)
upstage_SOLAR-10.7B-v1.0_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-v1.0
a45090b8e56bdc2b8e32e46b3cd782fc0bea1fa5
16.854975
apache-2.0
302
10.732
true
false
false
false
2.166602
0.242126
24.212645
0.509387
29.789358
0.026435
2.643505
0.28104
4.138702
0.437156
13.677865
0.34001
26.667775
false
true
2023-12-12
2024-06-12
0
upstage/SOLAR-10.7B-v1.0
upstage_solar-pro-preview-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
SolarForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/solar-pro-preview-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/solar-pro-preview-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__solar-pro-preview-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/solar-pro-preview-instruct
b4db141b5fb08b23f8bc323bc34e2cff3e9675f8
39.938655
mit
445
22.14
true
false
false
true
3.483526
0.841581
84.158145
0.681684
54.822351
0.220544
22.054381
0.370805
16.107383
0.441656
15.007031
0.527344
47.482639
false
true
2024-09-09
2024-09-11
0
upstage/solar-pro-preview-instruct
utkmst_chimera-beta-test2-lora-merged_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/utkmst/chimera-beta-test2-lora-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">utkmst/chimera-beta-test2-lora-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/utkmst__chimera-beta-test2-lora-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
utkmst/chimera-beta-test2-lora-merged
3218d26312a8dc187491c4e9f8633cea056b7223
22.389806
llama3.1
1
8.03
true
false
false
true
1.439803
0.605427
60.542693
0.479572
25.613164
0.095166
9.516616
0.303691
7.158837
0.411792
9.373958
0.299202
22.13357
false
false
2025-03-08
2025-03-08
2
meta-llama/Meta-Llama-3.1-8B
uukuguy_speechless-code-mistral-7b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-code-mistral-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-code-mistral-7b-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-code-mistral-7b-v1.0
1862e0a712efc6002112e9c1235a197d58419b37
18.192592
apache-2.0
18
7
true
false
false
false
1.292797
0.366524
36.652416
0.457171
24.091412
0.052115
5.21148
0.284396
4.58613
0.450177
14.772135
0.314578
23.841977
false
false
2023-10-10
2024-06-26
0
uukuguy/speechless-code-mistral-7b-v1.0
uukuguy_speechless-codellama-34b-v2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-codellama-34b-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-codellama-34b-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-codellama-34b-v2.0
419bc42a254102d6a5486a1a854068e912c4047c
17.209358
llama2
17
34
true
false
false
false
1.991254
0.460422
46.042168
0.481313
25.993293
0.043051
4.305136
0.269295
2.572707
0.378708
7.205208
0.254239
17.137633
false
false
2023-10-04
2024-06-26
0
uukuguy/speechless-codellama-34b-v2.0
uukuguy_speechless-coder-ds-6.7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-coder-ds-6.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-coder-ds-6.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-coder-ds-6.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-coder-ds-6.7b
c813a5268c6dfe267a720ad3b51773f1ab0feb59
9.714852
apache-2.0
6
6.7
true
false
false
false
1.577207
0.25047
25.046986
0.403637
15.897457
0.021148
2.114804
0.264262
1.901566
0.381938
5.342188
0.171875
7.986111
false
false
2023-12-30
2024-06-26
0
uukuguy/speechless-coder-ds-6.7b
uukuguy_speechless-instruct-mistral-7b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-instruct-mistral-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-instruct-mistral-7b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-instruct-mistral-7b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-instruct-mistral-7b-v0.2
87a4d214f7d028d61c3dc013a7410b3c34a24072
18.106713
apache-2.0
0
7.242
true
false
false
false
1.235241
0.326132
32.613244
0.460667
24.558747
0.049094
4.909366
0.281879
4.250559
0.490177
21.172135
0.290226
21.136229
false
false
2024-05-22
2024-06-26
0
uukuguy/speechless-instruct-mistral-7b-v0.2
uukuguy_speechless-llama2-hermes-orca-platypus-wizardlm-13b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
954cc87b0ed5fa280126de546daf648861031512
18.701596
32
13.016
false
false
false
false
1.959049
0.456175
45.617517
0.484554
26.791727
0.020393
2.039275
0.270134
2.684564
0.4655
17.754167
0.255901
17.322326
false
false
2023-09-01
2024-06-26
0
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
uukuguy_speechless-mistral-dolphin-orca-platypus-samantha-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-mistral-dolphin-orca-platypus-samantha-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
b1de043468a15198b55a6509293a4ee585139043
18.340089
llama2
17
7.242
true
false
false
false
1.311437
0.370022
37.002154
0.498277
29.653129
0.029456
2.945619
0.283557
4.474273
0.436135
13.85026
0.299036
22.1151
false
false
2023-10-13
2024-06-26
0
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
uukuguy_speechless-zephyr-code-functionary-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-zephyr-code-functionary-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-zephyr-code-functionary-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-zephyr-code-functionary-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-zephyr-code-functionary-7b
d66fc775ece679966e352195c42444e9c70af7fa
16.460834
apache-2.0
2
7.242
true
false
false
false
1.267999
0.269579
26.957916
0.466428
25.983623
0.042296
4.229607
0.300336
6.711409
0.426771
11.613021
0.309425
23.26943
false
false
2024-01-23
2024-06-26
0
uukuguy/speechless-zephyr-code-functionary-7b
v000000_L3-8B-Stheno-v3.2-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3-8B-Stheno-v3.2-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3-8B-Stheno-v3.2-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3-8B-Stheno-v3.2-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3-8B-Stheno-v3.2-abliterated
ddb17f127a1c068b105b79aadd76632615743f68
24.620611
8
8.03
false
false
false
true
1.00071
0.671772
67.177201
0.514144
30.746305
0.069486
6.94864
0.309564
7.941834
0.361969
5.979427
0.360372
28.93026
false
false
2024-07-09
2025-01-07
1
v000000/L3-8B-Stheno-v3.2-abliterated (Merge)
v000000_L3.1-Niitorm-8B-DPO-t0.0001_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Niitorm-8B-DPO-t0.0001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Niitorm-8B-DPO-t0.0001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Niitorm-8B-DPO-t0.0001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Niitorm-8B-DPO-t0.0001
a34150b5f63de4bc83d79b1de127faff3750289f
28.11323
8
8.03
false
false
false
true
1.756218
0.768867
76.886661
0.513423
30.513173
0.162387
16.238671
0.294463
5.928412
0.387979
7.264063
0.386636
31.848404
false
false
2024-09-19
2024-09-19
1
v000000/L3.1-Niitorm-8B-DPO-t0.0001 (Merge)
v000000_L3.1-Storniitova-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Storniitova-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Storniitova-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Storniitova-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Storniitova-8B
05b126857f43d1b1383e50f8c97d214ceb199723
28.281707
7
8.03
false
false
false
true
1.62708
0.781656
78.165601
0.515145
30.810993
0.146526
14.652568
0.28943
5.257271
0.402896
9.961979
0.377576
30.841829
false
false
2024-09-12
2024-09-18
1
v000000/L3.1-Storniitova-8B (Merge)
v000000_Qwen2.5-14B-Gutenberg-1e-Delta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-1e-Delta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-1e-Delta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-1e-Delta
f624854b4380e01322e752ce4daadd49ac86580f
40.879014
apache-2.0
4
14.77
true
false
false
true
3.604773
0.804512
80.451203
0.63985
48.616672
0.526435
52.643505
0.328859
10.514541
0.407302
9.379427
0.493019
43.668735
false
false
2024-09-20
2024-09-28
1
v000000/Qwen2.5-14B-Gutenberg-1e-Delta (Merge)
v000000_Qwen2.5-14B-Gutenberg-Instruct-Slerpeno_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-Instruct-Slerpeno-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno
1069abb4c25855e67ffaefa08a0befbb376e7ca7
41.362279
apache-2.0
6
14.77
true
false
false
true
5.34452
0.819749
81.974938
0.63901
48.452124
0.532477
53.247734
0.331376
10.850112
0.411365
10.053906
0.492354
43.594858
true
false
2024-09-20
2024-12-07
1
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno (Merge)
v000000_Qwen2.5-Lumen-14B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-Lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-Lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-Lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-Lumen-14B
fbb1d184ed01dac52d307737893ebb6b0ace444c
41.137851
apache-2.0
19
14.77
true
false
false
true
3.673385
0.80636
80.636046
0.639081
48.507861
0.536254
53.625378
0.32802
10.402685
0.411396
10.291146
0.490276
43.363992
false
false
2024-09-20
2024-09-20
1
v000000/Qwen2.5-Lumen-14B (Merge)
vhab10_Llama-3.1-8B-Base-Instruct-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.1-8B-Base-Instruct-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.1-8B-Base-Instruct-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.1-8B-Base-Instruct-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.1-8B-Base-Instruct-SLERP
eccb4bde0dc91f586954109ecdce7c94f47e2625
19.274794
mit
1
8.03
true
false
false
false
1.613442
0.290712
29.071198
0.505744
29.926042
0.120091
12.009063
0.296141
6.152125
0.401063
9.366146
0.362118
29.124187
true
false
2024-09-16
2024-09-29
1
vhab10/Llama-3.1-8B-Base-Instruct-SLERP (Merge)
vhab10_Llama-3.2-Instruct-3B-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.2-Instruct-3B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.2-Instruct-3B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.2-Instruct-3B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.2-Instruct-3B-TIES
0e8661730f40a6a279bd273cfe9fe46bbd0507dd
17.334326
mit
0
1.848
true
false
false
false
2.245853
0.472737
47.273678
0.433236
19.183159
0.098187
9.818731
0.269295
2.572707
0.349656
3.873698
0.291556
21.283983
true
false
2024-10-06
2024-11-23
1
vhab10/Llama-3.2-Instruct-3B-TIES (Merge)
vhab10_llama-3-8b-merged-linear_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/llama-3-8b-merged-linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/llama-3-8b-merged-linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__llama-3-8b-merged-linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/llama-3-8b-merged-linear
c37e7671b5ccfadbf3065fa5b48af05cd4f13292
23.911368
mit
0
4.65
true
false
false
true
2.609887
0.591663
59.166345
0.493709
27.816051
0.081571
8.1571
0.299497
6.599553
0.419052
11.68151
0.370429
30.047651
false
false
2024-09-26
2024-09-26
1
vhab10/llama-3-8b-merged-linear (Merge)
vicgalle_CarbonBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B
3fe9bf5327606d013b182fed17a472f5f043759b
22.470186
apache-2.0
9
10.732
true
false
false
true
1.830757
0.54153
54.152981
0.529365
33.060604
0.061934
6.193353
0.302013
6.935123
0.402031
9.18724
0.327626
25.291814
true
false
2024-01-21
2024-06-26
1
vicgalle/CarbonBeagle-11B (Merge)
vicgalle_CarbonBeagle-11B-truthy_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B-truthy" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B-truthy</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-truthy-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B-truthy
476cd2a6d938bddb38dfbeb4cb21e3e34303413d
21.319963
apache-2.0
10
10.732
true
false
false
true
1.814547
0.521221
52.122147
0.534842
33.988376
0.049094
4.909366
0.299497
6.599553
0.373969
4.11276
0.335688
26.187574
false
false
2024-02-10
2024-07-13
0
vicgalle/CarbonBeagle-11B-truthy
vicgalle_Configurable-Hermes-2-Pro-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Hermes-2-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B
3cb5792509966a963645be24fdbeb2e7dc6cac15
22.565952
apache-2.0
6
8.031
true
false
false
true
1.497855
0.576251
57.625101
0.505484
30.509625
0.076284
7.628399
0.29698
6.263982
0.418365
10.06224
0.309757
23.306368
false
false
2024-05-02
2024-07-24
2
NousResearch/Meta-Llama-3-8B
vicgalle_Configurable-Llama-3.1-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Llama-3.1-8B-Instruct
133b3ab1a5385ff9b3d17da2addfe3fc1fd6f733
28.010111
apache-2.0
16
8.03
true
false
false
true
1.593219
0.83124
83.124
0.504476
29.661398
0.172961
17.296073
0.274329
3.243848
0.384542
5.934375
0.359209
28.800975
false
false
2024-07-24
2024-08-05
0
vicgalle/Configurable-Llama-3.1-8B-Instruct
vicgalle_Configurable-Yi-1.5-9B-Chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Yi-1.5-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Yi-1.5-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Yi-1.5-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Yi-1.5-9B-Chat
992cb2232caae78eff6a836b2e0642f7cbf6018e
26.162899
apache-2.0
2
8.829
true
false
false
true
1.883818
0.432345
43.234507
0.54522
35.334445
0.204683
20.468278
0.343121
12.416107
0.427115
12.022656
0.401513
33.501404
false
false
2024-05-12
2024-06-26
0
vicgalle/Configurable-Yi-1.5-9B-Chat
vicgalle_ConfigurableBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableBeagle-11B
bbc16dbf94b8e8a99bb3e2ada6755faf9c2990dd
22.622956
apache-2.0
3
10.732
true
false
false
true
1.759713
0.583445
58.344526
0.528659
32.392023
0.043051
4.305136
0.302013
6.935123
0.395302
7.379427
0.337434
26.381501
false
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableBeagle-11B
vicgalle_ConfigurableHermes-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableHermes-7B
1333a88eaf6591836b2d9825d1eaec7260f336c9
19.536295
apache-2.0
3
7.242
true
false
false
true
1.234564
0.54108
54.107989
0.457297
23.158164
0.047583
4.758308
0.276846
3.579418
0.405688
9.110938
0.302527
22.502955
false
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableHermes-7B
vicgalle_ConfigurableSOLAR-10.7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableSOLAR-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableSOLAR-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableSOLAR-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableSOLAR-10.7B
9d9baad88ea9dbaa61881f15e4f0d16e931033b4
20.15345
apache-2.0
2
10.732
true
false
false
true
1.355363
0.509956
50.995581
0.486681
27.45095
0.066465
6.646526
0.298658
6.487696
0.380479
5.193229
0.31732
24.14672
false
false
2024-03-10
2024-06-26
0
vicgalle/ConfigurableSOLAR-10.7B
vicgalle_Humanish-RP-Llama-3.1-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Humanish-RP-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Humanish-RP-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Humanish-RP-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Humanish-RP-Llama-3.1-8B
d27aa731db1d390a8d17b0a4565c9231ee5ae8b9
25.423199
apache-2.0
10
8.03
true
false
false
true
1.506901
0.666926
66.692598
0.510039
29.95856
0.151813
15.181269
0.286913
4.9217
0.395208
8.267708
0.347656
27.517361
false
false
2024-08-03
2024-08-03
0
vicgalle/Humanish-RP-Llama-3.1-8B
vicgalle_Merge-Mistral-Prometheus-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mistral-Prometheus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mistral-Prometheus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mistral-Prometheus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mistral-Prometheus-7B
a7083581b508ce83c74f9267f07024bd462e7161
16.586642
apache-2.0
1
7.242
true
false
false
true
1.260711
0.484801
48.480144
0.42014
18.410406
0.018127
1.812689
0.263423
1.789709
0.41
9.95
0.271692
19.076906
true
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mistral-Prometheus-7B (Merge)
vicgalle_Merge-Mixtral-Prometheus-8x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mixtral-Prometheus-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mixtral-Prometheus-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mixtral-Prometheus-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mixtral-Prometheus-8x7B
ba53ee5b52a81e56b01e919c069a0d045cfd4e83
24.768982
apache-2.0
2
46.703
true
true
false
true
7.348018
0.574403
57.440259
0.53515
34.651421
0.0929
9.29003
0.308725
7.829978
0.40975
9.585417
0.368351
29.816785
true
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mixtral-Prometheus-8x7B (Merge)
vicgalle_Roleplay-Llama-3-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Roleplay-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Roleplay-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Roleplay-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Roleplay-Llama-3-8B
57297eb57dcc2c116f061d9dda341094203da01b
24.020183
apache-2.0
37
8.03
true
false
false
true
2.252317
0.732022
73.202215
0.501232
28.554604
0.09139
9.138973
0.260906
1.454139
0.352885
1.677344
0.370844
30.093824
false
false
2024-04-19
2024-06-26
0
vicgalle/Roleplay-Llama-3-8B