Schema of the leaderboard table (one row per evaluated model; dtype and observed range or number of distinct values as reported by the dataset viewer):

| Column | Dtype | Range / classes |
| --- | --- | --- |
| eval_name | string | length 12–111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 3 values |
| Architecture | string | 62 values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52 |
| Hub License | string | 28 values |
| Hub ❤️ | int64 | 0–5.99k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.24–0.75 |
| BBH | float64 | 0.25–64.1 |
| MATH Lvl 5 Raw | float64 | 0–0.52 |
| MATH Lvl 5 | float64 | 0–52.4 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 480 values |
| Submission Date | string | 220 values |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
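The Average ⬆️ column appears to be the plain arithmetic mean of the six scaled benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal Python check of that assumed relationship against the first record (Daemontatox/CogitoZ):

```python
# Scaled benchmark scores for Daemontatox/CogitoZ, copied from the data below.
scaled_scores = {
    "IFEval": 39.672403,
    "BBH": 53.889571,
    "MATH Lvl 5": 46.299094,
    "GPQA": 19.35123,
    "MUSR": 19.940885,
    "MMLU-PRO": 51.028738,
}

# Assumed relationship: "Average ⬆️" is the unweighted mean of the six scaled scores.
average = sum(scaled_scores.values()) / len(scaled_scores)

print(average)  # ≈ 38.3636535, matching the reported Average ⬆️ of 38.363654
```

The same check holds for the other rows, which suggests the six scaled columns carry all the information behind the ranking column.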
Model metadata (the rendered Model link cell, which duplicates fullname, and the T emoji cell, which duplicates the Type emoji, are folded into the columns below; blank cells were absent from the dump, and the final record is truncated in the source):

| eval_name | Precision | Type | Weight type | Architecture | fullname | Model sha | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Daemontatox_CogitoZ_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/CogitoZ | 7079c4e915e6f549df9f1c3fa3a3260f9a835f48 | apache-2.0 | 0 | 32.764 | true | false | false | true | false | false | 2025-01-03 | 2025-01-21 | 1 | Daemontatox/CogitoZ (Merge) |
| Daemontatox_CogitoZ14_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/CogitoZ14 | df5320d7ff115f1e39e42506ed86a340eb2d12e0 | apache-2.0 | 1 | 14.77 | true | false | false | true | false | false | 2025-01-03 | 2025-01-07 | 1 | Daemontatox/CogitoZ14 (Merge) |
| Daemontatox_DocumentCogito_bfloat16 | bfloat16 | 🌸 multimodal | Original | MllamaForConditionalGeneration | Daemontatox/DocumentCogito | 23dcfc6bf91d84db1c977b151fd0923270d3e3ef | apache-2.0 | 1 | 10.67 | true | false | false | false | false | false | 2025-01-16 | 2025-01-16 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
| Daemontatox_Llama3.3-70B-CogniLink_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Daemontatox/Llama3.3-70B-CogniLink | 69f134f69472a84d104d3ef0c0b1dd200b9a599d | apache-2.0 | 1 | 70.554 | true | false | false | true | false | false | 2025-01-10 | 2025-01-21 | 1 | Daemontatox/Llama3.3-70B-CogniLink (Merge) |
| Daemontatox_MawaredT1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/MawaredT1 | 84a1d35d91b862a5cfc65988d4a0f65033b34c47 | apache-2.0 | 1 | 7.616 | true | false | false | false | false | false | 2025-01-02 | 2025-01-02 | 2 | arcee-ai/Meraj-Mini (Merge) |
| Daemontatox_Mini_QwQ_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/Mini_QwQ | e96df7ba6e989ee286da5d0b05a84525fdb56c53 | apache-2.0 | 1 | 7.616 | true | false | false | false | false | false | 2025-01-16 | 2025-01-16 | 3 | Qwen/Qwen2.5-7B |
| Daemontatox_NemoR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Daemontatox/NemoR | 688f1a4c3c69fe9c6440cad7919ab602ae61fa39 |  | 0 | 6.124 | false | false | false | false | false | false | 2024-12-31 |  | 0 | Removed |
| Daemontatox_PathFinderAI2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/PathFinderAI2.0 | bf8cfd82d4ceceb133058a78e1fe48436b50568a | apache-2.0 | 0 | 32.764 | true | false | false | true | false | false | 2024-12-30 | 2025-01-21 | 4 | Qwen/Qwen2.5-32B |
| Daemontatox_PathFinderAi3.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/PathFinderAi3.0 | 6c9aa17cee032523ce17de111d6865e33825cf1d | apache-2.0 | 1 | 32.764 | true | false | false | true | false | false | 2024-12-31 | 2025-01-21 | 1 | Daemontatox/PathFinderAI3.0 |
| Daemontatox_PathfinderAI_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/PathfinderAI | 14c6a91351006b7be0aff85292733470ff1b546d | apache-2.0 | 0 | 32.764 | true | false | false | false | false | false | 2024-12-24 | 2024-12-25 | 1 | Daemontatox/PathfinderAI (Merge) |
| Daemontatox_PathfinderAI_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/PathfinderAI | 7271fc7d08fca9b12c49b40af6245a982273a5c3 | apache-2.0 | 0 | 32.764 | true | false | false | true | false | false | 2024-12-24 | 2024-12-30 | 1 | Daemontatox/PathfinderAI (Merge) |
| Daemontatox_Phi-4-COT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Daemontatox/Phi-4-COT | bfc745d1a347b74843671eb50687c2e88c07ec7d | apache-2.0 | 0 | 14.66 | true | false | false | false | false | false | 2025-01-11 | 2025-01-11 | 2 | microsoft/phi-4 |
| Daemontatox_PixelParse_AI_bfloat16 | bfloat16 | 🌸 multimodal | Original | MllamaForConditionalGeneration | Daemontatox/PixelParse_AI | cc94604b91fc38513ca61f11dd9e1de1c3cc3b3d | apache-2.0 | 0 | 10.67 | true | false | false | false | false | false | 2024-12-27 | 2024-12-29 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
| Daemontatox_RA2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/RA2.0 | e1505dd5f9f2c8549cc852a1aca3ec545638e813 | apache-2.0 | 0 | 7.616 | true | false | false | false | false | false | 2025-01-01 | 2025-01-01 | 1 | Qwen/Qwen2-Math-7B-Instruct |
| Daemontatox_RA_Reasoner_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Daemontatox/RA_Reasoner | e799c6877cb70b6e78c1e337eaa58383040c8fa9 | apache-2.0 | 1 | 10.306 | true | false | false | false | false | false | 2024-12-20 | 2024-12-25 | 2 | tiiuae/Falcon3-10B-Base |
| Daemontatox_RA_Reasoner2.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Daemontatox/RA_Reasoner2.0 | 2a7477f34b171d2ae090e57abdbd997546dee242 | apache-2.0 | 0 | 10.306 | true | false | false | false | false | false | 2024-12-29 | 2024-12-29 | 3 | tiiuae/Falcon3-10B-Base |
| Daemontatox_ReasonTest_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/ReasonTest | 8e81cfddd97a13d81d6207eb72be8b730a7ca12f |  | 0 | 3.808 | false | false | false | false | false | false | 2024-12-31 |  | 0 | Removed |
| Daemontatox_SphinX_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/SphinX | 3da400d648b198211c81f61421bdcefac8073506 | apache-2.0 | 2 | 7.616 | true | false | false | false | false | false | 2024-12-21 | 2024-12-31 | 1 | Daemontatox/SphinX (Merge) |
| Daemontatox_Sphinx2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/Sphinx2.0 | 16abdfe2c214dc1da6bfe654b3d6716fcc8450e2 | apache-2.0 | 0 | 14.77 | true | false | false | true | false | false | 2024-12-30 | 2024-12-30 | 1 | Daemontatox/Sphinx2.0 (Merge) |
| Daemontatox_TinySphinx_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/TinySphinx | 62172ccb670864070581498fb12e7d2594ac3a77 |  | 0 | 0.247 | false | false | false | false | false | false | 2024-12-31 |  | 0 | Removed |
| Daemontatox_TinySphinx2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/TinySphinx2.0 | accc28aa00084fe89801baa0885c291d18a031ec |  | 0 | 0.247 | false | false | false | false | false | false | 2024-12-31 |  | 0 | Removed |
| Daemontatox_mini_Pathfinder_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Daemontatox/mini_Pathfinder | 20d12c01e831675a563c978900bcf291def5f7dd | apache-2.0 | 0 | 7.616 | true | false | false | true | false | false | 2025-01-20 | 2025-01-20 | 2 | deepseek-ai/DeepSeek-R1-Distill-Qwen-7B |
| Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | Dampfinchen/Llama-3.1-8B-Ultra-Instruct | 46662d14130cfd34f7d90816540794f24a301f86 | llama3 | 6 | 8.03 | true | false | false | true | true | false | 2024-08-26 | 2024-08-26 | 1 | Dampfinchen/Llama-3.1-8B-Ultra-Instruct (Merge) |
| Danielbrdz_Barcenas-10b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Danielbrdz/Barcenas-10b | 71884e96b88f6c86fca3a528ddf71c7745cb1d76 | apache-2.0 | 1 | 10.306 | true | false | false | false | false | false | 2025-01-04 | 2025-01-06 | 1 | Danielbrdz/Barcenas-10b (Merge) |
| Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO | b749dbcb19901b8fd0e9f38c923a24533569f895 | mit | 5 | 13.96 | true | false | false | true | false | false | 2024-06-15 | 2024-08-13 | 0 | Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO |
| Danielbrdz_Barcenas-14b-phi-4_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Danielbrdz/Barcenas-14b-phi-4 | 53891d973087e8909e1c9cc968b7bf222247e2ab | mit | 1 | 14.66 | true | false | false | false | false | false | 2025-01-19 | 2025-01-26 | 1 | Danielbrdz/Barcenas-14b-phi-4 (Merge) |
| Danielbrdz_Barcenas-Llama3-8b-ORPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | Danielbrdz/Barcenas-Llama3-8b-ORPO | 66c848c4526d3db1ec41468c0f73ac4448c6abe9 | other | 7 | 8.03 | true | false | false | true | false | false | 2024-04-29 | 2024-06-29 | 0 | Danielbrdz/Barcenas-Llama3-8b-ORPO |
| Danielbrdz_Barcenas-R1-Qwen-1.5b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | Danielbrdz/Barcenas-R1-Qwen-1.5b | 10e2f6bd3bb254f7e4e6857ab2799aaa9c855876 | mit | 0 | 1.777 | true | false | false | false | false | false | 2025-01-26 | 2025-01-26 | 1 | Danielbrdz/Barcenas-R1-Qwen-1.5b (Merge) |
| Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML | 56925fafe6a543e224db36864dd0927171542776 | apache-2.0 | 0 | 12.248 | true | false | false | false | false | false | 2024-09-04 | 2024-09-04 | 1 | mistralai/Mistral-Nemo-Base-2407 |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML | 029d84d4f4a618aa798490c046753b12801158e2 |  | 0 | 8.03 | false | false | false | false | false | false | 2024-09-14 |  | 0 | Removed |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0 | 9367c1273b0025793531fcf3a2c15416539f5d81 |  | 0 | 8.03 | false | false | false | false | false | false | 2024-09-20 |  | 0 | Removed |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1 | a6188cd1807d0d72e55adc371ddd198d7e9aa7ae |  | 0 | 8.03 | false | false | false | false | false | false | 2024-09-23 |  | 0 | Removed |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0 | 15a9988381fdba15281f1bd6b04c34f3f96120cc |  | 0 | 8.03 | false | false | false | true | false | false | 2024-09-30 |  | 0 | Removed |
| Dans-DiscountModels_Mistral-7b-v0.3-Test-E0.7_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7 | e91ad0ada3f0d906bacd3c0ad41da4f65ce77b08 |  | 0 | 7 | false | false | false | true | false | false | 2024-11-15 |  | 0 | Removed |
| Dans-DiscountModels_mistral-7b-test-merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | Dans-DiscountModels/mistral-7b-test-merged | 9db677cc43fb88852d952ef5914e919e65dd03eb | apache-2.0 | 0 | 7 | true | false | false | true | false | false | 2024-11-27 | 2024-11-30 | 1 | Dans-DiscountModels/mistral-7b-test-merged (Merge) |
| Darkknight535_OpenCrystal-12B-L3_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | Darkknight535/OpenCrystal-12B-L3 | 974d2d453afdde40f6a993601bbbbf9d97b43606 |  | 14 | 11.52 | false | false | false | false | false | false | 2024-08-25 | 2024-08-26 | 0 | Darkknight535/OpenCrystal-12B-L3 |
| DavidAU_Gemma-The-Writer-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Gemma2ForCausalLM | DavidAU/Gemma-The-Writer-9B | fcd6c9a1d0f6acc5bffc7df72cd8e996a9573937 |  | 3 | 10.159 | false | false | false | true | false | false | 2024-09-26 | 2025-01-11 | 1 | DavidAU/Gemma-The-Writer-9B (Merge) |
| DavidAU_Gemma-The-Writer-DEADLINE-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Gemma2ForCausalLM | DavidAU/Gemma-The-Writer-DEADLINE-10B |  |  |  |  |  |  |  |  |  |  |  |  |  |  |

Benchmark scores (raw and scaled) and CO₂ cost per evaluation, keyed by eval_name; the final record above is truncated before its scores in the source:

| eval_name | Average ⬆️ | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Daemontatox_CogitoZ_bfloat16 | 38.363654 | 4.431691 | 0.396724 | 39.672403 | 0.673449 | 53.889571 | 0.462991 | 46.299094 | 0.395134 | 19.35123 | 0.47926 | 19.940885 | 0.559259 | 51.028738 |
| Daemontatox_CogitoZ14_bfloat16 | 31.63167 | 3.489265 | 0.524169 | 52.41692 | 0.605901 | 43.096263 | 0.336103 | 33.610272 | 0.35151 | 13.534676 | 0.448781 | 14.83099 | 0.390708 | 32.300901 |
| Daemontatox_DocumentCogito_bfloat16 | 24.207851 | 0.706658 | 0.506434 | 50.643404 | 0.511156 | 29.793609 | 0.162387 | 16.238671 | 0.316275 | 8.836689 | 0.397313 | 8.597396 | 0.380236 | 31.137337 |
| Daemontatox_Llama3.3-70B-CogniLink_bfloat16 | 42.472599 | 16.189118 | 0.693104 | 69.31043 | 0.666833 | 52.124663 | 0.39577 | 39.577039 | 0.44547 | 26.06264 | 0.487698 | 21.395573 | 0.517287 | 46.365248 |
| Daemontatox_MawaredT1_bfloat16 | 26.625558 | 0.638479 | 0.41988 | 41.988036 | 0.521482 | 31.900788 | 0.14577 | 14.577039 | 0.334732 | 11.297539 | 0.470208 | 18.676042 | 0.471825 | 41.313904 |
| Daemontatox_Mini_QwQ_bfloat16 | 28.541462 | 0.658702 | 0.449706 | 44.970567 | 0.554899 | 36.210285 | 0.281722 | 28.172205 | 0.303691 | 7.158837 | 0.46825 | 17.264583 | 0.437251 | 37.472296 |
| Daemontatox_NemoR_bfloat16 | 18.036234 | 1.130663 | 0.228738 | 22.873753 | 0.519407 | 31.60552 | 0.080816 | 8.081571 | 0.327181 | 10.290828 | 0.390802 | 9.916927 | 0.329039 | 25.448803 |
| Daemontatox_PathFinderAI2.0_bfloat16 | 32.152926 | 9.239331 | 0.454102 | 45.410178 | 0.665823 | 52.956513 | 0.261329 | 26.132931 | 0.302013 | 6.935123 | 0.421563 | 10.961979 | 0.554688 | 50.520833 |
| Daemontatox_PathFinderAi3.0_bfloat16 | 40.106227 | 4.047362 | 0.427099 | 42.709899 | 0.688422 | 55.538355 | 0.483384 | 48.338369 | 0.408557 | 21.14094 | 0.480688 | 20.052604 | 0.575715 | 52.857196 |
| Daemontatox_PathfinderAI_float16 | 38.131314 | 4.540918 | 0.374517 | 37.451739 | 0.666785 | 52.646547 | 0.475831 | 47.583082 | 0.394295 | 19.239374 | 0.485833 | 20.829167 | 0.559342 | 51.037973 |
| Daemontatox_PathfinderAI_bfloat16 | 33.112212 | 4.72572 | 0.485501 | 48.550069 | 0.662734 | 52.322163 | 0.277946 | 27.794562 | 0.309564 | 7.941834 | 0.425594 | 11.599219 | 0.554189 | 50.465426 |
| Daemontatox_Phi-4-COT_bfloat16 | 25.44906 | 0.857577 | 0.179303 | 17.930314 | 0.617293 | 45.34299 | 0.183535 | 18.353474 | 0.33557 | 11.409396 | 0.453 | 15.158333 | 0.500499 | 44.499852 |
| Daemontatox_PixelParse_AI_bfloat16 | 22.874708 | 0.70011 | 0.43829 | 43.82904 | 0.503431 | 29.031857 | 0.14426 | 14.425982 | 0.323826 | 9.8434 | 0.405188 | 9.248438 | 0.377826 | 30.869533 |
| Daemontatox_RA2.0_bfloat16 | 22.489864 | 0.663688 | 0.378389 | 37.838934 | 0.488869 | 28.471838 | 0.339124 | 33.912387 | 0.305369 | 7.38255 | 0.409125 | 9.373958 | 0.261636 | 17.959515 |
| Daemontatox_RA_Reasoner_float16 | 29.019181 | 0.779073 | 0.559215 | 55.92151 | 0.605369 | 43.073008 | 0.200906 | 20.090634 | 0.331376 | 10.850112 | 0.396354 | 7.510938 | 0.43002 | 36.668883 |
| Daemontatox_RA_Reasoner2.0_float16 | 29.001903 | 0.786756 | 0.536634 | 53.663391 | 0.606247 | 43.070069 | 0.228852 | 22.885196 | 0.324664 | 9.955257 | 0.388354 | 7.177604 | 0.435339 | 37.2599 |
| Daemontatox_ReasonTest_bfloat16 | 24.901536 | 0.670315 | 0.407965 | 40.796531 | 0.543526 | 35.375037 | 0.156344 | 15.634441 | 0.318792 | 9.17226 | 0.431542 | 12.076042 | 0.427194 | 36.354905 |
| Daemontatox_SphinX_bfloat16 | 28.829966 | 0.652158 | 0.572504 | 57.250429 | 0.544058 | 34.712451 | 0.245468 | 24.546828 | 0.297819 | 6.375839 | 0.4405 | 12.695833 | 0.436586 | 37.398419 |
| Daemontatox_Sphinx2.0_bfloat16 | 31.45048 | 1.796325 | 0.712313 | 71.231333 | 0.647284 | 49.396752 | 0.02719 | 2.719033 | 0.293624 | 5.816555 | 0.426031 | 13.053906 | 0.518368 | 46.485298 |
| Daemontatox_TinySphinx_float16 | 7.449644 | 0.503628 | 0.25669 | 25.669003 | 0.330984 | 6.546576 | 0 | 0 | 0.27349 | 3.131991 | 0.33276 | 1.595052 | 0.169797 | 7.755245 |
| Daemontatox_TinySphinx2.0_bfloat16 | 7.483222 | 0.502086 | 0.253517 | 25.351733 | 0.316841 | 5.004029 | 0.026435 | 2.643505 | 0.268456 | 2.46085 | 0.33825 | 1.314583 | 0.173122 | 8.124631 |
| Daemontatox_mini_Pathfinder_bfloat16 | 15.605224 | 0.774131 | 0.296158 | 29.615753 | 0.395569 | 16.030028 | 0.219033 | 21.903323 | 0.258389 | 1.118568 | 0.378094 | 4.861719 | 0.280918 | 20.10195 |
| Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16 | 29.127051 | 0.836479 | 0.808109 | 80.810915 | 0.525753 | 32.494587 | 0.15861 | 15.861027 | 0.291946 | 5.592841 | 0.400323 | 8.607031 | 0.382563 | 31.395907 |
| Danielbrdz_Barcenas-10b_float16 | 31.644384 | 0.809655 | 0.660781 | 66.078117 | 0.612083 | 43.769695 | 0.201662 | 20.166163 | 0.341443 | 12.192394 | 0.413469 | 10.316927 | 0.436087 | 37.343011 |
| Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16 | 31.738448 | 1.572315 | 0.479906 | 47.990554 | 0.653618 | 51.029418 | 0.193353 | 19.335347 | 0.326342 | 10.178971 | 0.48075 | 20.527083 | 0.472324 | 41.369311 |
| Danielbrdz_Barcenas-14b-phi-4_float16 | 28.695704 | 0.873927 | 0.049759 | 4.975908 | 0.67693 | 53.257692 | 0.255287 | 25.528701 | 0.383389 | 17.785235 | 0.509677 | 24.242969 | 0.517453 | 46.383717 |
| Danielbrdz_Barcenas-Llama3-8b-ORPO_float16 | 26.519005 | 0.774159 | 0.737243 | 73.724274 | 0.498656 | 28.600623 | 0.06571 | 6.570997 | 0.307047 | 7.606264 | 0.418958 | 11.169792 | 0.382979 | 31.44208 |
| Danielbrdz_Barcenas-R1-Qwen-1.5b_float16 | 13.741578 | 0.611253 | 0.242801 | 24.280132 | 0.35872 | 10.49126 | 0.265861 | 26.586103 | 0.303691 | 7.158837 | 0.354125 | 3.832292 | 0.190908 | 10.100842 |
| Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16 | 12.913452 | 3.234644 | 0.211102 | 21.11021 | 0.479186 | 26.046417 | 0.005287 | 0.528701 | 0.280201 | 4.026846 | 0.360635 | 5.71276 | 0.280502 | 20.055777 |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML_bfloat16 | 13.49618 | 0.798569 | 0.082508 | 8.250775 | 0.473817 | 26.336394 | 0.053625 | 5.362538 | 0.294463 | 5.928412 | 0.391823 | 9.677865 | 0.32879 | 25.421099 |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.0_bfloat16 | 12.974202 | 0.814699 | 0.06682 | 6.682048 | 0.477477 | 26.737652 | 0.061178 | 6.117825 | 0.286074 | 4.809843 | 0.378583 | 8.122917 | 0.328374 | 25.374926 |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.1_bfloat16 | 13.311583 | 0.790589 | 0.091051 | 9.105063 | 0.474865 | 26.412551 | 0.057402 | 5.740181 | 0.291107 | 5.480984 | 0.38249 | 7.811198 | 0.327876 | 25.319518 |
| Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.2.0_bfloat16 | 18.590919 | 0.843717 | 0.506409 | 50.640855 | 0.462426 | 24.734771 | 0.043807 | 4.380665 | 0.293624 | 5.816555 | 0.364448 | 3.75599 | 0.29995 | 22.216681 |
| Dans-DiscountModels_Mistral-7b-v0.3-Test-E0.7_bfloat16 | 19.144688 | 0.437886 | 0.512354 | 51.235389 | 0.475022 | 26.820762 | 0.032477 | 3.247734 | 0.296141 | 6.152125 | 0.40051 | 8.030469 | 0.274435 | 19.381649 |
| Dans-DiscountModels_mistral-7b-test-merged_bfloat16 | 22.224397 | 1.89713 | 0.6678 | 66.780033 | 0.489817 | 28.941005 | 0.053625 | 5.362538 | 0.294463 | 5.928412 | 0.375396 | 4.357813 | 0.297789 | 21.976581 |
| Darkknight535_OpenCrystal-12B-L3_bfloat16 | 20.672888 | 2.012285 | 0.407091 | 40.709096 | 0.52226 | 31.844491 | 0.089124 | 8.912387 | 0.306208 | 7.494407 | 0.365656 | 5.740365 | 0.364029 | 29.336584 |
| DavidAU_Gemma-The-Writer-9B_bfloat16 | 19.110974 | 1.984018 | 0.174032 | 17.403157 | 0.590544 | 41.272319 | 0 | 0 | 0.345638 | 12.751678 | 0.409875 | 10.134375 | 0.397939 | 33.104314 |
DavidAU/Gemma-The-Writer-DEADLINE-10B
69f38a595090ce6ba154b21d9d8b4c690f02b74e
20.304536
0
10.952
false
false
false
true
2.598795
0.233158
23.315802
0.589609
41.019199
0.016616
1.661631
0.342282
12.304251
0.418865
10.791406
0.394614
32.734929
false
false
2024-10-27
2025-01-11
1
DavidAU/Gemma-The-Writer-DEADLINE-10B (Merge)
DavidAU_Gemma-The-Writer-J.GutenBerg-10B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-J.GutenBerg-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-J.GutenBerg-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-J.GutenBerg-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/Gemma-The-Writer-J.GutenBerg-10B
7318b14104e3eb06c8e571ec8a51c7f027834d74
21.444398
0
10.034
false
false
false
true
2.518673
0.285789
28.578948
0.590942
41.155991
0.037764
3.776435
0.338087
11.744966
0.417594
10.665885
0.394697
32.744164
false
false
2024-10-30
2025-01-11
1
DavidAU/Gemma-The-Writer-J.GutenBerg-10B (Merge)
DavidAU_Gemma-The-Writer-Mighty-Sword-9B_float16
float16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-Mighty-Sword-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-Mighty-Sword-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-Mighty-Sword-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/Gemma-The-Writer-Mighty-Sword-9B
39e655b61e11cd9a53529c6bdf0e6357b5be6b2c
28.84903
2
10.159
false
false
false
true
1.421491
0.752755
75.275491
0.591196
41.39261
0
0
0.348154
13.087248
0.411177
10.363802
0.396775
32.97503
false
false
2024-12-25
2025-01-11
1
DavidAU/Gemma-The-Writer-Mighty-Sword-9B (Merge)
DavidAU_Gemma-The-Writer-N-Restless-Quill-10B-Uncensored_float16
float16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-N-Restless-Quill-10B-Uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored
1138d6b3e3527b75e7331044b1f0589a90667e8d
27.865146
3
10.034
false
false
false
true
1.746699
0.707093
70.709274
0.592229
40.850091
0.000755
0.075529
0.341443
12.192394
0.416323
10.407031
0.396609
32.95656
false
false
2024-10-30
2025-01-11
1
DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored (Merge)
DavidAU_L3-DARKEST-PLANET-16.5B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-DARKEST-PLANET-16.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-DARKEST-PLANET-16.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-DARKEST-PLANET-16.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-DARKEST-PLANET-16.5B
37545fbc229061956c1801968c33c5b187512c41
24.290232
4
16.537
false
false
false
true
2.11252
0.623062
62.306236
0.523044
31.776241
0.09139
9.138973
0.295302
6.040268
0.375365
7.253906
0.363032
29.225768
false
false
2024-10-11
2025-01-11
1
DavidAU/L3-DARKEST-PLANET-16.5B (Merge)
DavidAU_L3-Dark-Planet-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Dark-Planet-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Dark-Planet-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Dark-Planet-8B
462c9307ba4cfcb0c1edcceac5e06f4007bc803e
20.519537
6
8.03
false
false
false
false
0.939141
0.413411
41.341086
0.508408
29.789627
0.085347
8.534743
0.300336
6.711409
0.361594
6.332552
0.37367
30.407801
false
false
2024-09-05
2024-09-12
1
DavidAU/L3-Dark-Planet-8B (Merge)
DavidAU_L3-Jamet-12.2B-MK.V-Blackroot-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Jamet-12.2B-MK.V-Blackroot-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct
db4ae3d7b608fd0e7490d2fcfa0436e56e21af33
17.857043
0
12.174
false
false
false
false
1.437522
0.3962
39.619986
0.476572
25.869793
0.040785
4.07855
0.278523
3.803132
0.401969
8.31276
0.329122
25.458038
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct (Merge)
DavidAU_L3-Lumimaid-12.2B-v0.1-OAS-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Lumimaid-12.2B-v0.1-OAS-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct
65a9e957dc4211aa3d87fdf588767823af5cde3f
17.743439
1
12.174
false
false
false
false
1.424707
0.392403
39.240327
0.469302
24.504816
0.040785
4.07855
0.276846
3.579418
0.419427
11.261719
0.314162
23.795804
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct (Merge)
DavidAU_L3-SMB-Instruct-12.2B-F32_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-SMB-Instruct-12.2B-F32" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-SMB-Instruct-12.2B-F32</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-SMB-Instruct-12.2B-F32-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-SMB-Instruct-12.2B-F32
ac5e205a41b17a7b05b1b62f352aacc7e65b2f13
18.863875
1
12.174
false
false
false
false
1.382397
0.430322
43.032155
0.478641
26.130957
0.044562
4.456193
0.281879
4.250559
0.408729
9.624479
0.3312
25.688904
false
false
2024-08-25
2024-09-12
1
DavidAU/L3-SMB-Instruct-12.2B-F32 (Merge)
DavidAU_L3-Stheno-Maid-Blackroot-Grand-HORROR-16B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B
7b626e50b6c35fcb064b8b039fcf30eae01c3fae
17.096786
0
16.537
false
false
false
false
2.922799
0.343893
34.389309
0.473633
26.692021
0.015861
1.586103
0.270973
2.796421
0.403115
8.55599
0.357048
28.560875
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B (Merge)
DavidAU_L3-Stheno-v3.2-12.2B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3-Stheno-v3.2-12.2B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3-Stheno-v3.2-12.2B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-v3.2-12.2B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3-Stheno-v3.2-12.2B-Instruct
8271fc32a601a4fa5efbe58c41a0ef4181ad8836
18.790033
1
12.174
false
false
false
false
1.3977
0.402795
40.279459
0.484598
27.369623
0.053625
5.362538
0.275168
3.355705
0.41025
10.314583
0.334525
26.058289
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Stheno-v3.2-12.2B-Instruct (Merge)
DavidAU_L3.1-Dark-Planet-SpinFire-Uncensored-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3.1-Dark-Planet-SpinFire-Uncensored-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B
9e4ae1310a0d2c82d50fe2aedc94ef084901ac48
24.710302
4
8.03
false
false
false
true
0.630282
0.70427
70.427023
0.526091
32.461783
0.0929
9.29003
0.279362
3.914989
0.354125
2.498958
0.367021
29.669031
false
false
2024-11-10
2025-01-11
1
DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter0
bc1a37920fb5e3cb64a71a4deda649f33fecb95d
3.623817
llama3.2
0
1.236
true
false
false
false
0.36965
0.150677
15.067687
0.293008
2.100828
0
0
0.253356
0.447427
0.356542
2.734375
0.112533
1.392583
false
false
2024-12-27
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter0 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter0
2c95189201f94c64fcf4c9a7edc4777741f18999
3.884983
llama3.2
0
1.236
true
false
false
false
0.354476
0.154923
15.492338
0.293726
2.330669
0
0
0.25755
1.006711
0.356479
3.059896
0.112783
1.420287
false
false
2024-12-27
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter0 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter1
8c632ae68bd385af2e2270933326edbcd0044e8c
3.714211
llama3.2
0
1.236
true
false
false
false
0.36086
0.157546
15.754642
0.294025
2.433772
0
0
0.250839
0.111857
0.364604
2.675521
0.111785
1.309471
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter1 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter2
36c9b3fd7196c6bac0fbe8f1e9c4f4fb3bcc993a
3.570029
llama3.2
0
1.236
true
false
false
false
0.353538
0.137613
13.761265
0.298034
3.157343
0
0
0.254195
0.559284
0.355302
2.51276
0.112866
1.429521
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter2 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter3
108557f0db9b6f7c35ba8b0d094ebd81be6fe9fd
3.479848
llama3.2
0
1.236
true
false
false
false
0.727929
0.133591
13.359109
0.297523
3.139502
0
0
0.253356
0.447427
0.349969
2.51276
0.112783
1.420287
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter3 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Llama-3.2-1B-SPIN-iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Llama-3.2-1B-SPIN-iter3
ae511fd6bae53efd2656dd3cc6fc87d0fc56356c
3.61369
llama3.2
0
1.236
true
false
false
false
0.36323
0.132392
13.239205
0.297224
3.028514
0
0
0.264262
1.901566
0.352667
2.083333
0.112866
1.429521
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter3 (Merge)
DavieLion_Lllma-3.2-1B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DavieLion/Lllma-3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavieLion/Lllma-3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Lllma-3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DavieLion/Lllma-3.2-1B
5e0d3bc7ca705a41f897a870efd4ff6ce455e20c
3.819039
llama3.2
0
1.236
true
false
false
false
0.366875
0.160144
16.014397
0.296469
2.438123
0
0
0.244128
0
0.357813
3.059896
0.112616
1.401817
false
false
2024-12-27
2024-12-27
0
DavieLion/Lllma-3.2-1B
DebateLabKIT_Llama-3.1-Argunaut-1-8B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DebateLabKIT__Llama-3.1-Argunaut-1-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT
e9d7396bc0fa3d1ff4c1f4b1a0d81a1d1a7e977c
23.559679
llama3.1
5
8.03
true
false
false
true
0.716978
0.551921
55.192112
0.482383
27.187827
0.111782
11.178248
0.283557
4.474273
0.450302
15.854427
0.347241
27.471188
false
false
2024-12-31
2025-01-02
1
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT (Merge)
Deci_DeciLM-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B
c3c9f4226801dc0433f32aebffe0aac68ee2f051
14.960537
apache-2.0
225
7.044
true
false
false
false
0.642137
0.281295
28.129474
0.442286
21.25273
0.024924
2.492447
0.295302
6.040268
0.435854
13.048438
0.269199
18.799867
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B
Deci_DeciLM-7B-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B-instruct
4adc7aa9efe61b47b0a98b2cc94527d9c45c3b4f
17.457504
apache-2.0
96
7.044
true
false
false
true
0.638649
0.488024
48.8024
0.458975
23.887149
0.029456
2.945619
0.28943
5.257271
0.388417
5.985417
0.260805
17.867169
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B-instruct
DeepAutoAI_Explore_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.1-8B-Inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.1-8B-Inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.1-8B-Inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.1-8B-Inst
9752180fafd8f584625eb649c0cba36b91bdc3ce
28.788231
apache-2.0
0
8.03
true
false
false
true
1.750239
0.779483
77.948288
0.511742
30.393263
0.192598
19.259819
0.283557
4.474273
0.390958
9.636458
0.379156
31.017287
false
false
2024-09-21
2024-10-09
1
DeepAutoAI/Explore_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst
9fd790df246b8979c02173f7698819a7805fb04e
13.708555
apache-2.0
0
1.236
true
false
false
true
0.891846
0.564886
56.488561
0.350481
8.292274
0.063444
6.344411
0.255872
0.782998
0.318344
1.359635
0.180851
8.983452
false
false
2024-10-07
2024-10-09
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
9509dee6b01fff1a11dc26cf58d7eecbe3d9d9c4
13.182851
1
1.236
false
false
false
true
0.467189
0.559715
55.971489
0.336509
7.042772
0.049094
4.909366
0.263423
1.789709
0.310313
0.455729
0.180352
8.928044
false
false
2024-10-08
2024-10-08
0
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1
3f8b0fb6dcc1e9725ba52dd086241d5d9e413100
10.619319
apache-2.0
0
1.236
true
false
false
true
0.469966
0.499889
49.988918
0.314148
4.25778
0.01284
1.283988
0.244966
0
0.378094
5.195052
0.126912
2.990174
false
false
2024-10-08
2024-10-08
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1
158b977bca89e073871e2313740a7c75eb1291af
14.211124
apache-2.0
0
1.236
true
false
false
true
0.912912
0.584419
58.441934
0.351266
8.818154
0.06571
6.570997
0.262584
1.677852
0.311708
0.663542
0.181848
9.094267
false
false
2024-10-09
2024-10-17
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 (Merge)
DeepAutoAI_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/DeepAutoAI/causal_gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/causal_gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__causal_gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/causal_gpt2
995f029f6645dde1ef830406001754b904c49775
5.981707
0
0.124
false
false
false
false
0.125865
0.181277
18.127679
0.302571
2.633344
0.002266
0.226586
0.260067
1.342282
0.426958
12.103125
0.113115
1.457225
false
false
2024-10-17
2024-10-17
0
DeepAutoAI/causal_gpt2
DeepAutoAI_d2nwg_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_Llama-3.1-8B-Instruct-v0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
8bad8800d04a06f3f906728ee223cab2f50453a0
27.727687
0
8.03
false
false
false
true
0.856178
0.789275
78.927468
0.508041
30.510076
0.083837
8.383686
0.291946
5.592841
0.413469
10.983594
0.387716
31.968454
false
false
2024-09-10
2024-09-10
0
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_d2nwg_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/d2nwg_causal_gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/d2nwg_causal_gpt2
eab065cba5a7a9b08f8b264d61d504c4ecbb611b
6.292853
0
0.124
false
false
false
false
0.129907
0.191618
19.161824
0.30269
2.850574
0.003776
0.377644
0.25755
1.006711
0.429719
12.68151
0.11511
1.678856
false
false
2024-10-18
2024-10-18
0
DeepAutoAI/d2nwg_causal_gpt2
DeepAutoAI_d2nwg_causal_gpt2_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/d2nwg_causal_gpt2_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/d2nwg_causal_gpt2_v1
3f40c3dcb3eb591dec80ff03573eec7928a7feaa
6.381801
0
0.124
false
false
false
false
0.230406
0.198862
19.886235
0.29919
2.387278
0.001511
0.151057
0.258389
1.118568
0.433688
13.244271
0.113531
1.503398
false
false
2024-10-18
2024-10-19
0
DeepAutoAI/d2nwg_causal_gpt2_v1
DeepAutoAI_ldm_soup_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst
0f04c5ad830f8ae0828191a4670fd4ba361b63d2
28.763892
apache-2.0
3
8.03
true
false
false
true
1.704023
0.803263
80.326312
0.512117
31.101628
0.123112
12.311178
0.28943
5.257271
0.416135
11.516927
0.38863
32.070035
false
false
2024-09-16
2024-10-09
1
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
210a97b4dadbda63cc9fe459e8415d4cd3bbaf99
28.375728
0
8.03
false
false
false
true
0.860455
0.78895
78.894999
0.512518
31.162649
0.110272
11.02719
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
false
2024-09-14
2024-09-15
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
ecd140c95985b4292c896e25a94a7629d2924ad1
28.375728
0
8.03
false
false
false
true
0.828446
0.78895
78.894999
0.512518
31.162649
0.110272
11.02719
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
false
2024-09-15
2024-09-16
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
DeepMount00_Lexora-Lite-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Lexora-Lite-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Lexora-Lite-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Lite-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Lexora-Lite-3B
2cf39db7ecac17edca0bf4e0973b7fb58c40c22c
21.618583
0
3.086
false
false
false
true
2.2935
0.572104
57.210424
0.479713
27.199848
0.053625
5.362538
0.280201
4.026846
0.395552
7.877344
0.352311
28.034501
false
false
2024-09-19
2024-10-20
0
DeepMount00/Lexora-Lite-3B
DeepMount00_Lexora-Medium-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Lexora-Medium-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Lexora-Medium-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Medium-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Lexora-Medium-7B
c53d166f4f2996a5b7f161529f1ea6548b54a2b2
24.653915
apache-2.0
5
7.616
true
false
false
true
1.734911
0.410338
41.03379
0.514484
32.695331
0.151057
15.10574
0.305369
7.38255
0.443948
14.760156
0.432513
36.945922
false
false
2024-09-24
2024-09-24
0
DeepMount00/Lexora-Medium-7B
DeepMount00_Llama-3-8b-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3-8b-Ita</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3-8b-Ita-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3-8b-Ita
d40847d2981b588690c1dc21d5157d3f4afb2978
26.733876
llama3
24
8.03
true
false
false
true
0.778258
0.75303
75.302974
0.493577
28.077746
0.062689
6.268882
0.305369
7.38255
0.426771
11.679688
0.385223
31.691415
false
false
2024-05-01
2024-06-27
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Llama-3.1-8b-ITA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3.1-8b-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3.1-8b-ITA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-ITA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3.1-8b-ITA
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
28.228098
6
8.03
false
false
false
true
2.507574
0.791673
79.167276
0.510936
30.933181
0.108761
10.876133
0.287752
5.033557
0.413594
11.399219
0.387633
31.95922
false
false
2024-08-13
2024-10-28
2
meta-llama/Meta-Llama-3.1-8B
DeepMount00_Llama-3.1-Distilled_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3.1-Distilled" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3.1-Distilled</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-Distilled-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3.1-Distilled
0a94c7ddb196107e8bf1b02e31488ff8c17b9eb3
28.838347
llama3
0
8.03
true
false
false
true
0.839
0.784379
78.437878
0.510088
30.841421
0.155589
15.558912
0.303691
7.158837
0.405812
10.126562
0.378158
30.906472
false
false
2024-10-25
2024-10-25
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Qwen2.5-7B-Instruct-MathCoder_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Qwen2.5-7B-Instruct-MathCoder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Qwen2.5-7B-Instruct-MathCoder</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2.5-7B-Instruct-MathCoder-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Qwen2.5-7B-Instruct-MathCoder
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.384322
0
7.616
false
false
false
true
1.29268
0.153025
15.302508
0.299844
2.636671
0
0
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
false
2024-10-24
0
Removed
DeepMount00_mergekit-ties-okvgjfz_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/mergekit-ties-okvgjfz" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/mergekit-ties-okvgjfz</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__mergekit-ties-okvgjfz-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/mergekit-ties-okvgjfz
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.384322
0
7.616
false
false
false
true
1.288821
0.153025
15.302508
0.299844
2.636671
0
0
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
false
2024-10-24
0
Removed
Delta-Vector_Baldur-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Baldur-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Baldur-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Baldur-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Baldur-8B
97f5d321a8346551a5ed704997dd1e93c59883f3
24.128795
5
8
false
false
false
false
2.294232
0.478182
47.818233
0.530584
32.541834
0.139728
13.97281
0.302013
6.935123
0.437156
14.011198
0.365442
29.493573
false
false
2024-09-23
2024-10-06
1
Delta-Vector/Baldur-8B (Merge)
Delta-Vector_Control-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Control-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Control-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Control-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Control-8B
c8743ee5ca0efd31aa9dd1bd14c770430c85a6c1
25.03285
2
8.03
false
false
false
true
0.677486
0.548973
54.897339
0.504146
29.155078
0.137462
13.746224
0.316275
8.836689
0.435542
13.209375
0.373172
30.352394
false
false
2024-10-23
2024-11-25
0
Delta-Vector/Control-8B
Delta-Vector_Control-8B-V1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Control-8B-V1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Control-8B-V1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Control-8B-V1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Control-8B-V1.1
6d4593645d1c4dc61d1c223922f635d79283d22b
24.582157
0
8.03
false
false
false
true
0.640598
0.569656
56.965629
0.499284
28.72585
0.124622
12.462236
0.307047
7.606264
0.423729
11.232813
0.374501
30.500148
false
false
2024-10-30
2024-11-25
0
Delta-Vector/Control-8B-V1.1
Delta-Vector_Darkens-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Darkens-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Darkens-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Darkens-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Darkens-8B
e82be0389bfcecd1998dba1c3bb35b8d95d01bf2
18.874475
4
8.414
false
false
false
false
1.199743
0.254766
25.476624
0.525059
32.883795
0.055136
5.513595
0.324664
9.955257
0.410552
9.01901
0.373587
30.398567
false
false
2024-09-22
2024-10-06
1
Delta-Vector/Darkens-8B (Merge)
Delta-Vector_Henbane-7b-attempt2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Henbane-7b-attempt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Henbane-7b-attempt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Henbane-7b-attempt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Henbane-7b-attempt2
448ef54e5af03e13f16f3db8ad8d1481479ac12e
23.801362
apache-2.0
1
7
true
false
false
true
1.133838
0.415734
41.573359
0.506118
30.865849
0.226586
22.65861
0.290268
5.369128
0.397344
8.701302
0.402759
33.639923
false
false
2024-09-13
2024-10-11
1
Qwen/Qwen2-7B
Delta-Vector_Odin-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Odin-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Odin-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Odin-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Odin-9B
9ff20f5dd427e751ada834319bfdd9ea60b5e89c
24.914172
4
9.242
false
false
false
false
2.708162
0.369197
36.919706
0.544025
34.832423
0.141239
14.123867
0.341443
12.192394
0.464781
17.564323
0.404671
33.85232
false
false
2024-09-27
2024-10-06
0
Delta-Vector/Odin-9B
Delta-Vector_Tor-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Tor-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Tor-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Tor-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Tor-8B
d30a7a121c2ef5dc14004cfdf3fd13208dfbdb4f
18.419467
2
8.414
false
false
false
false
1.252053
0.238155
23.815476
0.520911
31.738224
0.059668
5.966767
0.323826
9.8434
0.409219
8.81901
0.373005
30.333924
false
false
2024-09-21
2024-10-06
1
Delta-Vector/Tor-8B (Merge)
DoppelReflEx_MN-12B-Kakigori_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Kakigori" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Kakigori</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Kakigori-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Kakigori
43cdb3d3df47f5d4ed8386f411859b9d72ea9017
21.622204
1
12.248
false
false
false
false
0.79783
0.35933
35.932991
0.541553
34.331347
0.114804
11.480363
0.324664
9.955257
0.405219
9.352344
0.358128
28.680925
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Kakigori (Merge)
DoppelReflEx_MN-12B-LilithFrame_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame
e3e8cce8267613d5c2ff68884aaeac8ab9b39e93
20.389539
0
12.248
false
false
false
false
0.928253
0.450955
45.095458
0.494426
27.492064
0.059668
5.966767
0.319631
9.284116
0.389563
9.428646
0.325632
25.070183
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-LilithFrame (Merge)
DoppelReflEx_MN-12B-LilithFrame_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame
e3e8cce8267613d5c2ff68884aaeac8ab9b39e93
20.02254
0
12.248
false
false
false
false
0.928005
0.436042
43.604192
0.495613
27.653498
0.058912
5.891239
0.32047
9.395973
0.38426
8.732552
0.32372
24.857787
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-LilithFrame (Merge)
DoppelReflEx_MN-12B-LilithFrame-Experiment-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame-Experiment-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame-Experiment-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-Experiment-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame-Experiment-2
75316e8ed913cf62482f36713a007d471813bb0e
20.234215
0
12.248
false
false
false
false
0.919568
0.429947
42.994699
0.498267
28.111183
0.061178
6.117825
0.325503
10.067114
0.380448
8.822656
0.327626
25.291814
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-LilithFrame-Experiment-2 (Merge)
DoppelReflEx_MN-12B-Mimicore-GreenSnake_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-GreenSnake" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-GreenSnake</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-GreenSnake-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-GreenSnake
c1aee5ad2926129a5299e264a33c3890eb83cb8f
24.863956
cc-by-nc-4.0
2
12.248
true
false
false
false
0.844301
0.478007
47.800724
0.548051
35.390601
0.129909
12.990937
0.324664
9.955257
0.430583
13.589583
0.36511
29.456634
true
false
2025-01-27
2025-01-27
1
DoppelReflEx/MN-12B-Mimicore-GreenSnake (Merge)
DoppelReflEx_MN-12B-Mimicore-Orochi_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi
59515c9a5224bb45a1d2a7ea141e37a5ab9a9021
24.261991
2
12.248
false
false
false
false
0.760481
0.462045
46.204515
0.549774
35.28323
0.112538
11.253776
0.312919
8.389262
0.454583
17.25625
0.344664
27.184914
false
false
2025-01-28
2025-01-28
1
DoppelReflEx/MN-12B-Mimicore-Orochi (Merge)
DoppelReflEx_MN-12B-Mimicore-Orochi-v2-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v2-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment
b0140973cf249ecb2ba399f1174f8229c91dc363
18.859902
0
12.248
false
false
false
false
0.551141
0.284241
28.424137
0.532253
32.774711
0.004532
0.453172
0.297819
6.375839
0.457375
18.205208
0.342337
26.926345
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-Orochi-v3-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v3-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment
d1f9bd2cd64564217f59802648a941a57b2b9733
21.558445
0
12.248
false
false
false
false
0.667464
0.410163
41.016281
0.543782
34.56948
0.056647
5.664653
0.292785
5.704698
0.443792
15.773958
0.339594
26.621602
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-Orochi-v4-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v4-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment
41bc20297c95adc8bc1d2e993110f671907f0c32
23.27366
0
12.248
false
false
false
false
0.943227
0.43207
43.207024
0.54625
35.299068
0.102719
10.271903
0.305369
7.38255
0.444938
15.483854
0.351978
27.997562
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-WhiteSnake_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake
ca84b8ab989a61658fc17e270b7344ed3885071f
24.819386
cc-by-nc-4.0
3
12.248
true
false
false
false
0.799397
0.44376
44.376033
0.560461
36.89971
0.117069
11.706949
0.317953
9.060403
0.456875
17.342708
0.365775
29.530511
true
false
2025-01-27
2025-01-27
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake (Merge)
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1
f1fb881039e54ac80d84298b9054773a2bd72d21
17.834454
0
12.248
false
false
false
false
0.938459
0.390904
39.090391
0.486564
27.077964
0.016616
1.661631
0.305369
7.38255
0.378958
8.303125
0.31142
23.491061
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1 (Merge)
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2
18.10079
0
12.248
false
false
false
false
1.748362
0.312393
31.239334
0.51264
30.66572
0.033233
3.323263
0.296141
6.152125
0.397469
11.516927
0.331366
25.707373
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2 (Merge)
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3
12985da577e2bdcba11ad75b4aad6cf07cb67b51
18.569295
0
12.248
false
false
false
false
0.907159
0.430222
43.022181
0.48118
26.321395
0.027946
2.794562
0.302013
6.935123
0.368417
7.91875
0.319814
24.423759
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3 (Merge)
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4
b7ec319e84b66dba6c620b9b01dc579cad96eb8d
20.636234
0
12.248
false
false
false
false
0.86356
0.424052
42.405152
0.518475
31.422947
0.044562
4.456193
0.310403
8.053691
0.400198
11.458073
0.334192
26.02135
false
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4 (Merge)
DreadPoor_Again-8B-Model_Stock_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Again-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Again-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Again-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Again-8B-Model_Stock
10052b086c6896ccd9d26522c45d348f1607c33c
25.914271
0
4.015
false
false
false
true
0.680327
0.672421
67.24214
0.53098
33.259461
0.114804
11.480363
0.301174
6.823266
0.398677
8.701302
0.351812
27.979093
false
false
2024-12-17
0
Removed