LG-AI-EXAONE committed
Commit b6a337b · 0 Parent(s)

Initial commit
.gitattributes ADDED
@@ -0,0 +1,37 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ *.gguf filter=lfs diff=lfs merge=lfs -text
+ *.png filter=lfs diff=lfs merge=lfs -text
EXAONE-4.0-1.2B-BF16.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a8fb80177b4d16ac77d3ceb8fd4831058358ad4aee6fadc800e24797eef78b88
+ size 2563248416
EXAONE-4.0-1.2B-IQ4_XS.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7abd23b5482b33458e9c87ee93e153e5cb1c49850c2d0599cfae7f4e80e4c189
+ size 753799456
EXAONE-4.0-1.2B-Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7b5e753540183ae4d56e6febd9b48cdd944de53386e6faa8f51c8f98cb2b47df
+ size 812437792
EXAONE-4.0-1.2B-Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8a7e39590e2686d990e6597f8dc95781aa084eb280c36ddd46d88dd759404972
+ size 929616160
EXAONE-4.0-1.2B-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:296e57a27bcb96a0aa4c7e37d9a90a59ec1c8465b6cce86533ee40a4816c405a
+ size 1054118176
EXAONE-4.0-1.2B-Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cc0b2a3f447e134cafd2853104d06227122cc280f4c9fee8c90172066174ef04
+ size 1363939616
LICENSE ADDED
@@ -0,0 +1,157 @@
1
+ EXAONE AI Model License Agreement 1.2 - NC
2
+
3
+ This License Agreement (“Agreement”) is entered into between you (“Licensee”) and LG Management Development
4
+ Institute Co., Ltd. (“Licensor”), governing the use of the EXAONE AI Model (“Model”). By downloading,
5
+ installing, copying, or using the Model, you agree to comply with and be bound by the terms of this Agreement.
6
+ If you do not agree to all the terms, you must not download, install, copy, or use the Model. This Agreement
7
+ constitutes a binding legal agreement between the Licensee and Licensor.
8
+
9
+ 1. Definitions
10
+ 1.1 Model: The artificial intelligence model provided by Licensor, which includes any software,
11
+ algorithms, machine learning models, or related components supplied by Licensor. This definition extends
12
+ to encompass all updates, enhancements, improvements, bug fixes, patches, or other modifications that may
13
+ be provided by Licensor from time to time, whether automatically or manually implemented.
14
+ 1.2 Derivatives: Any modifications, alterations, enhancements, improvements, adaptations, or derivative
15
+ works of the Model created by Licensee or any third party. This includes changes made to the Model's
16
+ architecture, parameters, data processing methods, or any other aspect of the Model that results in a
17
+ modification of its functionality or output.
18
+ 1.3 Output: Any data, results, content, predictions, analyses, insights, or other materials generated by
19
+ the Model or Derivatives, regardless of whether they are in their original form or have been further
20
+ processed or modified by the Licensee. This includes, but is not limited to, textual or numerical content produced
21
+ directly or indirectly through the use of the Model.
22
+ 1.4 Licensor: LG Management Development Institute Co., Ltd., the owner, developer, and provider of the
23
+ EXAONE AI Model. The Licensor holds all rights, title, and interest in the Model and is responsible for
24
+ granting licenses to use the Model under the terms specified in this Agreement.
25
+ 1.5 Licensee: The individual, organization, corporation, academic institution, government agency, or other
26
+ entity using or intending to use the Model under the terms and conditions of this Agreement. The Licensee
27
+ is responsible for ensuring compliance with the Agreement by all authorized users who access or utilize
28
+ the Model on behalf of the Licensee.
29
+
30
+ 2. License Grant
31
+ 2.1 Grant of License: Subject to the terms and conditions outlined in this Agreement, the Licensor hereby
32
+ grants the Licensee a limited, non-exclusive, non-transferable, worldwide, and revocable license to:
33
+ a. Access, download, install, and use the Model solely for research and educational purposes. This
34
+ includes evaluation, testing, academic research, experimentation, learning, teaching, training and
35
+ participation in competitions, provided that such participation is in a non-commercial context.
36
+ Notwithstanding Section 3.1, the Licensee may only provide the Model or Derivatives for a competition
37
+ if no commercial license is granted to the competition organizer or any third party.
38
+ b. Publicly disclose research results and findings derived from the use of the Model or Derivatives,
39
+ including publishing papers or presentations.
40
+ c. Modify the Model and create Derivatives based on the Model, provided that such modifications and
41
+ Derivatives are used exclusively for research and educational purposes. The Licensee may conduct
42
+ experiments, perform analyses, and apply custom modifications to the Model to explore its capabilities
43
+ and performance under various scenarios. If the Model is modified, the modified Model must include
44
+ "EXAONE" at the beginning of its name.
45
+ d. Distribute the Model and Derivatives in each case with a copy of this Agreement.
46
+ 2.2 Scope of License: The license granted herein does not authorize the Licensee to use the Model for any
47
+ purpose not explicitly permitted under this Agreement. Any use beyond the scope of this license, including
48
+ any commercial application or external distribution, is strictly prohibited unless explicitly agreed upon
49
+ in writing by the Licensor.
50
+
51
+ 3. Restrictions
52
+ 3.1 Commercial Use: The Licensee is expressly prohibited from using the Model, Derivatives, or Output for
53
+ any commercial purposes, including but not limited to, developing or deploying products, services, or
54
+ applications that generate revenue, whether directly or indirectly. Any commercial exploitation of the
55
+ Model or its derivatives requires a separate commercial license agreement with the Licensor. Furthermore,
56
+ the Licensee shall not use the Model, Derivatives or Output to develop or improve any models that compete
57
+ with the Licensor’s models.
58
+ 3.2 Reverse Engineering: The Licensee shall not decompile, disassemble, reverse engineer, or attempt to
59
+ derive the source code, underlying ideas, algorithms, or structure of the Model, except to the extent that
60
+ such activities are expressly permitted by applicable law. Any attempt to bypass or circumvent
61
+ technological protection measures applied to the Model is strictly prohibited.
62
+ 3.3 Unlawful Use: The Licensee shall not use the Model and Derivatives for any illegal, fraudulent, or
63
+ unauthorized activities, nor for any purpose that violates applicable laws or regulations. This includes
64
+ but is not limited to the creation, distribution, or dissemination of malicious, deceptive, or unlawful
65
+ content.
66
+ 3.4 Ethical Use: The Licensee shall ensure that the Model or Derivatives is used in an ethical and
67
+ responsible manner, adhering to the following guidelines:
68
+ a. The Model and Derivatives shall not be used to generate, propagate, or amplify false, misleading,
69
+ or harmful information, including fake news, misinformation, or disinformation.
70
+ b. The Model and Derivatives shall not be employed to create, distribute, or promote content that is
71
+ discriminatory, harassing, defamatory, abusive, or otherwise offensive to individuals or groups based
72
+ on race, gender, sexual orientation, religion, nationality, or other protected characteristics.
73
+ c. The Model and Derivatives shall not infringe on the rights of others, including intellectual property
74
+ rights, privacy rights, or any other rights recognized by law. The Licensee shall obtain all necessary
75
+ permissions and consents before using the Model and Derivatives in a manner that may impact the rights
76
+ of third parties.
77
+ d. The Model and Derivatives shall not be used in a way that causes harm, whether physical, mental,
78
+ emotional, or financial, to individuals, organizations, or communities. The Licensee shall take all
79
+ reasonable measures to prevent misuse or abuse of the Model and Derivatives that could result in harm
80
+ or injury.
81
+
82
+ 4. Ownership
83
+ 4.1 Intellectual Property: All rights, title, and interest in and to the Model, including any
84
+ modifications, Derivatives, and associated documentation, are and shall remain the exclusive property of
85
+ the Licensor. The Licensee acknowledges that this Agreement does not transfer any ownership rights to the
86
+ Licensee. All trademarks, service marks, and logos associated with the Model are the property of the
87
+ Licensor.
88
+ 4.2 Output: Licensor claims no rights in Output. Licensee is solely responsible for the Output and its use.
89
+ 4.3 Attribution: In any publication or presentation of results obtained using the Model, the Licensee
90
+ shall provide appropriate attribution to the Licensor, citing the Model's name and version, along with any
91
+ relevant documentation or references specified by the Licensor.
92
+
93
+ 5. No Warranty
94
+ 5.1 “As-Is” Basis: The Model, Derivatives, and Output are provided on an “as-is” and “as-available” basis,
95
+ without any warranties or representations of any kind, whether express, implied, or statutory. The Licensor
96
+ disclaims all warranties, including but not limited to, implied warranties of merchantability, fitness for
97
+ a particular purpose, accuracy, reliability, non-infringement, or any warranty arising from the course of
98
+ dealing or usage of trade.
99
+ 5.2 Performance and Reliability: The Licensor does not warrant or guarantee that the Model, Derivatives or
100
+ Output will meet the Licensee’s requirements, that the operation of the Model, Derivatives or Output will
101
+ be uninterrupted or error-free, or that defects in the Model will be corrected. The Licensee acknowledges
102
+ that the use of the Model, Derivatives or Output is at its own risk and that the Model, Derivatives or
103
+ Output may contain bugs, errors, or other limitations.
104
+ 5.3 No Endorsement: The Licensor does not endorse, approve, or certify any results, conclusions, or
105
+ recommendations derived from the use of the Model. The Licensee is solely responsible for evaluating the
106
+ accuracy, reliability, and suitability of the Model for its intended purposes.
107
+
108
+ 6. Limitation of Liability
109
+ 6.1 No Liability for Damages: To the fullest extent permitted by applicable law, in no event shall the
110
+ Licensor be liable for any special, incidental, indirect, consequential, exemplary, or punitive damages,
111
+ including but not limited to, damages for loss of business profits, business interruption, loss of business
112
+ information, loss of data, or any other pecuniary or non-pecuniary loss arising out of or in connection with
113
+ the use or inability to use the Model, Derivatives or any Output, even if the Licensor has been advised of
114
+ the possibility of such damages.
115
+ 6.2 Indemnification: The Licensee agrees to indemnify, defend, and hold harmless the Licensor, its
116
+ affiliates, officers, directors, employees, and agents from and against any claims, liabilities, damages,
117
+ losses, costs, or expenses (including reasonable attorneys' fees) arising out of or related to the
118
+ Licensee's use of the Model, any Derivatives, or any Output, including any violation of this Agreement or
119
+ applicable laws.
120
+
121
+ 7. Termination
122
+ 7.1 Termination by Licensor: The Licensor reserves the right to terminate this Agreement and revoke the
123
+ Licensee’s rights to use the Model at any time, with or without cause, and without prior notice if the
124
+ Licensee breaches any of the terms or conditions of this Agreement. Termination shall be effective
125
+ immediately upon notice.
126
+ 7.2 Effect of Termination: Upon termination of this Agreement, the Licensee must immediately cease all use
127
+ of the Model and Derivatives and destroy all copies of the Model and Derivatives in its possession or
128
+ control, including any backup or archival copies. The Licensee shall certify in writing to the Licensor that
129
+ such destruction has been completed.
130
+ 7.3 Survival: The provisions of this Agreement that by their nature should survive termination, including
131
+ but not limited to, Sections 4 (Ownership), 5 (No Warranty), 6 (Limitation of Liability), and this Section 7
132
+ (Termination), shall continue to apply after termination.
133
+
134
+ 8. Governing Law
135
+ 8.1 Governing Law: This Agreement shall be governed by and construed in accordance with the laws of the
136
+ Republic of Korea, without regard to its conflict of laws principles.
137
+ 8.2 Arbitration: Any disputes, controversies, or claims arising out of or relating to this Agreement,
138
+ including its existence, validity, interpretation, performance, breach, or termination, shall be referred
139
+ to and finally resolved by arbitration administered by the Korean Commercial Arbitration Board (KCAB) in
140
+ accordance with the International Arbitration Rules of the Korean Commercial Arbitration Board in force at
141
+ the time of the commencement of the arbitration. The seat of arbitration shall be Seoul, Republic of Korea.
142
+ The tribunal shall consist of one arbitrator. The language of the arbitration shall be English.
143
+
144
+ 9. Alterations
145
+ 9.1 Modifications: The Licensor reserves the right to modify or amend this Agreement at any time, in its
146
+ sole discretion. Any modifications will be effective upon posting the updated Agreement on the Licensor’s
147
+ website or through other means of communication. The Licensee is responsible for reviewing the Agreement
148
+ periodically for changes. Continued use of the Model after any modifications have been made constitutes
149
+ acceptance of the revised Agreement.
150
+ 9.2 Entire Agreement: This Agreement constitutes the entire agreement between the Licensee and Licensor
151
+ concerning the subject matter hereof and supersedes all prior or contemporaneous oral or written agreements,
152
+ representations, or understandings. Any terms or conditions of any purchase order or other document
153
+ submitted by the Licensee in connection with the Model that are in addition to, different from, or
154
+ inconsistent with the terms and conditions of this Agreement are not binding on the Licensor and are void.
155
+
156
+ By downloading, installing, or using the EXAONE AI Model, the Licensee acknowledges that it has read,
157
+ understood, and agrees to be bound by the terms and conditions of this Agreement.
README.md ADDED
@@ -0,0 +1,1095 @@
+ ---
+ base_model: LGAI-EXAONE/EXAONE-4.0-1.2B
+ base_model_relation: quantized
+ license: other
+ license_name: exaone
+ license_link: LICENSE
+ language:
+ - en
+ - ko
+ - es
+ tags:
+ - lg-ai
+ - exaone
+ - exaone-4.0
+ pipeline_tag: text-generation
+ library_name: transformers
+ ---
+
+ <p align="center">
+ <img src="assets/EXAONE_Symbol+BI_3d.png" width="300" style="margin: 40px auto;">
+ <br>
+ 🎉 License Updated! We are pleased to announce our more flexible licensing terms 🤗
+ </p>
+
+ # EXAONE-4.0-1.2B-GGUF
+
+ ## Introduction
+
+ We introduce **EXAONE 4.0**, which integrates a **Non-reasoning mode** and a **Reasoning mode** to achieve both the excellent usability of [EXAONE 3.5](https://github.com/LG-AI-EXAONE/EXAONE-3.5) and the advanced reasoning abilities of [EXAONE Deep](https://github.com/LG-AI-EXAONE/EXAONE-Deep). To pave the way for the agentic AI era, EXAONE 4.0 incorporates essential features such as agentic tool use, and its multilingual capabilities are extended to support Spanish in addition to English and Korean.
+
+ The EXAONE 4.0 model series consists of two sizes: a mid-size **32B** model optimized for high performance, and a small-size **1.2B** model designed for on-device applications.
+
+ The EXAONE 4.0 architecture introduces the following changes compared to previous EXAONE models:
+
+ 1. **Hybrid Attention**: For the 32B model, we adopt a hybrid attention scheme that combines *local attention (sliding window attention)* with *global attention (full attention)* in a 3:1 ratio. We do not use RoPE (Rotary Positional Embedding) for global attention, for better global context understanding.
+ 2. **QK-Reorder-Norm**: We adopt the Post-LN (LayerNorm) scheme for transformer blocks instead of Pre-LN, and we add RMS normalization right after the Q and K projections. This yields better performance on downstream tasks at the cost of additional computation. A rough sketch of this block is shown below.
+
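+ A minimal PyTorch-style sketch of the QK-Reorder-Norm idea follows. The class name, dimensions, and per-head placement of the norms are illustrative assumptions, not the actual EXAONE 4.0 implementation; RoPE and the local/global attention split are omitted for brevity.
+
+ ```python
+ import torch
+ import torch.nn as nn
+
+ class RMSNorm(nn.Module):
+     def __init__(self, dim, eps=1e-6):
+         super().__init__()
+         self.weight = nn.Parameter(torch.ones(dim))
+         self.eps = eps
+
+     def forward(self, x):
+         return self.weight * x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps)
+
+ class QKReorderNormAttention(nn.Module):
+     """Self-attention block with RMSNorm right after the Q/K projections and Post-LN output."""
+     def __init__(self, d_model=2048, n_heads=32):
+         super().__init__()
+         self.n_heads, self.d_head = n_heads, d_model // n_heads
+         self.q_proj = nn.Linear(d_model, d_model, bias=False)
+         self.k_proj = nn.Linear(d_model, d_model, bias=False)
+         self.v_proj = nn.Linear(d_model, d_model, bias=False)
+         self.o_proj = nn.Linear(d_model, d_model, bias=False)
+         self.q_norm = RMSNorm(self.d_head)   # RMS normalization right after the Q projection
+         self.k_norm = RMSNorm(self.d_head)   # ... and right after the K projection
+         self.post_norm = RMSNorm(d_model)    # Post-LN: normalize the residual sum, not the block input
+
+     def forward(self, x):
+         b, t, d = x.shape
+         q = self.q_norm(self.q_proj(x).view(b, t, self.n_heads, self.d_head))
+         k = self.k_norm(self.k_proj(x).view(b, t, self.n_heads, self.d_head))
+         v = self.v_proj(x).view(b, t, self.n_heads, self.d_head)
+         # In the 32B model, 3 out of every 4 layers would use a sliding-window (local) mask here;
+         # this sketch only shows plain causal (global) attention.
+         out = torch.nn.functional.scaled_dot_product_attention(
+             q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2), is_causal=True
+         )
+         out = self.o_proj(out.transpose(1, 2).reshape(b, t, d))
+         return self.post_norm(x + out)
+ ```
+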
+ For more details, please refer to our [technical report](https://www.lgresearch.ai/data/cdn/upload/EXAONE_4_0.pdf), [blog](#), and [GitHub](https://github.com/LG-AI-EXAONE/EXAONE-4.0).
+
+
+ ### Model Configuration
+
+ - Number of Parameters (without embeddings): [[num_params_wo_embeddings]]
+ - Number of Layers: [[num_layers]]
+ - Number of Attention Heads: [[num_heads]]
+ - Vocab Size: 102,400
+ - Context Length: [[context_length]] tokens
+ [[quantization]]
+
+ ## Quickstart
+
+ ### llama.cpp
+ You can run EXAONE models locally using llama.cpp by following these steps:
+
+ 1. Install llama.cpp by cloning our PR branch and building it from source. Please refer to the official documentation on [building from source](https://github.com/ggml-org/llama.cpp/blob/master/docs/build.md).
+
+ ```bash
+ git clone --single-branch -b add-exaone4 https://github.com/lgai-exaone/llama.cpp.git
+ ```
+
+ 2. Download the EXAONE 4.0 model weights in GGUF format.
+
+ ```bash
+ huggingface-cli download LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF \
+     --include "EXAONE-4.0-1.2B-Q4_K_M.gguf" \
+     --local-dir .
+ ```
+
+
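+ If you prefer to stay in Python, the same file can be fetched with `huggingface_hub`; a minimal sketch mirroring step 2 above:
+
+ ```python
+ from huggingface_hub import hf_hub_download
+
+ # Download the Q4_K_M quantization from this repository into the current directory.
+ gguf_path = hf_hub_download(
+     repo_id="LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF",
+     filename="EXAONE-4.0-1.2B-Q4_K_M.gguf",
+     local_dir=".",
+ )
+ print(gguf_path)
+ ```
+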
+ <details>
+ <summary>Generation with `llama-cli`</summary>
+
+ 3. Apply the chat template using `transformers`.
+
+ > This step is necessary to avoid issues with the current EXAONE modeling code in `llama.cpp`. This is a work in progress in our [PR](https://github.com/ggml-org/llama.cpp/pull/14630); we will update this guide once the issues are resolved.
+
+ ```python
+ from transformers import AutoTokenizer
+
+ model_name = "LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF"
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+
+ messages = [
+     {"role": "user", "content": "Let's work together on the local system!"}
+ ]
+ input_text = tokenizer.apply_chat_template(
+     messages,
+     tokenize=False,
+     add_generation_prompt=True,
+ )
+
+ print(repr(input_text))
+ with open("inputs.txt", "w") as f:
+     f.write(input_text)
+ ```
+
+ 4. Generate a response with greedy decoding.
+ ```bash
+ llama-cli -m EXAONE-4.0-1.2B-Q4_K_M.gguf \
+     -fa -ngl 64 \
+     --temp 0.0 --top-k 1 \
+     -f inputs.txt -no-cnv
+ ```
+
+ </details>
+
+ <details>
+ <summary>OpenAI-compatible server with `llama-server`</summary>
+
+ 3. Run `llama-server` with the EXAONE 4.0 Jinja chat template.
+ ```bash
+ llama-server -m EXAONE-4.0-1.2B-Q4_K_M.gguf \
+     -c 65536 -fa -ngl 64 \
+     --temp 0.6 --top-p 0.95 \
+     --jinja --chat-template-file chat_template_simple.jinja \
+     --host 0.0.0.0 --port 8820 \
+     -a EXAONE-4.0-1.2B-Q4_K_M
+ ```
+
+ 4. Use the OpenAI chat completions API to test the GGUF model.
+ > The `llama.cpp` implementation may not be optimized for some use cases, including reasoning mode and agentic tool use.
+
+ ```bash
+ curl -X POST http://localhost:8820/v1/chat/completions \
+     -H "Content-Type: application/json" \
+     -d '{
+         "model": "EXAONE-4.0-1.2B-Q4_K_M",
+         "messages": [
+             {"role": "user", "content": "Let'\''s work together on the server!"}
+         ],
+         "max_tokens": 1024,
+         "temperature": 0.6,
+         "top_p": 0.95
+     }'
+ ```
+
+ </details>
+
+
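+ You can also call the same endpoint from Python. A minimal sketch using the `openai` client package; the port, model alias, and sampling values simply mirror the `llama-server` example above:
+
+ ```python
+ from openai import OpenAI
+
+ # llama-server exposes an OpenAI-compatible API; the API key is required by the client but unused.
+ client = OpenAI(base_url="http://localhost:8820/v1", api_key="none")
+
+ response = client.chat.completions.create(
+     model="EXAONE-4.0-1.2B-Q4_K_M",
+     messages=[{"role": "user", "content": "Let's work together on the server!"}],
+     max_tokens=1024,
+     temperature=0.6,
+     top_p=0.95,
+ )
+ print(response.choices[0].message.content)
+ ```
+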
+
+ ## Performance
+
+ The following tables show the evaluation results of each model in reasoning and non-reasoning modes. Evaluation details can be found in the [technical report](https://www.lgresearch.ai/data/cdn/upload/EXAONE_4_0.pdf).
+
+ - ✅ denotes that the model has hybrid reasoning capability, evaluated by selecting reasoning or non-reasoning mode depending on the purpose.
+ - The evaluation results are based on the original models, not the quantized models.
+
148
+ ### 32B Reasoning Mode
149
+
150
+ <table>
151
+ <tr>
152
+ <th> </th>
153
+ <th>EXAONE 4.0 32B </th>
154
+ <th>Phi 4 reasoning-plus</th>
155
+ <th>Magistral Small-2506</th>
156
+ <th>Qwen 3 32B </th>
157
+ <th>Qwen 3 235B </th>
158
+ <th>DeepSeek R1-0528</th>
159
+ </tr>
160
+ <tr>
161
+ <td align="center">Model Size</td>
162
+ <td align="center">32.0B</td>
163
+ <td align="center">14.7B</td>
164
+ <td align="center">23.6B</td>
165
+ <td align="center">32.8B</td>
166
+ <td align="center">235B</td>
167
+ <td align="center">671B</td>
168
+ </tr>
169
+ <tr>
170
+ <td align="center">Hybrid Reasoning</td>
171
+ <td align="center">✅</td>
172
+ <td align="center"> </td>
173
+ <td align="center"> </td>
174
+ <td align="center">✅</td>
175
+ <td align="center">✅</td>
176
+ <td align="center"> </td>
177
+ </tr>
178
+ <tr>
179
+ <td align="center" colspan='7'><i>World Knowledge</i></td>
180
+ </tr>
181
+ <tr>
182
+ <td >MMLU-Redux</td>
183
+ <td align="center">92.3</td>
184
+ <td align="center">90.8</td>
185
+ <td align="center">86.8</td>
186
+ <td align="center">90.9</td>
187
+ <td align="center">92.7</td>
188
+ <td align="center">93.4</td>
189
+ </tr>
190
+ <tr>
191
+ <td >MMLU-Pro</td>
192
+ <td align="center">81.8</td>
193
+ <td align="center">76.0</td>
194
+ <td align="center">73.4</td>
195
+ <td align="center">80.0</td>
196
+ <td align="center">83.0</td>
197
+ <td align="center">85.0</td>
198
+ </tr>
199
+ <tr>
200
+ <td >GPQA-Diamond</td>
201
+ <td align="center">75.4</td>
202
+ <td align="center">68.9</td>
203
+ <td align="center">68.2</td>
204
+ <td align="center">68.4</td>
205
+ <td align="center">71.1</td>
206
+ <td align="center">81.0</td>
207
+ </tr>
208
+ <tr>
209
+ <td align="center" colspan='7'><i>Math/Coding</i></td>
210
+ </tr>
211
+ <tr>
212
+ <td >AIME 2025</td>
213
+ <td align="center">85.3</td>
214
+ <td align="center">78.0</td>
215
+ <td align="center">62.8</td>
216
+ <td align="center">72.9</td>
217
+ <td align="center">81.5</td>
218
+ <td align="center">87.5</td>
219
+ </tr>
220
+ <tr>
221
+ <td >HMMT Feb 2025</td>
222
+ <td align="center">72.9</td>
223
+ <td align="center">53.6</td>
224
+ <td align="center">43.5</td>
225
+ <td align="center">50.4</td>
226
+ <td align="center">62.5</td>
227
+ <td align="center">79.4</td>
228
+ </tr>
229
+ <tr>
230
+ <td >LiveCodeBench v5</td>
231
+ <td align="center">72.6</td>
232
+ <td align="center">51.7</td>
233
+ <td align="center">55.8</td>
234
+ <td align="center">65.7</td>
235
+ <td align="center">70.7</td>
236
+ <td align="center">75.2</td>
237
+ </tr>
238
+ <tr>
239
+ <td >LiveCodeBench v6</td>
240
+ <td align="center">66.7</td>
241
+ <td align="center">47.1</td>
242
+ <td align="center">47.4</td>
243
+ <td align="center">60.1</td>
244
+ <td align="center">58.9</td>
245
+ <td align="center">70.3</td>
246
+ </tr>
247
+ <tr>
248
+ <td align="center" colspan='7'><i>Instruction Following</i></td>
249
+ </tr>
250
+ <tr>
251
+ <td >IFEval</td>
252
+ <td align="center">83.7</td>
253
+ <td align="center">84.9</td>
254
+ <td align="center">37.9</td>
255
+ <td align="center">85.0</td>
256
+ <td align="center">83.4</td>
257
+ <td align="center">80.8</td>
258
+ </tr>
259
+ <tr>
260
+ <td >Multi-IF (EN)</td>
261
+ <td align="center">73.5</td>
262
+ <td align="center">56.1</td>
263
+ <td align="center">27.4</td>
264
+ <td align="center">73.4</td>
265
+ <td align="center">73.4</td>
266
+ <td align="center">72.0</td>
267
+ </tr>
268
+ <tr>
269
+ <td align="center" colspan='7'><i>Agentic Tool Use</i></td>
270
+ </tr>
271
+ <tr>
272
+ <td >BFCL-v3</td>
273
+ <td align="center">63.9</td>
274
+ <td align="center">N/A</td>
275
+ <td align="center">40.4</td>
276
+ <td align="center">70.3</td>
277
+ <td align="center">70.8</td>
278
+ <td align="center">64.7</td>
279
+ </tr>
280
+ <tr>
281
+ <td >Tau-bench (Airline)</td>
282
+ <td align="center">51.5</td>
283
+ <td align="center">N/A</td>
284
+ <td align="center">38.5</td>
285
+ <td align="center">34.5</td>
286
+ <td align="center">37.5</td>
287
+ <td align="center">53.5</td>
288
+ </tr>
289
+ <tr>
290
+ <td >Tau-bench (Retail)</td>
291
+ <td align="center">62.8</td>
292
+ <td align="center">N/A</td>
293
+ <td align="center">10.2</td>
294
+ <td align="center">55.2</td>
295
+ <td align="center">58.3</td>
296
+ <td align="center">63.9</td>
297
+ </tr>
298
+ <tr>
299
+ <td align="center" colspan='7'><i>Multilinguality</i></td>
300
+ </tr>
301
+ <tr>
302
+ <td >KMMLU-Pro</td>
303
+ <td align="center">67.7</td>
304
+ <td align="center">55.8</td>
305
+ <td align="center">51.5</td>
306
+ <td align="center">61.4</td>
307
+ <td align="center">68.1</td>
308
+ <td align="center">71.7</td>
309
+ </tr>
310
+ <tr>
311
+ <td >KMMLU-Redux</td>
312
+ <td align="center">72.7</td>
313
+ <td align="center">62.7</td>
314
+ <td align="center">54.6</td>
315
+ <td align="center">67.5</td>
316
+ <td align="center">74.5</td>
317
+ <td align="center">77.0</td>
318
+ </tr>
319
+ <tr>
320
+ <td >KSM</td>
321
+ <td align="center">87.6</td>
322
+ <td align="center">79.8</td>
323
+ <td align="center">71.9</td>
324
+ <td align="center">82.8</td>
325
+ <td align="center">86.2</td>
326
+ <td align="center">86.7</td>
327
+ </tr>
328
+ <tr>
329
+ <td >MMMLU (ES)</td>
330
+ <td align="center">85.6</td>
331
+ <td align="center">84.3</td>
332
+ <td align="center">68.9</td>
333
+ <td align="center">82.8</td>
334
+ <td align="center">86.7</td>
335
+ <td align="center">88.2</td>
336
+ </tr>
337
+ <tr>
338
+ <td >MATH500 (ES)</td>
339
+ <td align="center">95.8</td>
340
+ <td align="center">94.2</td>
341
+ <td align="center">83.5</td>
342
+ <td align="center">94.3</td>
343
+ <td align="center">95.1</td>
344
+ <td align="center">96.0</td>
345
+ </tr>
346
+ </table>
347
+
348
+ ### 32B Non-Reasoning Mode
349
+
350
+ <table>
351
+ <tr>
352
+ <th> </th>
353
+ <th>EXAONE 4.0 32B </th>
354
+ <th>Phi 4</th>
355
+ <th>Mistral-Small-2506</th>
356
+ <th>Gemma 3 27B</th>
357
+ <th>Qwen3 32B </th>
358
+ <th>Qwen3 235B </th>
359
+ <th>Llama-4-Maverick</th>
360
+ <th>DeepSeek V3-0324</th>
361
+ </tr>
362
+ <tr>
363
+ <td align="center">Model Size</td>
364
+ <td align="center">32.0B</td>
365
+ <td align="center">14.7B</td>
366
+ <td align="center">24.0B</td>
367
+ <td align="center">27.4B</td>
368
+ <td align="center">32.8B</td>
369
+ <td align="center">235B</td>
370
+ <td align="center">402B</td>
371
+ <td align="center">671B</td>
372
+ </tr>
373
+ <tr>
374
+ <td align="center">Hybrid Reasoning</td>
375
+ <td align="center">✅</td>
376
+ <td align="center"> </td>
377
+ <td align="center"> </td>
378
+ <td align="center"> </td>
379
+ <td align="center">✅</td>
380
+ <td align="center">✅</td>
381
+ <td align="center"> </td>
382
+ <td align="center"> </td>
383
+ </tr>
384
+ <tr>
385
+ <td align="center" colspan='9'><i>World Knowledge</i></td>
386
+ </tr>
387
+ <tr>
388
+ <td >MMLU-Redux</td>
389
+ <td align="center">89.8</td>
390
+ <td align="center">88.3</td>
391
+ <td align="center">85.9</td>
392
+ <td align="center">85.0</td>
393
+ <td align="center">85.7</td>
394
+ <td align="center">89.2</td>
395
+ <td align="center">92.3</td>
396
+ <td align="center">92.3</td>
397
+ </tr>
398
+ <tr>
399
+ <td >MMLU-Pro</td>
400
+ <td align="center">77.6</td>
401
+ <td align="center">70.4</td>
402
+ <td align="center">69.1</td>
403
+ <td align="center">67.5</td>
404
+ <td align="center">74.4</td>
405
+ <td align="center">77.4</td>
406
+ <td align="center">80.5</td>
407
+ <td align="center">81.2</td>
408
+ </tr>
409
+ <tr>
410
+ <td >GPQA-Diamond</td>
411
+ <td align="center">63.7</td>
412
+ <td align="center">56.1</td>
413
+ <td align="center">46.1</td>
414
+ <td align="center">42.4</td>
415
+ <td align="center">54.6</td>
416
+ <td align="center">62.9</td>
417
+ <td align="center">69.8</td>
418
+ <td align="center">68.4</td>
419
+ </tr>
420
+ <tr>
421
+ <td align="center" colspan='9'><i>Math/Coding</i></td>
422
+ </tr>
423
+ <tr>
424
+ <td >AIME 2025</td>
425
+ <td align="center">35.9</td>
426
+ <td align="center">17.8</td>
427
+ <td align="center">30.2</td>
428
+ <td align="center">23.8</td>
429
+ <td align="center">20.2</td>
430
+ <td align="center">24.7</td>
431
+ <td align="center">18.0</td>
432
+ <td align="center">50.0</td>
433
+ </tr>
434
+ <tr>
435
+ <td >HMMT Feb 2025</td>
436
+ <td align="center">21.8</td>
437
+ <td align="center">4.0</td>
438
+ <td align="center">16.9</td>
439
+ <td align="center">10.3</td>
440
+ <td align="center">9.8</td>
441
+ <td align="center">11.9</td>
442
+ <td align="center">7.3</td>
443
+ <td align="center">29.2</td>
444
+ </tr>
445
+ <tr>
446
+ <td >LiveCodeBench v5</td>
447
+ <td align="center">43.3</td>
448
+ <td align="center">24.6</td>
449
+ <td align="center">25.8</td>
450
+ <td align="center">27.5</td>
451
+ <td align="center">31.3</td>
452
+ <td align="center">35.3</td>
453
+ <td align="center">43.4</td>
454
+ <td align="center">46.7</td>
455
+ </tr>
456
+ <tr>
457
+ <td >LiveCodeBench v6</td>
458
+ <td align="center">43.1</td>
459
+ <td align="center">27.4</td>
460
+ <td align="center">26.9</td>
461
+ <td align="center">29.7</td>
462
+ <td align="center">28.0</td>
463
+ <td align="center">31.4</td>
464
+ <td align="center">32.7</td>
465
+ <td align="center">44.0</td>
466
+ </tr>
467
+ <tr>
468
+ <td align="center" colspan='9'><i>Instruction Following</i></td>
469
+ </tr>
470
+ <tr>
471
+ <td >IFEval</td>
472
+ <td align="center">84.8</td>
473
+ <td align="center">63.0</td>
474
+ <td align="center">77.8</td>
475
+ <td align="center">82.6</td>
476
+ <td align="center">83.2</td>
477
+ <td align="center">83.2</td>
478
+ <td align="center">85.4</td>
479
+ <td align="center">81.2</td>
480
+ </tr>
481
+ <tr>
482
+ <td >Multi-IF (EN)</td>
483
+ <td align="center">71.6</td>
484
+ <td align="center">47.7</td>
485
+ <td align="center">63.2</td>
486
+ <td align="center">72.1</td>
487
+ <td align="center">71.9</td>
488
+ <td align="center">72.5</td>
489
+ <td align="center">77.9</td>
490
+ <td align="center">68.3</td>
491
+ </tr>
492
+ <tr>
493
+ <td align="center" colspan='9'><i>Long Context</i></td>
494
+ </tr>
495
+ <tr>
496
+ <td >HELMET</td>
497
+ <td align="center">58.3</td>
498
+ <td align="center">N/A</td>
499
+ <td align="center">61.9</td>
500
+ <td align="center">58.3</td>
501
+ <td align="center">54.5</td>
502
+ <td align="center">63.3</td>
503
+ <td align="center">13.7</td>
504
+ <td align="center">N/A</td>
505
+ </tr>
506
+ <tr>
507
+ <td >RULER</td>
508
+ <td align="center">88.2</td>
509
+ <td align="center">N/A</td>
510
+ <td align="center">71.8</td>
511
+ <td align="center">66.0</td>
512
+ <td align="center">85.6</td>
513
+ <td align="center">90.6</td>
514
+ <td align="center">2.9</td>
515
+ <td align="center">N/A</td>
516
+ </tr>
517
+ <tr>
518
+ <td >LongBench v1</td>
519
+ <td align="center">48.1</td>
520
+ <td align="center">N/A</td>
521
+ <td align="center">51.5</td>
522
+ <td align="center">51.5</td>
523
+ <td align="center">44.2</td>
524
+ <td align="center">45.3</td>
525
+ <td align="center">34.7</td>
526
+ <td align="center">N/A</td>
527
+ </tr>
528
+ <tr>
529
+ <td align="center" colspan='9'><i>Agentic Tool Use</i></td>
530
+ </tr>
531
+ <tr>
532
+ <td >BFCL-v3</td>
533
+ <td align="center">65.2</td>
534
+ <td align="center">N/A</td>
535
+ <td align="center">57.7</td>
536
+ <td align="center">N/A</td>
537
+ <td align="center">63.0</td>
538
+ <td align="center">68.0</td>
539
+ <td align="center">52.9</td>
540
+ <td align="center">63.8</td>
541
+ </tr>
542
+ <tr>
543
+ <td >Tau-Bench (Airline)</td>
544
+ <td align="center">25.5</td>
545
+ <td align="center">N/A</td>
546
+ <td align="center">36.1</td>
547
+ <td align="center">N/A</td>
548
+ <td align="center">16.0</td>
549
+ <td align="center">27.0</td>
550
+ <td align="center">38.0</td>
551
+ <td align="center">40.5</td>
552
+ </tr>
553
+ <tr>
554
+ <td >Tau-Bench (Retail)</td>
555
+ <td align="center">55.9</td>
556
+ <td align="center">N/A</td>
557
+ <td align="center">35.5</td>
558
+ <td align="center">N/A</td>
559
+ <td align="center">47.6</td>
560
+ <td align="center">56.5</td>
561
+ <td align="center">6.5</td>
562
+ <td align="center">68.5</td>
563
+ </tr>
564
+ <tr>
565
+ <td align="center" colspan='9'><i>Multilinguality</i></td>
566
+ </tr>
567
+ <tr>
568
+ <td >KMMLU-Pro</td>
569
+ <td align="center">60.0</td>
570
+ <td align="center">44.8</td>
571
+ <td align="center">51.0</td>
572
+ <td align="center">50.7</td>
573
+ <td align="center">58.3</td>
574
+ <td align="center">64.4</td>
575
+ <td align="center">68.8</td>
576
+ <td align="center">67.3</td>
577
+ </tr>
578
+ <tr>
579
+ <td >KMMLU-Redux</td>
580
+ <td align="center">64.8</td>
581
+ <td align="center">50.1</td>
582
+ <td align="center">53.6</td>
583
+ <td align="center">53.3</td>
584
+ <td align="center">64.4</td>
585
+ <td align="center">71.7</td>
586
+ <td align="center">76.9</td>
587
+ <td align="center">72.2</td>
588
+ </tr>
589
+ <tr>
590
+ <td >KSM</td>
591
+ <td align="center">59.8</td>
592
+ <td align="center">29.1</td>
593
+ <td align="center">35.5</td>
594
+ <td align="center">36.1</td>
595
+ <td align="center">41.3</td>
596
+ <td align="center">46.6</td>
597
+ <td align="center">40.6</td>
598
+ <td align="center">63.5</td>
599
+ </tr>
600
+ <tr>
601
+ <td >Ko-LongBench</td>
602
+ <td align="center">76.9</td>
603
+ <td align="center">N/A</td>
604
+ <td align="center">55.4</td>
605
+ <td align="center">72.0</td>
606
+ <td align="center">73.9</td>
607
+ <td align="center">74.6</td>
608
+ <td align="center">65.6</td>
609
+ <td align="center">N/A</td>
610
+ </tr>
611
+ <tr>
612
+ <td >MMMLU (ES)</td>
613
+ <td align="center">80.6</td>
614
+ <td align="center">81.2</td>
615
+ <td align="center">78.4</td>
616
+ <td align="center">78.7</td>
617
+ <td align="center">82.1</td>
618
+ <td align="center">83.7</td>
619
+ <td align="center">86.9</td>
620
+ <td align="center">86.7</td>
621
+ </tr>
622
+ <tr>
623
+ <td >MATH500 (ES)</td>
624
+ <td align="center">87.3</td>
625
+ <td align="center">78.2</td>
626
+ <td align="center">83.4</td>
627
+ <td align="center">86.8</td>
628
+ <td align="center">84.7</td>
629
+ <td align="center">87.2</td>
630
+ <td align="center">78.7</td>
631
+ <td align="center">89.2</td>
632
+ </tr>
633
+ <tr>
634
+ <td >WMT24++ (ES)</td>
635
+ <td align="center">90.7</td>
636
+ <td align="center">89.3</td>
637
+ <td align="center">92.2</td>
638
+ <td align="center">93.1</td>
639
+ <td align="center">91.4</td>
640
+ <td align="center">92.9</td>
641
+ <td align="center">92.7</td>
642
+ <td align="center">94.3 </td>
643
+ </tr>
644
+ </table>
645
+
646
+ ### 1.2B Reasoning Mode
647
+
648
+ <table>
649
+ <tr>
650
+ <th> </th>
651
+ <th>EXAONE 4.0 1.2B </th>
652
+ <th>EXAONE Deep 2.4B</th>
653
+ <th>Qwen 3 0.6B </th>
654
+ <th>Qwen 3 1.7B </th>
655
+ <th>SmolLM3 3B </th>
656
+ </tr>
657
+ <tr>
658
+ <td align="center">Model Size</td>
659
+ <td align="center">1.28B</td>
660
+ <td align="center">2.41B</td>
661
+ <td align="center">596M</td>
662
+ <td align="center">1.72B</td>
663
+ <td align="center">3.08B</td>
664
+ </tr>
665
+ <tr>
666
+ <td align="center">Hybrid Reasoning</td>
667
+ <td align="center">✅</td>
668
+ <td align="center"> </td>
669
+ <td align="center">✅</td>
670
+ <td align="center">✅</td>
671
+ <td align="center">✅</td>
672
+ </tr>
673
+ <tr>
674
+ <td align="center" colspan='6'><i>World Knowledge</i></td>
675
+ </tr>
676
+ <tr>
677
+ <td >MMLU-Redux</td>
678
+ <td align="center">71.5</td>
679
+ <td align="center">68.9</td>
680
+ <td align="center">55.6</td>
681
+ <td align="center">73.9</td>
682
+ <td align="center">74.8</td>
683
+ </tr>
684
+ <tr>
685
+ <td >MMLU-Pro</td>
686
+ <td align="center">59.3</td>
687
+ <td align="center">56.4</td>
688
+ <td align="center">38.3</td>
689
+ <td align="center">57.7</td>
690
+ <td align="center">57.8</td>
691
+ </tr>
692
+ <tr>
693
+ <td >GPQA-Diamond</td>
694
+ <td align="center">52.0</td>
695
+ <td align="center">54.3</td>
696
+ <td align="center">27.9</td>
697
+ <td align="center">40.1</td>
698
+ <td align="center">41.7</td>
699
+ </tr>
700
+ <tr>
701
+ <td align="center" colspan='6'><i>Math/Coding</i></td>
702
+ </tr>
703
+ <tr>
704
+ <td >AIME 2025</td>
705
+ <td align="center">45.2</td>
706
+ <td align="center">47.9</td>
707
+ <td align="center">15.1</td>
708
+ <td align="center">36.8</td>
709
+ <td align="center">36.7</td>
710
+ </tr>
711
+ <tr>
712
+ <td >HMMT Feb 2025</td>
713
+ <td align="center">34.0</td>
714
+ <td align="center">27.3</td>
715
+ <td align="center">7.0</td>
716
+ <td align="center">21.8</td>
717
+ <td align="center">26.0</td>
718
+ </tr>
719
+ <tr>
720
+ <td >LiveCodeBench v5</td>
721
+ <td align="center">44.6</td>
722
+ <td align="center">47.2</td>
723
+ <td align="center">12.3</td>
724
+ <td align="center">33.2</td>
725
+ <td align="center">27.6</td>
726
+ </tr>
727
+ <tr>
728
+ <td >LiveCodeBench v6</td>
729
+ <td align="center">45.3</td>
730
+ <td align="center">43.1</td>
731
+ <td align="center">16.4</td>
732
+ <td align="center">29.9</td>
733
+ <td align="center">29.1</td>
734
+ </tr>
735
+ <tr>
736
+ <td align="center" colspan='6'><i>Instruction Following</i></td>
737
+ </tr>
738
+ <tr>
739
+ <td >IFEval</td>
740
+ <td align="center">67.8</td>
741
+ <td align="center">71.0</td>
742
+ <td align="center">59.2</td>
743
+ <td align="center">72.5</td>
744
+ <td align="center">71.2</td>
745
+ </tr>
746
+ <tr>
747
+ <td >Multi-IF (EN)</td>
748
+ <td align="center">53.9</td>
749
+ <td align="center">54.5</td>
750
+ <td align="center">37.5</td>
751
+ <td align="center">53.5</td>
752
+ <td align="center">47.5</td>
753
+ </tr>
754
+ <tr>
755
+ <td align="center" colspan='6'><i>Agentic Tool Use</i></td>
756
+ </tr>
757
+ <tr>
758
+ <td >BFCL-v3</td>
759
+ <td align="center">52.9</td>
760
+ <td align="center">N/A</td>
761
+ <td align="center">46.4</td>
762
+ <td align="center">56.6</td>
763
+ <td align="center">37.1</td>
764
+ </tr>
765
+ <tr>
766
+ <td >Tau-Bench (Airline)</td>
767
+ <td align="center">20.5</td>
768
+ <td align="center">N/A</td>
769
+ <td align="center">22.0</td>
770
+ <td align="center">31.0</td>
771
+ <td align="center">37.0</td>
772
+ </tr>
773
+ <tr>
774
+ <td >Tau-Bench (Retail)</td>
775
+ <td align="center">28.1</td>
776
+ <td align="center">N/A</td>
777
+ <td align="center">3.3</td>
778
+ <td align="center">6.5</td>
779
+ <td align="center">5.4</td>
780
+ </tr>
781
+ <tr>
782
+ <td align="center" colspan='6'><i>Multilinguality</i></td>
783
+ </tr>
784
+ <tr>
785
+ <td >KMMLU-Pro</td>
786
+ <td align="center">42.7</td>
787
+ <td align="center">24.6</td>
788
+ <td align="center">21.6</td>
789
+ <td align="center">38.3</td>
790
+ <td align="center">30.5</td>
791
+ </tr>
792
+ <tr>
793
+ <td >KMMLU-Redux</td>
794
+ <td align="center">46.9</td>
795
+ <td align="center">25.0</td>
796
+ <td align="center">24.5</td>
797
+ <td align="center">38.0</td>
798
+ <td align="center">33.7</td>
799
+ </tr>
800
+ <tr>
801
+ <td >KSM</td>
802
+ <td align="center">60.6</td>
803
+ <td align="center">60.9</td>
804
+ <td align="center">22.8</td>
805
+ <td align="center">52.9</td>
806
+ <td align="center">49.7</td>
807
+ </tr>
808
+ <tr>
809
+ <td >MMMLU (ES)</td>
810
+ <td align="center">62.4</td>
811
+ <td align="center">51.4</td>
812
+ <td align="center">48.8</td>
813
+ <td align="center">64.5</td>
814
+ <td align="center">64.7</td>
815
+ </tr>
816
+ <tr>
817
+ <td >MATH500 (ES)</td>
818
+ <td align="center">88.8</td>
819
+ <td align="center">84.5</td>
820
+ <td align="center">70.6</td>
821
+ <td align="center">87.9</td>
822
+ <td align="center">87.5 </td>
823
+ </tr>
824
+ </table>
825
+
826
+ ### 1.2B Non-Reasoning Mode
827
+
828
+ <table>
829
+ <tr>
830
+ <th> </th>
831
+ <th>EXAONE 4.0 1.2B </th>
832
+ <th>Qwen 3 0.6B </th>
833
+ <th>Gemma 3 1B</th>
834
+ <th>Qwen 3 1.7B </th>
835
+ <th>SmolLM3 3B </th>
836
+ </tr>
837
+ <tr>
838
+ <td align="center">Model Size</td>
839
+ <td align="center">1.28B</td>
840
+ <td align="center">596M</td>
841
+ <td align="center">1.00B</td>
842
+ <td align="center">1.72B</td>
843
+ <td align="center">3.08B</td>
844
+ </tr>
845
+ <tr>
846
+ <td align="center">Hybrid Reasoning</td>
847
+ <td align="center">✅</td>
848
+ <td align="center">✅</td>
849
+ <td align="center"> </td>
850
+ <td align="center">✅</td>
851
+ <td align="center">✅</td>
852
+ </tr>
853
+ <tr>
854
+ <td align="center" colspan='6'><i>World Knowledge</i></td>
855
+ </tr>
856
+ <tr>
857
+ <td >MMLU-Redux</td>
858
+ <td align="center">66.9</td>
859
+ <td align="center">44.6</td>
860
+ <td align="center">40.9</td>
861
+ <td align="center">63.4</td>
862
+ <td align="center">65.0</td>
863
+ </tr>
864
+ <tr>
865
+ <td >MMLU-Pro</td>
866
+ <td align="center">52.0</td>
867
+ <td align="center">26.6</td>
868
+ <td align="center">14.7</td>
869
+ <td align="center">43.7</td>
870
+ <td align="center">43.6</td>
871
+ </tr>
872
+ <tr>
873
+ <td >GPQA-Diamond</td>
874
+ <td align="center">40.1</td>
875
+ <td align="center">22.9</td>
876
+ <td align="center">19.2</td>
877
+ <td align="center">28.6</td>
878
+ <td align="center">35.7</td>
879
+ </tr>
880
+ <tr>
881
+ <td align="center" colspan='6'><i>Math/Coding</i></td>
882
+ </tr>
883
+ <tr>
884
+ <td >AIME 2025</td>
885
+ <td align="center">23.5</td>
886
+ <td align="center">2.6</td>
887
+ <td align="center">2.1</td>
888
+ <td align="center">9.8</td>
889
+ <td align="center">9.3</td>
890
+ </tr>
891
+ <tr>
892
+ <td >HMMT Feb 2025</td>
893
+ <td align="center">13.0</td>
894
+ <td align="center">1.0</td>
895
+ <td align="center">1.5</td>
896
+ <td align="center">5.1</td>
897
+ <td align="center">4.7</td>
898
+ </tr>
899
+ <tr>
900
+ <td >LiveCodeBench v5</td>
901
+ <td align="center">26.4</td>
902
+ <td align="center">3.6</td>
903
+ <td align="center">1.8</td>
904
+ <td align="center">11.6</td>
905
+ <td align="center">11.4</td>
906
+ </tr>
907
+ <tr>
908
+ <td >LiveCodeBench v6</td>
909
+ <td align="center">30.1</td>
910
+ <td align="center">6.9</td>
911
+ <td align="center">2.3</td>
912
+ <td align="center">16.6</td>
913
+ <td align="center">20.6</td>
914
+ </tr>
915
+ <tr>
916
+ <td align="center" colspan='6'><i>Instruction Following</i></td>
917
+ </tr>
918
+ <tr>
919
+ <td >IFEval</td>
920
+ <td align="center">74.7</td>
921
+ <td align="center">54.5</td>
922
+ <td align="center">80.2</td>
923
+ <td align="center">68.2</td>
924
+ <td align="center">76.7</td>
925
+ </tr>
926
+ <tr>
927
+ <td >Multi-IF (EN)</td>
928
+ <td align="center">62.1</td>
929
+ <td align="center">37.5</td>
930
+ <td align="center">32.5</td>
931
+ <td align="center">51.0</td>
932
+ <td align="center">51.9</td>
933
+ </tr>
934
+ <tr>
935
+ <td align="center" colspan='6'><i>Long Context</i></td>
936
+ </tr>
937
+ <tr>
938
+ <td >HELMET</td>
939
+ <td align="center">41.2</td>
940
+ <td align="center">21.1</td>
941
+ <td align="center">N/A</td>
942
+ <td align="center">33.8</td>
943
+ <td align="center">38.6</td>
944
+ </tr>
945
+ <tr>
946
+ <td >RULER</td>
947
+ <td align="center">77.4</td>
948
+ <td align="center">55.1</td>
949
+ <td align="center">N/A</td>
950
+ <td align="center">65.9</td>
951
+ <td align="center">66.3</td>
952
+ </tr>
953
+ <tr>
954
+ <td >LongBench v1</td>
955
+ <td align="center">36.9</td>
956
+ <td align="center">32.4</td>
957
+ <td align="center">N/A</td>
958
+ <td align="center">41.9</td>
959
+ <td align="center">39.9</td>
960
+ </tr>
961
+ <tr>
962
+ <td align="center" colspan='6'><i>Agentic Tool Use</i></td>
963
+ </tr>
964
+ <tr>
965
+ <td >BFCL-v3</td>
966
+ <td align="center">55.7</td>
967
+ <td align="center">44.1</td>
968
+ <td align="center">N/A</td>
969
+ <td align="center">52.2</td>
970
+ <td align="center">47.3</td>
971
+ </tr>
972
+ <tr>
973
+ <td >Tau-Bench (Airline)</td>
974
+ <td align="center">10.0</td>
975
+ <td align="center">31.5</td>
976
+ <td align="center">N/A</td>
977
+ <td align="center">13.5</td>
978
+ <td align="center">38.0</td>
979
+ </tr>
980
+ <tr>
981
+ <td >Tau-Bench (Retail)</td>
982
+ <td align="center">21.7</td>
983
+ <td align="center">5.7</td>
984
+ <td align="center">N/A</td>
985
+ <td align="center">4.6</td>
986
+ <td align="center">6.7</td>
987
+ </tr>
988
+ <tr>
989
+ <td align="center" colspan='6'><i>Multilinguality</i></td>
990
+ </tr>
991
+ <tr>
992
+ <td >KMMLU-Pro</td>
993
+ <td align="center">37.5</td>
994
+ <td align="center">24.6</td>
995
+ <td align="center">9.7</td>
996
+ <td align="center">29.5</td>
997
+ <td align="center">27.6</td>
998
+ </tr>
999
+ <tr>
1000
+ <td >KMMLU-Redux</td>
1001
+ <td align="center">40.4</td>
1002
+ <td align="center">22.8</td>
1003
+ <td align="center">19.4</td>
1004
+ <td align="center">29.8</td>
1005
+ <td align="center">26.4</td>
1006
+ </tr>
1007
+ <tr>
1008
+ <td >KSM</td>
1009
+ <td align="center">26.3</td>
1010
+ <td align="center">0.1</td>
1011
+ <td align="center">22.8</td>
1012
+ <td align="center">16.3</td>
1013
+ <td align="center">16.1</td>
1014
+ </tr>
1015
+ <tr>
1016
+ <td >Ko-LongBench</td>
1017
+ <td align="center">69.8</td>
1018
+ <td align="center">16.4</td>
1019
+ <td align="center">N/A</td>
1020
+ <td align="center">57.1</td>
1021
+ <td align="center">15.7</td>
1022
+ </tr>
1023
+ <tr>
1024
+ <td >MMMLU (ES)</td>
1025
+ <td align="center">54.6</td>
1026
+ <td align="center">39.5</td>
1027
+ <td align="center">35.9</td>
1028
+ <td align="center">54.3</td>
1029
+ <td align="center">55.1</td>
1030
+ </tr>
1031
+ <tr>
1032
+ <td >MATH500 (ES)</td>
1033
+ <td align="center">71.2</td>
1034
+ <td align="center">38.5</td>
1035
+ <td align="center">41.2</td>
1036
+ <td align="center">66.0</td>
1037
+ <td align="center">62.4</td>
1038
+ </tr>
1039
+ <tr>
1040
+ <td >WMT24++ (ES)</td>
1041
+ <td align="center">65.9</td>
1042
+ <td align="center">58.2</td>
1043
+ <td align="center">76.9</td>
1044
+ <td align="center">76.7</td>
1045
+ <td align="center">84.0 </td>
1046
+ </tr>
1047
+ </table>
+
+
+
+ ## Usage Guideline
+
+ > [!IMPORTANT]
+ > To achieve the expected performance, we recommend using the following configurations:
+ >
+ > - For non-reasoning mode, we recommend using a lower temperature value, such as `temperature<0.6`, for better performance.
+ > - For reasoning mode (using the `<think>` block), we recommend using `temperature=0.6` and `top_p=0.95`, as illustrated in the sketch below.
+ > - If you observe model degeneration (e.g., repetitive output), we recommend using `presence_penalty=1.5`.
+ > - For general Korean conversation with the 1.2B model, we suggest using `temperature=0.1` to avoid code-switching.
+
+
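+ As a concrete illustration, here is a small helper that encodes the guideline above as request parameters. The helper name and the specific non-reasoning temperature of 0.3 are arbitrary examples; pass the result to the OpenAI-compatible `llama-server` endpoint shown in the Quickstart, or to any other backend.
+
+ ```python
+ def sampling_params(reasoning: bool, korean_chat_1_2b: bool = False, degenerating: bool = False) -> dict:
+     """Return sampling parameters following the usage guideline above."""
+     if korean_chat_1_2b:
+         # General Korean conversation with the 1.2B model: low temperature to avoid code-switching.
+         params = {"temperature": 0.1}
+     elif reasoning:
+         # Reasoning mode (<think> block).
+         params = {"temperature": 0.6, "top_p": 0.95}
+     else:
+         # Non-reasoning mode: any temperature below 0.6 (0.3 is an arbitrary choice).
+         params = {"temperature": 0.3}
+     if degenerating:
+         # Counteract repetitive or degenerate output.
+         params["presence_penalty"] = 1.5
+     return params
+
+ # Parameters for a reasoning-mode request whose output has started to repeat itself:
+ print(sampling_params(reasoning=True, degenerating=True))
+ ```
+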
+ ## Limitation
+
+ The EXAONE language model has certain limitations and may occasionally generate inappropriate responses. The model generates responses based on output token probabilities, which are learned from the training data. While we have made every effort to exclude personal, harmful, and biased information from the training data, some problematic content may still be included, potentially leading to undesirable responses. Please note that text generated by the EXAONE language model does not reflect the views of LG AI Research.
+
+ - Inappropriate answers may be generated that contain personal, harmful, or other inappropriate information.
+ - Biased responses may be generated that are associated with age, gender, race, and so on.
+ - The generated responses rely heavily on statistics from the training data, which can result in semantically or syntactically incorrect sentences.
+ - Since the model does not reflect the latest information, its responses may be false or contradictory.
+
+ LG AI Research strives to reduce potential risks that may arise from EXAONE language models. Users may not engage in any malicious activities (e.g., entering illegal information) that induce the creation of inappropriate outputs violating LG AI's ethical principles when using EXAONE language models.
+
+
+ ## License
+
+ The model is licensed under the [EXAONE AI Model License Agreement 1.2 - NC](./LICENSE).
+
+ > [!NOTE]
+ > The main differences from the previous version are as follows:
+ > - We removed **the claim of model output ownership** from the license.
+ > - We restrict use of the model **for the development of models that compete with EXAONE**.
+ > - We allow the model to be used for **educational purposes**, not just research.
+
+
+ ## Citation
+
+ TBD
+
+
+ ## Contact
+
+ LG AI Research Technical Support: [email protected]
assets/EXAONE_Symbol+BI_3d.png ADDED

Git LFS Details

  • SHA256: c473c63768e9303c02a4f968fd2e7d41df3f669fedc6a7b51c4398cfcd7f23e4
  • Pointer size: 131 Bytes
  • Size of remote file: 249 kB