lhoestq (HF Staff) committed
Commit 20325df · verified · 1 Parent(s): f565e5e

Add 'middle_school_biology' config data files

README.md CHANGED
@@ -282,6 +282,14 @@ configs:
     path: metrology_engineer/val-*
   - split: dev
     path: metrology_engineer/dev-*
+- config_name: middle_school_biology
+  data_files:
+  - split: test
+    path: middle_school_biology/test-*
+  - split: val
+    path: middle_school_biology/val-*
+  - split: dev
+    path: middle_school_biology/dev-*
 dataset_info:
 - config_name: accountant
   features:
@@ -1303,6 +1311,36 @@ dataset_info:
       num_examples: 5
     download_size: 55033
     dataset_size: 56085
+- config_name: middle_school_biology
+  features:
+  - name: id
+    dtype: int32
+  - name: question
+    dtype: string
+  - name: A
+    dtype: string
+  - name: B
+    dtype: string
+  - name: C
+    dtype: string
+  - name: D
+    dtype: string
+  - name: answer
+    dtype: string
+  - name: explanation
+    dtype: string
+  splits:
+  - name: test
+    num_bytes: 47264
+    num_examples: 192
+  - name: val
+    num_bytes: 5263
+    num_examples: 21
+  - name: dev
+    num_bytes: 4327
+    num_examples: 5
+  download_size: 58872
+  dataset_size: 56854
 ---

 C-Eval is a comprehensive Chinese evaluation suite for foundation models. It consists of 13948 multi-choice questions spanning 52 diverse disciplines and four difficulty levels. Please visit our [website](https://cevalbenchmark.com/) and [GitHub](https://github.com/SJTU-LIT/ceval/tree/main) or check our [paper](https://arxiv.org/abs/2305.08322) for more details.
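
With this commit the new config can be selected by name through the datasets library. A minimal sketch, assuming the dataset's hub id is ceval/ceval-exam (the repo id does not appear in this commit):

from datasets import load_dataset

# Hub repo id is an assumption; it is not shown in the diff above.
ds = load_dataset("ceval/ceval-exam", "middle_school_biology")

# Split sizes should match the dataset_info metadata: test=192, val=21, dev=5.
print({split: ds[split].num_rows for split in ds})

# Each row carries the declared features: id, question, A, B, C, D, answer, explanation.
print(ds["dev"][0])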
middle_school_biology/dev-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ccfa8a7260f4bd6cdf96dfdbda574ffcf2fc0939afaa95d6aa0f29ce198664df
+size 10893
middle_school_biology/test-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae832369316dc1ada2df37c65db34d14ce953176bd1b62406890e8cc6388ff00
+size 39033
middle_school_biology/val-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a2e835dc5d6d410ca5aa9786360cfaab12aa0a1d7ed806d9c9964625c3c42f85
+size 8946
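
All three parquet files are stored through Git LFS, so the repository itself only records a small pointer per file: the LFS spec version, the object's sha256 oid, and its size in bytes. A minimal sketch of checking a downloaded file against its pointer, with the oid and size copied from the dev-split pointer above and an illustrative local path:

import hashlib
from pathlib import Path

def matches_lfs_pointer(local_path: str, expected_oid: str, expected_size: int) -> bool:
    # True if the file's byte size and sha256 digest match the LFS pointer values.
    data = Path(local_path).read_bytes()
    return len(data) == expected_size and hashlib.sha256(data).hexdigest() == expected_oid

# Path assumes the file was already fetched locally; values come from the dev pointer above.
print(matches_lfs_pointer(
    "middle_school_biology/dev-00000-of-00001.parquet",
    "ccfa8a7260f4bd6cdf96dfdbda574ffcf2fc0939afaa95d6aa0f29ce198664df",
    10893,
))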