---
datasets:
- Open-Orca/OpenOrca
library_name: transformers
tags:
- llama
---
# Basilisk 4B

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)

Built on `winglian/llama-2-4b`, a 4B-parameter Llama-2 model, this model is fine-tuned on chain-of-thought (CoT) data from the OpenOrca dataset.

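As a minimal, untested sketch, the model can presumably be loaded through the standard `transformers` causal-LM API; the prompt is illustrative, and `trust_remote_code=True` mirrors the flag used in the evaluation run reported below:

```python
# Minimal usage sketch with the Hugging Face transformers API.
# The prompt is illustrative; trust_remote_code=True mirrors the flag
# used in the evaluation run reported below.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "winglian/basilisk-4b"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer(
    "Question: What is 17 * 23? Let's think step by step.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
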
## Benchmark results

```
hf-causal-experimental (pretrained=winglian/basilisk-4b,use_accelerate=True,trust_remote_code=True), limit: None, provide_description: False, num_fewshot: 0, batch_size: None
| Task |Version| Metric |Value | |Stderr|
|------------------------------------------------|------:|---------------------|-----:|---|-----:|
|agieval_aqua_rat | 0|acc |0.2362|± |0.0267|
| | |acc_norm |0.2283|± |0.0264|
|agieval_logiqa_en | 0|acc |0.2688|± |0.0174|
| | |acc_norm |0.2811|± |0.0176|
|agieval_lsat_ar | 0|acc |0.2130|± |0.0271|
| | |acc_norm |0.1913|± |0.0260|
|agieval_lsat_lr | 0|acc |0.2255|± |0.0185|
| | |acc_norm |0.2745|± |0.0198|
|agieval_lsat_rc | 0|acc |0.2305|± |0.0257|
| | |acc_norm |0.2491|± |0.0264|
|agieval_sat_en | 0|acc |0.3641|± |0.0336|
| | |acc_norm |0.3495|± |0.0333|
|agieval_sat_en_without_passage | 0|acc |0.2427|± |0.0299|
| | |acc_norm |0.2427|± |0.0299|
|agieval_sat_math | 0|acc |0.2318|± |0.0285|
| | |acc_norm |0.2091|± |0.0275|
|bigbench_causal_judgement | 0|multiple_choice_grade|0.5000|± |0.0364|
|bigbench_date_understanding | 0|multiple_choice_grade|0.3930|± |0.0255|
|bigbench_disambiguation_qa | 0|multiple_choice_grade|0.2674|± |0.0276|
|bigbench_geometric_shapes | 0|multiple_choice_grade|0.1838|± |0.0205|
| | |exact_str_match |0.0279|± |0.0087|
|bigbench_logical_deduction_five_objects | 0|multiple_choice_grade|0.2380|± |0.0191|
|bigbench_logical_deduction_seven_objects | 0|multiple_choice_grade|0.1843|± |0.0147|
|bigbench_logical_deduction_three_objects | 0|multiple_choice_grade|0.3800|± |0.0281|
|bigbench_movie_recommendation | 0|multiple_choice_grade|0.3480|± |0.0213|
|bigbench_navigate | 0|multiple_choice_grade|0.5000|± |0.0158|
|bigbench_reasoning_about_colored_objects | 0|multiple_choice_grade|0.3680|± |0.0108|
|bigbench_ruin_names | 0|multiple_choice_grade|0.2746|± |0.0211|
|bigbench_salient_translation_error_detection | 0|multiple_choice_grade|0.2806|± |0.0142|
|bigbench_snarks | 0|multiple_choice_grade|0.4972|± |0.0373|
|bigbench_sports_understanding | 0|multiple_choice_grade|0.4939|± |0.0159|
|bigbench_temporal_sequences | 0|multiple_choice_grade|0.2740|± |0.0141|
|bigbench_tracking_shuffled_objects_five_objects | 0|multiple_choice_grade|0.1904|± |0.0111|
|bigbench_tracking_shuffled_objects_seven_objects| 0|multiple_choice_grade|0.1394|± |0.0083|
|bigbench_tracking_shuffled_objects_three_objects| 0|multiple_choice_grade|0.3800|± |0.0281|
```
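
The header line above records the EleutherAI lm-evaluation-harness settings used for the run. As a sketch of how a comparable run could be reproduced, assuming the pre-0.4 (v0.3.x-era) harness, which is where the `hf-causal-experimental` model type lives (task list abbreviated here):

```python
# Sketch of reproducing the zero-shot evaluation above with the
# EleutherAI lm-evaluation-harness Python API. Assumes the pre-0.4
# (v0.3.x-era) harness; the task list is abbreviated for brevity.
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf-causal-experimental",
    model_args="pretrained=winglian/basilisk-4b,use_accelerate=True,trust_remote_code=True",
    tasks=["agieval_aqua_rat", "agieval_logiqa_en", "bigbench_causal_judgement"],
    num_fewshot=0,  # matches num_fewshot: 0 in the header above
)
# Render results in the same Markdown-style table format as above.
print(evaluator.make_table(results))
```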