DavidAU committed · verified
Commit d58f204 · Parent: 1ca73ac

Update README.md

Files changed (1):
1. README.md +2 -2
README.md CHANGED
@@ -51,7 +51,7 @@ pipeline_tag: text-generation
 
 ( Uploading, Quants pending, 4 examples added )
 
-<h2>Qwen3-23B-A3B-The-Harley-Quinn</h2>
+<h2>Qwen3-22B-A3B-The-Harley-Quinn</h2>
 
 <img src="qwen3-harley-quinn-23b.webp" style="float:right; width:300px; height:300px; padding:10px;">
 
@@ -62,7 +62,7 @@ ABOUT:
 
 A stranger, yet radically different version of Kalmaze's "Qwen/Qwen3-16B-A3B" with the
 experts pruned to 64 (from 128, the Qwen 3 30B-A3B version) and then I added 19 layers expanding (Brainstorm 20x by DavidAU
-info at bottom of this page) the model to 23B total parameters.
+info at bottom of this page) the model to 22B total parameters.
 
 The goal: slightly alter the model, to address some odd creative thinking and output choices.
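
To make the "experts pruned to 64 (from 128)" step in the diff above concrete, here is a minimal sketch of what such a prune could look like for a Qwen3-30B-A3B-style MoE using Hugging Face transformers. The module and attribute names (`model.model.layers`, `mlp.experts`, `mlp.gate`, `num_experts`) follow the transformers Qwen3-MoE implementation, and keeping the first 64 experts is a placeholder choice for illustration only; this is not the procedure used for Kalmaze's 16B prune, and the Brainstorm 20x layer expansion to ~22B parameters is a separate step not shown here.

```python
# Illustrative sketch only: prune a 128-expert Qwen3 MoE down to 64 experts.
# Module/attribute names assume the transformers Qwen3-MoE implementation;
# which experts to keep (here: the first 64) is a placeholder choice.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-30B-A3B", torch_dtype="auto")

keep = list(range(64))  # placeholder: keep the first 64 of 128 experts

for layer in model.model.layers:
    moe = layer.mlp
    if not hasattr(moe, "experts"):
        continue  # skip any dense (non-MoE) layers
    # Drop the pruned experts from the expert list.
    moe.experts = torch.nn.ModuleList(moe.experts[i] for i in keep)
    # Shrink the router so it only scores the surviving experts.
    new_gate = torch.nn.Linear(moe.gate.in_features, len(keep), bias=False)
    new_gate.weight.data.copy_(moe.gate.weight.data[keep])
    moe.gate = new_gate

model.config.num_experts = len(keep)
model.save_pretrained("qwen3-moe-64-experts")
```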