xk-huang committed · verified
Commit edba92b · 1 Parent(s): f21ad87

add head_dim

Fixes a crash under vLLM 0.8.5:
```
self.q_size = self.num_heads * self.head_dim
TypeError: unsupported operand type(s) for *: 'int' and 'NoneType'
```

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
```diff
@@ -14,7 +14,7 @@
     "MistralForCausalLM"
   ],
   "attention_dropout": 0.0,
-  "head_dim": null,
+  "head_dim": 128,
   "hidden_act": "silu",
   "hidden_size": 4096,
   "initializer_range": 0.02,
```
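For context, the traceback comes from vLLM multiplying `num_heads` by `head_dim` when building the attention layer; a JSON `"head_dim": null` deserializes to Python `None`, so the multiplication raises `TypeError`. A minimal sketch of the failure and the fixed value, assuming Mistral-7B-style dimensions (`num_attention_heads = 32` is an assumption here; it is not shown in the diff):

```python
# Sketch of the bug: "head_dim": null in config.json loads as None,
# and vLLM's q_size = num_heads * head_dim then fails.
hidden_size = 4096
num_attention_heads = 32  # assumption, typical for Mistral-7B; not in the diff

head_dim = None  # what "head_dim": null becomes after JSON parsing
try:
    q_size = num_attention_heads * head_dim  # mirrors vLLM's q_size computation
except TypeError as e:
    print(e)  # unsupported operand type(s) for *: 'int' and 'NoneType'

# The fix: set head_dim explicitly to hidden_size / num_attention_heads.
head_dim = hidden_size // num_attention_heads
print(head_dim)  # 128, matching the value added in config.json
```

Setting `head_dim` explicitly in `config.json` sidesteps the issue for loaders that do not fall back to deriving it from `hidden_size` and `num_attention_heads`.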