neural-chat-7B-v3-3-AWQ / quant_config.json
{
  "zero_point": true,
  "q_group_size": 128,
  "w_bit": 4,
  "version": "GEMM",
  "modules_to_not_convert": []
}
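
A minimal sketch of how these settings could be loaded and mapped onto transformers' AwqConfig, assuming the file is saved locally as quant_config.json and that your transformers version exposes AwqConfig with these keyword arguments (the mapping of keys is an assumption based on the field names above, not something stated in this repo):

```python
import json

from transformers import AwqConfig  # assumes a transformers version with AWQ support

# Hypothetical local path; in this repo the file sits next to the model weights.
with open("quant_config.json") as f:
    cfg = json.load(f)

# Field-for-field mapping: w_bit -> bits, q_group_size -> group_size.
awq_config = AwqConfig(
    bits=cfg["w_bit"],                        # 4-bit quantized weights
    group_size=cfg["q_group_size"],           # 128 weights share one scale/zero-point group
    zero_point=cfg["zero_point"],             # asymmetric quantization with zero points
    version=cfg["version"].lower(),           # "gemm" kernel variant
    modules_to_not_convert=cfg["modules_to_not_convert"] or None,  # [] -> quantize all linear layers
)
print(awq_config)
```

In practice the loading tooling reads this configuration itself when the checkpoint is loaded, so a manual translation like the one above is mainly useful for inspecting or porting the settings by hand.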