The Mixtral config was mixed in, so I removed it, because llama.cpp's convert script failed on it: `"num_local_experts": 8, "num_experts_per_tok": 2`.
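For reference, a sketch of the Mixtral-specific MoE keys as they would appear in `config.json` (surrounding fields vary by model; only the two keys named above are from this thread):

```json
{
  "num_local_experts": 8,
  "num_experts_per_tok": 2
}
```

Deleting these entries leaves a plain (non-MoE) config that the converter can process.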
@mmnga Done! Thank you for spotting that!