mpt_7b_chat-dense_quant_linearW8A8MatMul8Embeds8LMhead8

As the name indicates, this is a dense (unpruned) MPT-7B-Chat model with 8-bit (W8A8) quantization applied to the linear layers, MatMuls, embeddings, and LM head, packaged for inference with DeepSparse.

Run the model with the DeepSparse text-generation pipeline:

```python
import deepsparse
from huggingface_hub import snapshot_download

# Download the quantized model files from the Hugging Face Hub.
MODEL_PATH = snapshot_download(repo_id="mgoin/mpt-7b-chat-quant")

# Build a DeepSparse text-generation pipeline from the downloaded model.
model = deepsparse.Pipeline.create(task="text-generation", model_path=MODEL_PATH)

# Generate a response to a prompt.
model(sequences="Tell me a joke.")
```
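The pipeline call returns a structured output object rather than a plain string. A minimal sketch of capturing and inspecting it, continuing from the snippet above, is shown below; the exact field that holds the generated text varies across deepsparse releases, so printing the whole object is the safest way to see the result.

```python
# Continuing from the snippet above: capture the pipeline output instead of
# discarding it. The output schema differs between deepsparse versions, so
# print the whole object to inspect the generated text.
output = model(sequences="Tell me a joke.")
print(output)
```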