Quick start

```python
from ctransformers import AutoModelForCausalLM

# Load the GGUF model on CPU (set gpu_layers > 0 to offload layers to a GPU).
model = AutoModelForCausalLM.from_pretrained(
    "npvinHnivqn/GGUF-metamath-llemma",
    model_file="metamath-llemma.gguf",
    model_type="llama",
    gpu_layers=0,
    context_length=768,
)

# Generate and print a completion for the prompt.
print(model("AI will ", temperature=0.1))
```
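
ctransformers can also stream tokens as they are generated, which is handy for long completions. Below is a minimal sketch assuming the same model load as above; the prompt is purely illustrative, and `stream` and `max_new_tokens` are standard ctransformers generation parameters.

```python
from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "npvinHnivqn/GGUF-metamath-llemma",
    model_file="metamath-llemma.gguf",
    model_type="llama",
    gpu_layers=0,
    context_length=768,
)

# Stream tokens one at a time instead of waiting for the full output.
for token in model(
    "Prove that the sum of two even numbers is even.",  # illustrative prompt
    stream=True,
    max_new_tokens=256,
    temperature=0.1,
):
    print(token, end="", flush=True)
```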
Model details

Format: GGUF
Model size: 6.74B params
Architecture: llama