Add conversational tag (1) · #11 opened 5 months ago by celinah
llama.cpp doesn't support this model, how can I convert safetensors model to bin and load in ollama · #10 opened 5 months ago by shuminzhou26803586
Update chat_template.json to incorporate `generation` tag · #9 opened 7 months ago by zjysteven
RuntimeError: Could not infer dtype of numpy.float32 when converting to PyTorch tensor (1) · #8 opened 7 months ago by Koshti10
shape mismatch error during inference with finetuned Model (6) · #7 opened 9 months ago by mdmev
Why no chat template like non-chatty has? · #5 opened 10 months ago by pseudotensor
How to merge an adapter to the base model (1) · #4 opened 10 months ago by alielfilali01
How to deploy on inference endpoints? (2) · #3 opened 10 months ago by brianjking
Update README.md · #2 opened 10 months ago by Alexander70
[Question] question about hyperparameter (1) · #1 opened 10 months ago by Lala-chick