RedMist137/DPO-Zephyr-7B
Tags: Safetensors · mistral · trl · dpo · Generated from Trainer
DPO-Zephyr-7B / merges.txt
Commit History
Training in progress, step 100 (85cbd5e, verified) · RedMist137 committed on Oct 17, 2024