DuongTrongChi/qwen-dpo-backup
Safetensors · qwen2
main · qwen-dpo-backup/config.json

Commit History
Upload folder using huggingface_hub
e7fb8e7 · verified
DuongTrongChi committed on May 5