JayHyeon/gemma-BDPO_2e-7-1ep_0alp_0.5bdpo_lam_0dpop_lam
Branch: main
1 contributor
History: 1 commit
initial commit · JayHyeon · 635c1fc (verified) · about 1 month ago
Files:
.gitattributes · 1.52 kB · initial commit · about 1 month ago