"batch_size": 1,
"fine_tune_type": "lora",
"grad_checkpoint": true,
"iters": 1000,
"learning_rate": 2e-05,
"lora_parameters": {
"rank": 8,
"dropout": 0.0,
"scale": 20.0
"test_loss": 1.017,
"Test ppl": 2.764