Uploaded model
- Developed by: Onuii
- License: apache-2.0
- Finetuned from model: Onuii/DAMI-base-merge-0408
Training arguments:

- `--run_name "DAMI-lora-sft-digging-0409"`
- `--model_name "Onuii/DAMI-base-merge-0408"`
- `--dataset_name "Onuii/DAMI-lecture-DB-Dataset-0408-digging"`
- `--max_seq_length 8192`
- `--dtype "bfloat16"`
- `--load_in_4bit False`
- `--r 16`
- `--lora_alpha 32`
- `--lora_dropout 0.1`
- `--use_rslora True`
- `--batch_size 16`
- `--gradient_accumulation_steps 2`
- `--learning_rate 1e-5`
- `--warmup_ratio 0.05`
- `--num_train_epochs 2`
- `--optimizer "adamw_8bit"`
- `--weight_decay 0.01`
- `--lr_scheduler_type "linear"`
- `--output_dir "../outputs"`
- `--logging_dir "../logs"`
- `--logging_steps 1`
- `--save_steps 200`
- `--save_total_limit 2`
- `--report_to "wandb"`
- `--use_gradient_checkpointing False`
- `--save_strategy "steps"`
- `--random_seed 562`
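For reference, here is a minimal sketch of how the arguments above could map onto an Unsloth + TRL SFT run. This is not the exact training script: the dataset text column, the LoRA target modules, and the older-style `SFTTrainer` keywords (`dataset_text_field`, `max_seq_length`) are assumptions.

```python
# Minimal sketch (not the exact training script) of how the arguments above
# map onto an Unsloth + TRL SFT run.
import torch
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Onuii/DAMI-base-merge-0408",   # --model_name
    max_seq_length=8192,                       # --max_seq_length
    dtype=torch.bfloat16,                      # --dtype "bfloat16"
    load_in_4bit=False,                        # --load_in_4bit False
)

model = FastLanguageModel.get_peft_model(
    model,
    r=16,                                      # --r
    lora_alpha=32,                             # --lora_alpha
    lora_dropout=0.1,                          # --lora_dropout
    use_rslora=True,                           # --use_rslora
    use_gradient_checkpointing=False,          # --use_gradient_checkpointing
    random_state=562,                          # --random_seed
    target_modules=[                           # assumed; not listed on the card
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
)

dataset = load_dataset("Onuii/DAMI-lecture-DB-Dataset-0408-digging", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",                 # assumed column name
    max_seq_length=8192,
    args=TrainingArguments(
        run_name="DAMI-lora-sft-digging-0409",
        output_dir="../outputs",
        per_device_train_batch_size=16,        # --batch_size
        gradient_accumulation_steps=2,
        learning_rate=1e-5,
        warmup_ratio=0.05,
        num_train_epochs=2,
        optim="adamw_8bit",                    # --optimizer
        weight_decay=0.01,
        lr_scheduler_type="linear",
        logging_dir="../logs",
        logging_steps=1,
        save_strategy="steps",
        save_steps=200,
        save_total_limit=2,
        report_to="wandb",
        seed=562,
        bf16=True,
    ),
)

trainer.train()
```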
This Llama-architecture model was trained 2x faster with Unsloth and Hugging Face's TRL library.
Model tree for Onuii/kanana-cpt-qlora-sft-lora-digging-250409:

- Base model: kakaocorp/kanana-nano-2.1b-base
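If this repository hosts LoRA adapter weights rather than a merged checkpoint (an assumption based on the name), it can be loaded on top of the model it was fine-tuned from roughly as follows:

```python
# Hedged inference sketch, assuming this repo contains PEFT LoRA adapter weights
# that apply on top of Onuii/DAMI-base-merge-0408.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "Onuii/DAMI-base-merge-0408", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("Onuii/DAMI-base-merge-0408")
model = PeftModel.from_pretrained(base, "Onuii/kanana-cpt-qlora-sft-lora-digging-250409")

prompt = "..."  # replace with an actual prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```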