How to use chansung/coding_llamaduo_result1 with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("google/gemma-7b")
model = PeftModel.from_pretrained(base_model, "chansung/coding_llamaduo_result1")
```

This model is a fine-tuned version of google/gemma-7b on the chansung/merged_ds_coding dataset. It achieves the following results on the evaluation set:
- Loss: 1.1871 (final validation loss)
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.5404 | 0.99 | 36 | 1.5048 |
| 0.9147 | 2.0 | 73 | 1.2327 |
| 0.7658 | 2.99 | 109 | 1.1766 |
| 0.6657 | 4.0 | 146 | 1.1664 |
| 0.5601 | 4.93 | 180 | 1.1871 |
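A quick sanity check on the table above (values transcribed from it): the lowest validation loss occurs at epoch 4.0, after which it ticks back up, hinting at mild overfitting in the final epoch.

```python
# Validation loss per logged epoch, transcribed from the training-results table
val_loss = {0.99: 1.5048, 2.0: 1.2327, 2.99: 1.1766, 4.0: 1.1664, 4.93: 1.1871}

# Find the epoch with the lowest validation loss
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # 4.0 1.1664
```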
Base model: google/gemma-7b