merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606

This model was created by merging multilingual language models using the TIES method.

Model Details

  • Method: TIES
  • Language: fa (Persian)
  • Parameters: 8.03B (safetensors, BF16)
  • Base Models:
    • Language Model: unknown
    • Knowledge Model: unknown
  • Created: 2025-06-07

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged checkpoint and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("NTIS/merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606")
tokenizer = AutoTokenizer.from_pretrained("NTIS/merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606")
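
A minimal generation sketch follows. It assumes the tokenizer ships the Llama-3-style chat template inherited from the instruct base model; the Persian prompt and the sampling parameters are illustrative only, not recommended settings.

import torch

# Build a chat-formatted prompt (the Persian example prompt is illustrative only)
messages = [{"role": "user", "content": "سلام! لطفاً خودت را معرفی کن."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

# Generate a response (sampling parameters are placeholders)
with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))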

Training Details

This model was created using model merging, combining the language capabilities of specialized multilingual models with general knowledge models via the TIES method; a conceptual sketch of the merge follows.
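
The sketch below illustrates the TIES procedure for a single parameter tensor (trim, elect sign, disjoint merge). It is illustrative only: the actual merge was produced with an unspecified toolkit, and the function name, density value, and arguments here are hypothetical. Applied per tensor across both models' state dicts, it captures the high-level idea behind the merge.

import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor], density: float = 0.2) -> torch.Tensor:
    # Task vectors: difference between each fine-tuned model and the shared base
    deltas = [ft - base for ft in finetuned]

    # Trim: keep only the top-`density` fraction of entries by magnitude in each task vector
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    # Elect sign: per-entry sign of the magnitude-weighted sum across task vectors
    stacked = torch.stack(trimmed)
    elected = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only the entries whose sign agrees with the elected sign
    agree = (torch.sign(stacked) == elected) & (stacked != 0)
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    # Add the merged task vector back onto the base weights
    return base + merged_delta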

Limitations

  • This model inherits limitations from its base models
  • Performance may vary across different languages and tasks
  • Requires evaluation for specific use cases

Citation

@misc{merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606,
  title={Multilingual Model Merging: merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606},
  year={2025},
  url={https://huggingface.co/NTIS/merged_fa_ties_Dorna_Llama3_8B_Instruct_v0_20250606}
}