---
base_model:
- IntervitensInc/Mistral-Nemo-Base-2407-chatml
library_name: transformers
tags:
- mergekit
- merge
---
# GodSlayer-12B-ABYSS

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the NuSLERP merge method, with [IntervitensInc/Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml) as the base model.

### Models Merged

The following models were included in the merge:
* /home/redrix/Documents/LLM-Merging/p1
* /home/redrix/Documents/LLM-Merging/p2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /home/redrix/Documents/LLM-Merging/p1
    parameters:
      weight: 0.5
  - model: /home/redrix/Documents/LLM-Merging/p2
    parameters:
      weight: 0.5
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
merge_method: nuslerp
dtype: bfloat16
chat_template: "chatml"
tokenizer:
  source: union
parameters:
  normalize: true
  int8_mask: true
```
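To give a sense of what the `nuslerp` method does per tensor, here is a minimal NumPy sketch of spherical linear interpolation applied to task vectors (the deltas of each model from the base), with the interpolation factor derived from the normalized `weight` values in the config above. This is an illustrative toy, not mergekit's actual implementation; the function names and the flattened-vector treatment are simplifying assumptions.

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    Interpolates along the arc between a and b rather than the straight
    line, which preserves vector magnitude better than plain averaging.
    """
    a_unit = a / (np.linalg.norm(a) + eps)
    b_unit = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    theta = np.arccos(dot)          # angle between the two tensors
    if theta < eps:                 # nearly parallel: fall back to lerp
        return (1.0 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

def nuslerp_with_base(weights, a, b, base):
    """Slerp the task vectors (deltas from base), then re-add the base.

    With weights [0.5, 0.5] as in the config, t = 0.5 (a symmetric merge).
    """
    t = weights[1] / (weights[0] + weights[1])
    return base + slerp(t, a - base, b - base)
```

With equal weights and a zero base, this reduces to plain slerp at the midpoint of the arc: for orthogonal unit vectors `[1, 0]` and `[0, 1]` the result is `[√2/2, √2/2]`, a unit vector, whereas linear averaging would shrink it to norm `√2/2`.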