---
base_model:
- anthracite-org/magnum-12b-v2
- unsloth/Mistral-Nemo-Instruct-2407
- IntervitensInc/Mistral-Nemo-Base-2407-chatml
library_name: transformers
tags:
- mergekit
- merge
license: cc-by-nc-4.0
---
# o8

This model was meant as a component model for a larger merge but is usable as is.

ChatML preferred; works with Mistral Instruct, and kinda works with everything else.

Tested using 0.75 temperature, 0.10 MinP, and 0.30 Smoothing Factor.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [IntervitensInc/Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml) as the base.

### Models Merged

The following models were included in the merge:
* [anthracite-org/magnum-12b-v2](https://huggingface.co/anthracite-org/magnum-12b-v2)
* [unsloth/Mistral-Nemo-Instruct-2407](https://huggingface.co/unsloth/Mistral-Nemo-Instruct-2407)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
  - model: unsloth/Mistral-Nemo-Instruct-2407
    parameters:
      weight: [0.50, 0.20]
      density: [0.75, 0.55]
  - model: anthracite-org/magnum-12b-v2
    parameters:
      weight: [0.50, 0.80]
      density: [0.75, 0.85]
merge_method: ties
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
parameters:
  int8_mask: true
  rescale: true
  normalize: false
dtype: bfloat16
```
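To reproduce the merge, the config above can be saved to a file and run with mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./o8`). Below is a minimal inference sketch, not part of the original card: it loads the merged model with transformers, applies the ChatML template the base model ships, and uses the tested sampling settings where transformers supports them. The `model_id` is a hypothetical placeholder, and Smoothing Factor is a sampler from frontends like text-generation-webui/koboldcpp with no stock transformers equivalent, so it is omitted here.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "o8"  # hypothetical: replace with the local path or Hub repo id of this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge was produced in bfloat16
    device_map="auto",
)

# The ChatML-tokenized base means apply_chat_template yields
# <|im_start|>/<|im_end|> formatted prompts.
messages = [{"role": "user", "content": "Write a short scene set on a night train."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.75,  # tested value from the card
    min_p=0.10,        # tested value from the card; requires transformers >= 4.39
)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```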