# glm-e1

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the passthrough merge method, with THUDM/GLM-4-32B-0414 + /alloc/axolotl/data/32b-lora-out/checkpoint-718 (the base checkpoint with a LoRA adapter applied on top) as the base.

### Models Merged

No models other than the base (with its LoRA adapter) were included in the merge.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: THUDM/GLM-4-32B-0414+/alloc/axolotl/data/32b-lora-out/checkpoint-718
dtype: bfloat16
merge_method: passthrough
models:
  - model: THUDM/GLM-4-32B-0414+/alloc/axolotl/data/32b-lora-out/checkpoint-718
```
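The `+` in the model reference above is mergekit's syntax for layering a LoRA adapter checkpoint onto a base model before merging. A minimal sketch of how such a reference splits into its two parts (plain string handling; the paths are taken from the config above):

```python
# mergekit "model+path" syntax: text before "+" is the base model repo id,
# text after "+" is a LoRA adapter path applied on top of it.
ref = "THUDM/GLM-4-32B-0414+/alloc/axolotl/data/32b-lora-out/checkpoint-718"
base, _, lora = ref.partition("+")
print(base)  # THUDM/GLM-4-32B-0414
print(lora)  # /alloc/axolotl/data/32b-lora-out/checkpoint-718
```

With mergekit installed, a config like the one above can typically be run with `mergekit-yaml config.yaml ./merged-out` (the config filename and output directory here are assumptions, not part of this card).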