# fallen-glimmer-27b

This model is the result of an experimental multi-step SLERP merge:

## glimmer-27b (intermediate merge)

```yaml
models:
  - model: /opt/workspace/hf/gemma-3-27b-it-txtonly
  - model: /opt/workspace/hf/Gemma-3-Glitter-27B-txtonly
merge_method: slerp
base_model: /opt/workspace/hf/gemma-3-27b-it-txtonly
parameters:
  t: 0.33333
dtype: bfloat16
```
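For intuition, here is a minimal NumPy sketch of what SLERP does to one pair of matching weight tensors: instead of averaging along a straight line, it interpolates along the arc between the two (flattened) weight vectors. The tensor names and shapes below are made up for illustration; the actual merge is performed by mergekit over the full checkpoints.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two matching weight tensors."""
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the two weight vectors (directions only).
    dot = np.clip(np.dot(a / (np.linalg.norm(a) + eps),
                         b / (np.linalg.norm(b) + eps)), -1.0, 1.0)
    if abs(dot) > 0.9995:
        # Nearly colinear: SLERP degenerates to plain linear interpolation.
        out = (1.0 - t) * a + t * b
    else:
        theta = np.arccos(dot)
        out = (np.sin((1.0 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
    return out.reshape(v0.shape).astype(v0.dtype)

# Toy stand-ins for one matching weight matrix from each model.
base_tensor = np.random.randn(4, 8).astype(np.float32)     # gemma-3-27b-it
glitter_tensor = np.random.randn(4, 8).astype(np.float32)  # Gemma-3-Glitter-27B
# t = 0.33333: the result sits about a third of the way along the arc
# from the base (gemma-3-27b-it) toward Glitter.
merged = slerp(0.33333, base_tensor, glitter_tensor)
```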

## fallen-glimmer-27b (this model)

```yaml
models:
  - model: /opt/workspace/hf/glimmer-27b
  - model: /opt/workspace/hf/Fallen-Gemma3-27B-v1-txtonly
merge_method: slerp
base_model: /opt/workspace/hf/glimmer-27b
parameters:
  t:
    - filter: embed_tokens
      value: 0.0 # token_embd is from glimmer
    - value: [0.25, 0.75, 1.0, 0.75, 0.25]
dtype: bfloat16
```
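In this second step, `t` is a gradient rather than a single value: the list `[0.25, 0.75, 1.0, 0.75, 0.25]` is spread across layer depth, while the `embed_tokens` filter pins `t` to 0.0 so the token embeddings come entirely from glimmer (with glimmer as `base_model`, `t = 0` keeps glimmer's tensor and `t = 1` takes Fallen-Gemma3's). The sketch below is an assumption about how that gradient maps to layers: the hypothetical helper `layer_t` spreads the anchors evenly over depth and interpolates linearly between them, and 62 is taken as Gemma 3 27B's decoder layer count.

```python
import numpy as np

ANCHORS = (0.25, 0.75, 1.0, 0.75, 0.25)  # the t gradient from the config above

def layer_t(layer_idx: int, num_layers: int, anchors=ANCHORS) -> float:
    """Approximate per-layer t: anchors spread evenly over depth, linear in between."""
    frac = layer_idx / max(num_layers - 1, 1)      # layer position in [0, 1]
    xs = np.linspace(0.0, 1.0, num=len(anchors))   # where each anchor value sits
    return float(np.interp(frac, xs, anchors))

# Early and late layers stay closer to glimmer (t near 0.25); the middle of the
# stack leans hardest on Fallen-Gemma3 (t = 1.0 means that layer's weights come
# entirely from Fallen-Gemma3).
for i in (0, 15, 31, 46, 61):
    print(f"layer {i:2d}: t ~ {layer_t(i, 62):.3f}")
```

Both configs should be reproducible with mergekit's `mergekit-yaml <config>.yaml <output_dir>` entry point, pointed at the local checkpoints listed above.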

Is it good?

I can't tell.

