
This is a Gemma model uploaded with the KerasHub library. It can be run on the JAX, TensorFlow, and PyTorch backends and is intended for causal language modeling (CausalLM).
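A minimal usage sketch with KerasHub. This is an illustration, not part of the original card: the preset path below is a placeholder for this repository's Hugging Face id, and it assumes you have accepted the model's access conditions and authenticated with the Hub.

```python
import os

# Select a backend before importing Keras; "tensorflow" and "torch"
# are the other supported options.
os.environ["KERAS_BACKEND"] = "jax"

import keras_hub

# Placeholder preset path; substitute this repository's Hugging Face id.
gemma_lm = keras_hub.models.GemmaCausalLM.from_preset("hf://<repo-id>")
gemma_lm.generate("What is Keras?", max_length=64)
```

`from_preset` downloads the gated weights on first use, so a valid Hugging Face token with access to this repository is required.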

Model config:

  • name: gemma_backbone
  • trainable: True
  • vocabulary_size: 256000
  • num_layers: 26
  • num_query_heads: 8
  • num_key_value_heads: 4
  • hidden_dim: 2304
  • intermediate_dim: 18432
  • head_dim: 256
  • layer_norm_epsilon: 1e-06
  • dropout: 0
  • query_head_dim_normalize: True
  • use_post_ffw_norm: True
  • use_post_attention_norm: True
  • final_logit_soft_cap: 30.0
  • attention_logit_soft_cap: 50.0
  • sliding_window_size: 4096
  • use_sliding_window_attention: True
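The config above is enough for a rough parameter-count estimate. The sketch below is back-of-the-envelope arithmetic under assumptions not stated in the card: tied input/output embeddings, a GeGLU feed-forward in which `intermediate_dim` (18432) is the gate and up projections concatenated (a KerasHub convention, so the per-projection width is 9216), and layer-norm weights ignored.

```python
# Rough parameter count from the config above (assumptions in the text).
vocab, layers = 256_000, 26
hidden, head_dim = 2304, 256
q_heads, kv_heads = 8, 4
ffw = 18_432 // 2  # per-projection width under the GeGLU assumption

embedding = vocab * hidden                  # tied with the output head
attn = (
    hidden * q_heads * head_dim             # query projection
    + 2 * hidden * kv_heads * head_dim      # key and value projections
    + q_heads * head_dim * hidden           # output projection
)
mlp = 3 * hidden * ffw                      # gate, up, and down projections
total_params = embedding + layers * (attn + mlp)
print(f"~{total_params / 1e9:.2f}B parameters")  # prints "~2.61B parameters"
```

The ~2.6B total is consistent with a Gemma 2 2B-class model, which these hyperparameters (26 layers, hidden_dim 2304, 8 query / 4 key-value heads) match.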

This model card has been generated automatically and should be completed by the model author. See Model Cards documentation for more information.
