# whisper-indic-merged

This is a merge of pre-trained Whisper models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with openai/whisper-large-v2 as the base model.
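To give a sense of what DARE TIES does, here is a toy sketch on flat NumPy arrays: each fine-tuned model's delta from the base is randomly sparsified and rescaled (DARE), then per-parameter signs are elected by weighted majority and only agreeing components are summed (TIES). The function name and all simplifications are illustrative, not mergekit's actual implementation.

```python
import numpy as np

def dare_ties(base, finetuned, weights, density, seed=0):
    """Toy DARE-TIES merge over flat parameter arrays (illustrative only)."""
    rng = np.random.default_rng(seed)
    deltas = []
    for ft in finetuned:
        delta = ft - base
        # DARE: keep each delta entry with probability `density`,
        # rescale survivors by 1/density to preserve expectation
        mask = rng.random(delta.shape) < density
        deltas.append(np.where(mask, delta / density, 0.0))
    # Apply per-model merge weights (mergekit's `normalize: true` would
    # additionally divide by the sum of weights; omitted here)
    weighted = [w * d for w, d in zip(weights, deltas)]
    # TIES: elect a sign per parameter from the weighted sum of deltas
    sign = np.sign(np.sum(weighted, axis=0))
    # Keep only delta components that agree with the elected sign
    agree = [np.where(np.sign(d) == sign, d, 0.0) for d in weighted]
    return base + np.sum(agree, axis=0)
```

With `density=1.0` no entries are dropped, so the result is just the sign-filtered weighted sum of deltas added back to the base.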

### Models Merged

The following models were included in the merge:

- DrishtiSharma/whisper-large-v2-malayalam
- DrishtiSharma/whisper-large-v2-marathi
- DrishtiSharma/whisper-large-v2-punjabi

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: dare_ties
base_model: openai/whisper-large-v2
modules:
  decoder:
    slices:
      - sources:
          - model: openai/whisper-large-v2
            layer_range: [0, 32]
            parameters:
              weight: 0.4
          - model: DrishtiSharma/whisper-large-v2-malayalam
            layer_range: [0, 32]
            parameters:
              weight: 0.2
          - model: DrishtiSharma/whisper-large-v2-marathi
            layer_range: [0, 32]
            parameters:
              weight: 0.2
          - model: DrishtiSharma/whisper-large-v2-punjabi
            layer_range: [0, 32]
            parameters:
              weight: 0.2
    parameters:
      density: 0.5

  encoder:
    slices:
      - sources:
          - model: openai/whisper-large-v2
            layer_range: [0, 32]
            parameters:
              weight: 0.4
          - model: DrishtiSharma/whisper-large-v2-malayalam
            layer_range: [0, 32]
            parameters:
              weight: 0.2
          - model: DrishtiSharma/whisper-large-v2-marathi
            layer_range: [0, 32]
            parameters:
              weight: 0.2
          - model: DrishtiSharma/whisper-large-v2-punjabi
            layer_range: [0, 32]
            parameters:
              weight: 0.2
    parameters:
      density: 0.7

parameters:
  normalize: true

dtype: bfloat16
tokenizer:
  source: union
model_type: whisper
```
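Assuming mergekit is installed, a configuration like the one above can typically be run with the `mergekit-yaml` CLI (file and directory names here are illustrative):

```shell
# Install mergekit (assumes the current PyPI package name)
pip install mergekit

# Run the merge; config.yml holds the YAML above, ./merged is an
# arbitrary output directory for the merged checkpoint
mergekit-yaml config.yml ./merged
```

The output directory can then be loaded like any other Whisper checkpoint via `transformers`.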