akanatas committed on
Commit ed246f9 (verified)
Parent(s): 480a220

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED

@@ -29,7 +29,8 @@ library_name: transformers
 > 🧪 The final model was merged using a scaling factor of **λ = 0.2**, which yielded the best overall performance across all task arithmetic variants evaluated.
 
 
-🔄 This model serves as an alternative to [**CultureMERT-95M**](https://huggingface.co/ntua-slp/CultureMERT-95M), where culturally specialized models are merged in weight space via task arithmetic to form a unified multi-cultural model. It builds directly on [CultureMERT-95M](https://huggingface.co/ntua-slp/CultureMERT-95M), using the same two-stage continual pre-training strategy, applied separately to each musical tradition prior to merging.
+
+🔄 This model serves as an alternative to [**CultureMERT-95M**](https://huggingface.co/ntua-slp/CultureMERT-95M). It merges culturally specialized models in weight space via task arithmetic to form a unified multi-cultural model. Each single-culture adapted model is obtained using the same two-stage continual pre-training strategy as CultureMERT-95M, applied separately to each musical tradition prior to merging.
 
 ---
 
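
The paragraph updated above describes standard task arithmetic: per-parameter task vectors (adapted weights minus base weights) are summed and added back to the base model with scaling factor λ = 0.2. Below is a minimal sketch of that computation under the assumption of plain PyTorch state dicts; the checkpoint paths and file names are hypothetical placeholders, not the actual CultureMERT artifacts or training code.

```python
# Minimal sketch of task arithmetic merging in weight space, assuming plain
# PyTorch state dicts. Checkpoint paths/names are hypothetical placeholders.
import torch

LAMBDA = 0.2  # scaling factor λ reported in the README


def merge_task_arithmetic(base_state, adapted_states, lam=LAMBDA):
    """Return theta_base + lam * sum_i (theta_i - theta_base), parameter-wise."""
    merged = {}
    for name, base_param in base_state.items():
        # Task vector of each culturally adapted model: its weights minus the base weights.
        task_vector_sum = sum(adapted[name] - base_param for adapted in adapted_states)
        merged[name] = base_param + lam * task_vector_sum
    return merged


# Hypothetical usage with checkpoints saved as state dicts:
# base = torch.load("mert_base.pt")
# adapted = [torch.load(p) for p in ("tradition_a.pt", "tradition_b.pt", "tradition_c.pt")]
# merged = merge_task_arithmetic(base, adapted)
# torch.save(merged, "merged_multicultural.pt")
```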