# CultureMERT: Continual Pre-Training for Cross-Cultural Music Representation Learning
📑 [**Read the full paper (to be presented at ISMIR 2025)**](...TODO)
**CultureMERT-TA-95M** is a 95M-parameter music foundation model adapted to diverse musical cultures through [**task arithmetic**](https://arxiv.org/abs/2212.04089). Instead of direct continual pre-training on a multi-cultural mixture, as in [CultureMERT-95M](https://huggingface.co/ntua-slp/CultureMERT-95M), this model merges multiple **single-culture adapted** variants of [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M)—each continually pre-trained via our two-stage strategy on a distinct musical tradition:
| Dataset | Music Tradition | Hours Used |
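The merging step described above can be sketched as follows. This is a minimal, hypothetical illustration of task arithmetic over checkpoints stored as name-to-array dicts: each task vector is an adapted model's weights minus the base weights, and the merged model adds a scaled combination of these vectors back to the base. The scaling coefficient `lam` and the use of an averaged (rather than summed) task vector are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def merge_task_arithmetic(base, adapted_models, lam=1.0):
    """Merge single-culture-adapted checkpoints into a base model via
    task arithmetic (Ilharco et al., 2022).

    base            dict mapping parameter name -> np.ndarray (base weights)
    adapted_models  list of dicts with the same keys/shapes as `base`
    lam             scaling coefficient for the combined task vector
                    (hypothetical default; the paper may use another value)
    """
    merged = {}
    for name, w_base in base.items():
        # Task vector for each adapted model: adapted weights minus base weights.
        task_vectors = [m[name] - w_base for m in adapted_models]
        # Add the averaged, scaled task vector back onto the base weights.
        merged[name] = w_base + lam * np.mean(task_vectors, axis=0)
    return merged

# Toy example with one 2-element "layer" and two adapted models.
base = {"w": np.array([1.0, 1.0])}
adapted = [{"w": np.array([2.0, 1.0])},  # task vector [1, 0]
           {"w": np.array([1.0, 3.0])}]  # task vector [0, 2]
merged = merge_task_arithmetic(base, adapted, lam=1.0)
# Averaged task vector is [0.5, 1.0], so merged weights are [1.5, 2.0].
```

In practice the same arithmetic would be applied to each tensor of a model's state dict; the toy dicts above only stand in for real checkpoints.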