Attempt to reproduce Mixture-of-LoRAs classifier

Mixture-of-LoRAs: An Efficient Multitask Tuning for Large Language Models

https://arxiv.org/pdf/2403.03432

Datasets

We evenly sample roughly 10k training examples and 2k validation examples from each dataset.
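A minimal sketch of such a per-dataset split. The 10k/2k sizes come from the line above; the concrete sampling strategy (seeded shuffle, then slice) is an assumption, since the card does not specify one:

```python
import random

def split_dataset(examples, n_train=10_000, n_val=2_000, seed=42):
    """Sample a fixed-size train/validation split from one dataset.

    Sizes follow the "about 10k training / 2k validation" per dataset;
    the shuffle-then-slice strategy is an assumption.
    """
    rng = random.Random(seed)
    pool = examples[:]  # copy so the caller's list is untouched
    rng.shuffle(pool)
    train = pool[:n_train]
    val = pool[n_train:n_train + n_val]
    return train, val

# Toy data standing in for one task's examples
data = [{"text": f"example {i}"} for i in range(15_000)]
train, val = split_dataset(data)
```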

From laion/OIG, only the following files were taken:

  • unified_merged_code_xp3.jsonl
  • unified_grade_school_math_instructions.jsonl
  • unified_mathqa_flanv2_kojma_cot.jsonl
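A hedged sketch of reading only those jsonl files and tagging each record with its source file, which can serve as the task label for the classifier. The directory layout and the record schema are assumptions; the demo writes stand-in files rather than downloading the real laion/OIG repo:

```python
import json
import tempfile
from pathlib import Path

# The three files selected from laion/OIG (names taken from the list above)
OIG_FILES = [
    "unified_merged_code_xp3.jsonl",
    "unified_grade_school_math_instructions.jsonl",
    "unified_mathqa_flanv2_kojma_cot.jsonl",
]

def load_labeled(root):
    """Read the selected jsonl files, labeling each record with its source file."""
    records = []
    for name in OIG_FILES:
        with open(Path(root) / name, encoding="utf-8") as f:
            for line in f:
                if line.strip():
                    rec = json.loads(line)
                    rec["task"] = name  # label for the task classifier
                    records.append(rec)
    return records

# Demo with stand-in files (the real files come from the laion/OIG repo)
with tempfile.TemporaryDirectory() as root:
    for name in OIG_FILES:
        (Path(root) / name).write_text(json.dumps({"text": "hi"}) + "\n")
    recs = load_labeled(root)
```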