Mixture of Experts
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Paper • 2409.16040 • Published Sep 24, 2024 • 16
Useful Models
alaa-lab/InstructCV • Image-to-Image • Updated Feb 19, 2024 • 5 • 10
Vision-CAIR/vicuna-7b • Text Generation • Updated May 22, 2023 • 1.3k • 24
shibing624/ziya-llama-13b-medical-merged • Text Generation • Updated Feb 19, 2024 • 13 • 26
chaoyi-wu/PMC_LLAMA_7B • Text Generation • Updated May 17, 2023 • 1.29k • 65
svorwerk/setfit-fine-tuned-demo-class_hpo • Text Classification • 0.1B • Updated Jan 27, 2024 • 1