💨 QVikhr-2.5-1.5B-Instruct-r
An instruction-following model based on QVikhr-2.5-1.5B-Instruct-r, trained on the Russian-language ru Math dataset.
Description:
Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r is a language model that has undergone specialized training using the RuMath method.
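The card tags the Transformers library; below is a minimal inference sketch assuming the standard chat-template API. The prompt, dtype, and generation settings are illustrative assumptions, not the authors' recommended configuration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r"

# Load tokenizer and model; dtype/device placement here are illustrative.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto",
)

# Example Russian math prompt ("Solve the equation: 2x + 3 = 11").
messages = [{"role": "user", "content": "Реши уравнение: 2x + 3 = 11"}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```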
Authors
- Sergei Bratchikov, NLP Wanderer, Vikhr Team
- Aleksandr Nikolich, Vikhr Team
- Nikolay Kompanets, LakoMoor, Vikhr Team
- Konstantin Korolev, Vikhr Team
Citation
@inproceedings{nikolich2024vikhr,
title={Vikhr: Advancing Open-Source Bilingual Instruction-Following Large Language Models for Russian and English},
author={Aleksandr Nikolich and Konstantin Korolev and Sergei Bratchikov and Nikolay Kompanets and Igor Kiselev and Artem Shelmanov},
booktitle={Proceedings of the 4th Workshop on Multilingual Representation Learning (MRL) @ EMNLP-2024},
year={2024},
publisher={Association for Computational Linguistics},
url={https://arxiv.org/pdf/2405.13929}
}
Model tree for Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r_GGUF
- Base model: Qwen/Qwen2.5-1.5B
- Finetuned: Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r
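For the GGUF build referenced above, a hedged local-inference sketch with llama-cpp-python follows; the quantized file name is hypothetical, so substitute whichever .gguf file you actually download from the repository.

```python
from llama_cpp import Llama

# The GGUF file name below is hypothetical; point model_path at the
# quantized file downloaded from the _GGUF repository.
llm = Llama(
    model_path="QVikhr-2.5-1.5B-Instruct-r-Q4_K_M.gguf",
    n_ctx=4096,  # context window size
)

# Same example Russian math prompt as above.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Реши уравнение: 2x + 3 = 11"}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```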