Mgpt: A Fine-tuned Mixtral Model

Mgpt is a fine-tuned version of the Mixtral model, adapted for general-purpose natural language processing tasks. It leverages a large-scale pretrained language model to produce high-quality text outputs across a wide range of applications.
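For illustration, here is a minimal inference sketch using the Hugging Face transformers library. The repository id is a placeholder assumption and should be replaced with the actual Mgpt model path:

```python
# Minimal inference sketch using the Hugging Face transformers library.
# The repository id below is a placeholder -- substitute the actual Mgpt path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Mgpt"  # placeholder repository id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16 (see model details)
    device_map="auto",
)

prompt = "Explain transfer learning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```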

Overview

Mgpt is built upon the Mixtral model, a decoder-only Transformer in the GPT (Generative Pre-trained Transformer) family. The base model is pretrained on a diverse range of text data, and Mgpt is then adapted for specific tasks using transfer learning techniques.
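As a rough illustration of the transfer-learning step, the sketch below shows parameter-efficient fine-tuning with LoRA via the peft library. The base checkpoint name and hyperparameters are illustrative assumptions, not the actual Mgpt training recipe:

```python
# Illustrative LoRA fine-tuning sketch (not the actual Mgpt training recipe).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_id = "mistralai/Mixtral-8x7B-v0.1"  # assumed base checkpoint
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_config = LoraConfig(
    r=16,                                 # rank of the low-rank update matrices
    lora_alpha=32,                        # scaling factor for the LoRA updates
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights train
```

The wrapped model can then be trained with the standard transformers Trainer or any custom training loop.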

Model details

Format: Safetensors
Model size: 7.24B params
Tensor type: BF16