Mistral-NeMo-Minitron-8B-ARChitects-ReArc1200-bnb-4bit
Model Overview
Mistral-NeMo-Minitron-8B-ARChitects-ReArc1200-bnb-4bit is a retrained variant of NVIDIA's Mistral-NeMo-Minitron-8B-Base, finetuned specifically to solve ARC-AGI tasks. To save GPU memory, the embedding and vocabulary size have been reduced to only 77 tokens. The model achieved a score of 71.6% (with test-time retraining) on the ARC-AGI public evaluation set, with only the ReArc dataset used during finetuning. Please refer to our GitHub repository for more details. For more models tuned for ARC-AGI, check out our model collection.
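A back-of-envelope sketch of why shrinking the vocabulary saves memory. The hidden size (4096) and original vocabulary size (131,072) below are assumptions for illustration, not figures from this model card:

```python
# Hypothetical arithmetic: parameters saved per embedding matrix when the
# vocabulary is cut to 77 tokens. HIDDEN and ORIG_VOCAB are assumed values,
# not taken from the model card.
HIDDEN = 4096        # assumed hidden dimension
ORIG_VOCAB = 131072  # assumed original vocabulary size
NEW_VOCAB = 77       # reduced vocabulary from the model card

def embedding_params(vocab_size: int, hidden_size: int) -> int:
    """Parameter count of a vocab_size x hidden_size embedding matrix."""
    return vocab_size * hidden_size

saved = embedding_params(ORIG_VOCAB, HIDDEN) - embedding_params(NEW_VOCAB, HIDDEN)
print(f"{saved:,} parameters saved per embedding matrix")
```

Under these assumptions, each of the input embedding and output projection matrices shrinks by roughly half a billion parameters.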
Finetuning Datasets
This model was finetuned on the following datasets:
- the ReArc dataset by Michael Hodel
License
This model is released under the NVIDIA Open Model License Agreement.
Usage
This model can be used with the transformers or unsloth packages. For more information on preprocessing the ARC Prize tasks to generate prompts for the model, please refer to our GitHub repository.
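As a rough illustration of the preprocessing step, the sketch below serialises an ARC task (JSON with "train" input/output grid pairs and a "test" input) into a compact text prompt. The exact format used for this model is defined in the GitHub repository; the digit-per-cell, row-per-line encoding and the "I"/"O" markers here are illustrative guesses suited to a small character-level vocabulary:

```python
# Hypothetical sketch of ARC task -> prompt serialisation. The encoding
# (digit rows, "I"/"O" markers) is an assumption, not the authors' format.

def grid_to_text(grid):
    """Render a grid of colour indices (0-9) as one digit row per line."""
    return "\n".join("".join(str(cell) for cell in row) for row in grid)

def task_to_prompt(task):
    """Concatenate the train pairs and the first test input into a prompt,
    ending after an 'O' marker so the model continues with the answer grid."""
    parts = []
    for pair in task["train"]:
        parts.append("I\n" + grid_to_text(pair["input"]))
        parts.append("O\n" + grid_to_text(pair["output"]))
    parts.append("I\n" + grid_to_text(task["test"][0]["input"]) + "\nO\n")
    return "\n".join(parts)

example = {
    "train": [{"input": [[0, 1], [1, 0]], "output": [[1, 0], [0, 1]]}],
    "test": [{"input": [[2, 2], [0, 0]]}],
}
print(task_to_prompt(example))
```

The resulting string can then be tokenised with the model's reduced 77-token vocabulary and fed to the model for completion.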