Grok-1

This repository contains the weights of the Grok-1 open-weights model. You can find the code in the GitHub Repository.

Download instructions

Clone the repo and download the int8 checkpoint to the checkpoints directory by running:

```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install 'huggingface_hub[hf_transfer]'
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
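If you prefer to script the download rather than use the CLI, the same fetch can be expressed with the huggingface_hub Python API. This is a hedged sketch, not part of the official instructions: it assumes huggingface_hub is installed as above, and the function name fetch_checkpoint is illustrative. Calling it starts the very large transfer, so it is only defined here.

```python
# Sketch: the same download as the CLI command above, via the
# huggingface_hub Python API (snapshot_download).
from huggingface_hub import snapshot_download

def fetch_checkpoint(local_dir: str = "checkpoints") -> str:
    """Download only the ckpt-0 files of xai-org/grok-1 into local_dir.

    Returns the path to the local snapshot directory.
    """
    return snapshot_download(
        repo_id="xai-org/grok-1",
        repo_type="model",
        allow_patterns="ckpt-0/*",  # skip everything except the checkpoint
        local_dir=local_dir,
    )
```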

Then, you can run:

```shell
pip install -r requirements.txt
python run.py
```

You should see output from the language model.

Due to the large size of the model (314B parameters), a multi-GPU machine is required to test the model with the example code.
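Before attempting the full load, it can help to confirm which accelerators JAX (the framework the example code uses, installed via requirements.txt) can actually see. A minimal check, assuming JAX is already installed:

```python
# Sanity check before loading the model: list the accelerators JAX can
# see. On a machine without GPUs this reports CPU devices only, which
# is not enough to run the full 314B-parameter model.
import jax

print("devices:", jax.devices())
print("device count:", jax.device_count())
```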

p.s. we're hiring: https://x.ai/careers
