
Olmo-7B missing checkpoints (model revision) + incompatible with vllm

#3 by sert121 - opened
  • The model hasn't been fully ported from OLMo-7B: it is missing the intermediate checkpoints (model revisions) that were available for OLMo-7B. The older model currently throws errors in vLLM as well, so at the moment both models seem unusable with vLLM.
sert121 changed discussion title from Olmo-7B missing checkpoints (model revision) to Olmo-7B missing checkpoints (model revision) + incompatible with vllm

Hello @sert121 , OLMo is compatible with vLLM. You can check the supported-models list here: https://docs.vllm.ai/en/latest/models/supported_models.html.
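For reference, running the HF-format model under vLLM looks roughly like this. This is a sketch, not a verified recipe: the model ID `allenai/OLMo-7B-hf` and the `trust_remote_code=True` flag are assumptions, and the snippet needs a GPU plus a model download to actually run.

```python
# Sketch: generating with OLMo via vLLM (assumes vllm is installed and a GPU is available).
from vllm import LLM, SamplingParams

# The model ID is an assumption; substitute whichever HF-format OLMo revision you are using.
llm = LLM(model="allenai/OLMo-7B-hf", trust_remote_code=True)

params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["OLMo is a language model that"], params)
print(outputs[0].outputs[0].text)
```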
If there are particular intermediate checkpoints you are interested in using, one option is to convert them to HF format yourself (it takes maybe 5-10 minutes per checkpoint once the checkpoint is on your machine). The instructions are in checkpoints.md. The idea is to find the official checkpoint you want in https://github.com/allenai/OLMo/blob/main/checkpoints/official and then use convert_olmo_to_hf_new.py to convert it to HF format.
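The conversion workflow above might look something like the following. This is a sketch under assumptions: the exact flag names and the tokenizer path are not confirmed here, so check checkpoints.md and the script's `--help` output for the real interface, and the placeholder paths are yours to fill in.

```shell
# Sketch: converting an official OLMo checkpoint to HF format.
# Flag names and paths are assumptions -- verify against checkpoints.md and the script's --help.

git clone https://github.com/allenai/OLMo.git
cd OLMo

# Download one intermediate checkpoint; the URL for your revision is listed
# under checkpoints/official in the repo.

# Convert to HF format (roughly 5-10 minutes per checkpoint).
python scripts/convert_olmo_to_hf_new.py \
    --input_dir /path/to/downloaded/olmo/checkpoint \
    --output_dir /path/to/hf/output
```

Once converted, the output directory can be passed directly as the model path to `transformers` or vLLM.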
