Can't load model

#1 opened by wonderfuldestruction

Tried BF16 and Q8_0 so far and got this error:

unable to load model: /root/.ollama/models/blobs/sha256-cc0b2a3f447e134cafd2853104d06227122cc280f4c9fee8c90172066174ef04

Same here: unable to load model. I tested with ollama.

LG AI Research org

Thank you for bringing this to our attention.

Unfortunately, EXAONE 4.0 is not currently supported by ollama.
You can find the PR related to this issue, and we expect ollama to be updated following the recent update to llama.cpp.
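Since the failure comes from the llama.cpp runtime bundled inside ollama rather than the GGUF files themselves, one way to check whether a newer runtime already handles EXAONE 4.0 is to try loading the file directly with an up-to-date llama.cpp build, for example through the llama-cpp-python bindings. This is only a minimal sketch, assuming llama-cpp-python is installed; the model file name below is a placeholder for whichever quantization you downloaded.

```python
# Minimal diagnostic sketch: try to load the EXAONE 4.0 GGUF with an
# up-to-date llama.cpp build via the llama-cpp-python bindings (assumed
# installed). MODEL_PATH is a hypothetical local file name; point it at
# the BF16 or Q8_0 file you downloaded.
from llama_cpp import Llama

MODEL_PATH = "./EXAONE-4.0-Q8_0.gguf"  # placeholder path

try:
    llm = Llama(model_path=MODEL_PATH, n_ctx=4096)
except Exception as err:
    # An unsupported architecture typically fails at load time, which
    # matches the "unable to load model" message seen in ollama.
    print(f"Load failed: {err}")
else:
    out = llm("Hello, EXAONE!", max_tokens=16)
    print(out["choices"][0]["text"])
```

If this loads successfully while ollama still fails, the remaining gap is ollama picking up the newer llama.cpp support rather than anything wrong with the GGUF files.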
