[Experimental model]

This model is an experiment using the frankenstein script from https://huggingface.co/chargoddard/llama2-22b, run with `BLOCK_DIAGONAL = False`.
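The frankenstein script itself isn't reproduced here, but to illustrate the kind of choice a `BLOCK_DIAGONAL` flag suggests, here is a minimal, hypothetical sketch of two ways to combine weight matrices from two models: stacking them block-diagonally (each model keeps its own subspace) versus interpolating same-shaped weights directly. The function names and the mapping to the actual script's behavior are assumptions, not taken from the source.

```python
import numpy as np

def block_diagonal_merge(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Place two weight matrices on the diagonal of a larger zero matrix.

    Hypothetical illustration: each model's weights occupy their own block,
    so the merged matrix is larger than either input.
    """
    out = np.zeros((a.shape[0] + b.shape[0], a.shape[1] + b.shape[1]),
                   dtype=a.dtype)
    out[:a.shape[0], :a.shape[1]] = a
    out[a.shape[0]:, a.shape[1]:] = b
    return out

def interpolated_merge(a: np.ndarray, b: np.ndarray,
                       alpha: float = 0.5) -> np.ndarray:
    """Linear interpolation of two same-shaped weight matrices.

    Keeps the original shape, blending the two models' parameters.
    """
    return alpha * a + (1.0 - alpha) * b
```

With `BLOCK_DIAGONAL = False`, the merge presumably avoids the block-diagonal construction; the interpolation above is only one plausible alternative shown for contrast.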

Using: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16 + Then used https://huggingface.co/upstage/llama-30b-instruct-2048 as donor model.

Merging these models used about 160 GB of system RAM; they merge quickly without swap.

For the prompt template and model information, see huginnV1.
