Quantized version of brucethemoose's Yi-34B-200K-DARE-merge-v5.

Fits into 24 GB of VRAM with 16k context on Windows.

The pippa_cleaned dataset was used for calibration, with a token length of 8192.
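
Below is a minimal loading sketch, assuming this is an ExLlamaV2 (EXL2) quant — the calibration-dataset workflow above suggests it, but the card does not say so explicitly. The model path and sampler settings are hypothetical placeholders; `max_seq_len` is set to 16k per the note above.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/Yi-34B-200K-DARE-merge-v5-quant"  # hypothetical local path
config.prepare()
config.max_seq_len = 16384  # 16k context, matching the VRAM note above

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # fill available VRAM layer by layer

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # illustrative value, not from the card

print(generator.generate_simple("Once upon a time,", settings, 200))
```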
