# Meta-Llama-3.1-405B-Instruct-v16-k65536-256-woft-perm

This model is based on https://huggingface.co/VPTQ-community/Meta-Llama-3.1-405B-Instruct-v16-k65536-256-woft, with the permutations absorbed into the weights via:

```sh
pip install vptq && python -m vptq.tools.pre_process --input_path --output_path
```
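
For reference, a minimal usage sketch with the `vptq` package. The local path below is a placeholder for whatever `--output_path` you passed to the pre-processing step (or the repo id you uploaded the result to); adjust it as needed.

```python
import transformers
import vptq

# Placeholder: point this at the --output_path used above, or the uploaded repo id.
model_path = "./Meta-Llama-3.1-405B-Instruct-v16-k65536-256-woft-perm"

# The pre-processed directory is assumed to still contain the tokenizer files
# copied over from the original VPTQ-community checkpoint.
tokenizer = transformers.AutoTokenizer.from_pretrained(model_path)
model = vptq.AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

inputs = tokenizer("Explain vector post-training quantization in one sentence.",
                   return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```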