meta-llama/Llama-3.2-1B-Instruct - W4A16 Compression

This model is a compressed version of meta-llama/Llama-3.2-1B-Instruct, produced with llmcompressor.
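
Checkpoints produced by llmcompressor are saved in the compressed-tensors format, which vLLM can load directly. A minimal usage sketch, assuming a recent vLLM release that exposes LLM.chat; the sampling settings are illustrative, not taken from this card:

```python
from vllm import LLM, SamplingParams

# Repo id as published for this checkpoint.
llm = LLM(model="espressor/meta-llama.Llama-3.2-1B-Instruct_W4A16")

# Illustrative sampling settings; the card does not state which were used.
params = SamplingParams(temperature=0.6, top_p=0.9, max_tokens=128)

# LLM.chat applies the model's chat template before generation.
outputs = llm.chat(
    [{"role": "user", "content": "Who is Alan Turing?"}],
    params,
)
print(outputs[0].outputs[0].text)
```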

Compression Configuration

  • Base Model: meta-llama/Llama-3.2-1B-Instruct
  • Compression Scheme: W4A16
  • Dataset: HuggingFaceH4/ultrachat_200k
  • Dataset Split: train_sft
  • Number of Samples: 512
  • Preprocessor: chat
  • Maximum Sequence Length: 131072
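
The exact compression script is not included in this card; the sketch below follows the standard llmcompressor one-shot flow for the configuration listed above. The choice of GPTQModifier as the quantization algorithm, the shuffle seed, and the output directory name are assumptions, and import paths can differ slightly between llmcompressor releases.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import GPTQModifier

MODEL_ID = "meta-llama/Llama-3.2-1B-Instruct"
NUM_CALIBRATION_SAMPLES = 512
MAX_SEQUENCE_LENGTH = 131072  # value listed in the configuration above

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Calibration data: ultrachat_200k, train_sft split, rendered through the chat template.
ds = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
ds = ds.shuffle(seed=42).select(range(NUM_CALIBRATION_SAMPLES))  # seed is an assumption

def preprocess(example):
    return {"text": tokenizer.apply_chat_template(example["messages"], tokenize=False)}

def tokenize(example):
    return tokenizer(
        example["text"],
        max_length=MAX_SEQUENCE_LENGTH,
        truncation=True,
        add_special_tokens=False,
    )

ds = ds.map(preprocess)
ds = ds.map(tokenize, remove_columns=ds.column_names)

# W4A16: 4-bit weights, 16-bit activations. GPTQ is an assumption about the algorithm used;
# the lm_head is left unquantized, as is typical for this scheme.
recipe = GPTQModifier(targets="Linear", scheme="W4A16", ignore=["lm_head"])

oneshot(
    model=model,
    dataset=ds,
    recipe=recipe,
    max_seq_length=MAX_SEQUENCE_LENGTH,
    num_calibration_samples=NUM_CALIBRATION_SAMPLES,
)

# Hypothetical output directory name.
model.save_pretrained("Llama-3.2-1B-Instruct_W4A16", save_compressed=True)
tokenizer.save_pretrained("Llama-3.2-1B-Instruct_W4A16")
```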

Sample Output

Prompt:

<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Cutting Knowledge Date: December 2023
Today Date: 28 Nov 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>

Who is Alan Turing?<|eot_id|>

Output:

<|begin_of_text|><|begin_of_text|><|start_header_id|>system<|end_header_id|>

Cutting Knowledge Date: December 2023
Today Date: 28 Nov 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>

Who is Alan Turing?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

Alan Turing (1912-1954) was a British mathematician, computer scientist, logician, and philosopher. He is widely recognized for his groundbreaking work in computer science, artificial intelligence, and cryptography. He is often referred to as the "father of computer science" and is considered one of the most influential figures in the development of modern computing.

Turing's most notable contributions include:

1. **The Turing Machine**: In 1936, Turing proposed a theoretical model of a machine that could simulate the behavior of a human thinker, which laid the foundation for modern computer architecture.
2. **Codebreaking during
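
The sample above can be reproduced with a plain transformers generation call. Loading the checkpoint this way requires the compressed-tensors package, and the decoding settings below are assumptions, since the card does not state how the sample was generated:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "espressor/meta-llama.Llama-3.2-1B-Instruct_W4A16"

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# The Llama 3.2 chat template inserts the system header with the knowledge-cutoff
# and current date, matching the prompt shown above.
messages = [{"role": "user", "content": "Who is Alan Turing?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decoding and the token budget are illustrative; the sample output above
# is truncated, presumably by whatever limit was used when the card was generated.
output = model.generate(input_ids, max_new_tokens=128, do_sample=False)

# Decoding the full sequence includes the prompt, as in the Output shown above.
print(tokenizer.decode(output[0]))
```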

Evaluation

Model size: 655M parameters (Safetensors; tensor types: I64, I32, BF16)