from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained('quidangz/LlamaNER-8B-Instruct-ZeroShot')
model = AutoModelForCausalLM.from_pretrained(
    'quidangz/LlamaNER-8B-Instruct-ZeroShot',
    torch_dtype="auto",
    device_map="cuda",
)

if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
    model.config.pad_token_id = model.config.eos_token_id

user_prompt = """
            Extract entities from the text **strictly using ONLY the provided Entity List** below and **MUST** strictly adhere to the output format.
            Format output as '<entity tag>: <entity name>' and separated multiple entities by '|'. Return 'None' if no entities are identified.
            Entity List: {ner_labels}
            Text: {text}
"""

query = 'Hence, quercetin effectively reversed NAFLD symptoms by decreased triacyl glycerol accumulation, insulin resistance, inflammatory cytokine secretion and increased cellular antioxidants in OA induced hepatic steatosis in HepG2 cells.'
ner_labels = ['Chemical']

user_prompt = user_prompt.format(ner_labels=ner_labels, text=query)

messages = [
  {
      "role": "system",
      "content": "You are an expert in Named Entity Recognition (NER) task."
  },
  {
      "role": "user",
      "content": user_prompt
  }
]

text = tokenizer.apply_chat_template(
        messages,
        tokenize=False,
        add_generation_prompt=True
    )
    
model_inputs = tokenizer(text, return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=512,
)

generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]

print(response) # Chemical: quercetin | Chemical: triacyl glycerol
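Since the model emits entities as pipe-separated `'<entity tag>: <entity name>'` pairs (or the literal string `'None'`), the raw response can be turned into structured output with a small amount of string handling. The helper below is a sketch, not part of the model card; `parse_entities` is a hypothetical name:

```python
def parse_entities(response: str):
    """Parse "<tag>: <name> | <tag>: <name>" output into (tag, name) tuples.

    Returns an empty list when the model reports no entities ('None').
    """
    if response.strip() == "None":
        return []
    entities = []
    for chunk in response.split("|"):
        # partition on the first ':' so entity names may themselves be multi-word
        tag, _, name = chunk.partition(":")
        if name:
            entities.append((tag.strip(), name.strip()))
    return entities

print(parse_entities("Chemical: quercetin | Chemical: triacyl glycerol"))
# [('Chemical', 'quercetin'), ('Chemical', 'triacyl glycerol')]
```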

Contact

Email: [email protected]

LinkedIn: Qui Dang

Facebook: Đặng Bá Qúi

Citation

Please cite as

@misc{LlamaNER-8B-Instruct-ZeroShot,
  title={LlamaNER: A Large Language Model for Named Entity Recognition},
  author={Qui Dang Ba},
  year={2025},
  publisher={Hugging Face},
}