The Inference Code is not working properly

#1
by serdarcaglar - opened
# pip install transformers peft librosa

import transformers
import numpy as np
import librosa

pipe = transformers.pipeline(model='fixie-ai/ultravox-v0_4_1-llama-3_1-8b', trust_remote_code=True)

path = "<path-to-input-audio>"  # TODO: pass the audio here
audio, sr = librosa.load(path, sr=16000)


turns = [
  {
    "role": "system",
    "content": "You are a friendly and helpful character. You love to answer questions for people."
  },
]
pipe({'audio': audio, 'turns': turns, 'sampling_rate': sr}, max_new_tokens=30)  # generate up to 30 new tokens from the audio and chat turns

Running this returns the following error: AttributeError: 'NoneType' object has no attribute 'tokenizer'

Fixie.ai org

Thanks for reporting this. We have updated the model. Please let me know if there is still a problem.
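
If the old repository files are still cached locally, it may also be necessary to re-download them so the updated remote code is actually used. A minimal, untested sketch using huggingface_hub (whether a forced re-download is needed depends on your cache state, which is an assumption here):

# Untested sketch: force a fresh download of the repo so the updated
# remote code (model + custom pipeline) is picked up instead of a stale cache.
from huggingface_hub import snapshot_download

snapshot_download('fixie-ai/ultravox-v0_4_1-llama-3_1-8b', force_download=True)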

This could also be due to a newer transformers version overriding the processor field on our pipeline. I made a one-line fix here: https://huggingface.co/fixie-ai/ultravox-v0_4_1-llama-3_1-8b/commit/80e788d2a063ec4d631493db6614b482ddbde7e7

I haven't had a chance to verify the fix yet, but let us know if you try it.
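
Until the fix is confirmed, one possible user-side workaround is to check whether the pipeline's processor attribute ended up as None and, if so, reattach it manually before running inference. This is an untested sketch, not the committed fix; the assumption that the custom pipeline reads the processor from a `processor` attribute, and the AutoProcessor call, are both guesses about the remote code:

# Untested workaround sketch (not the committed fix): if a newer transformers
# version leaves the pipeline's processor unset, reload it and reattach it.
import transformers

MODEL_ID = 'fixie-ai/ultravox-v0_4_1-llama-3_1-8b'
pipe = transformers.pipeline(model=MODEL_ID, trust_remote_code=True)

if getattr(pipe, 'processor', None) is None:
    # Assumes the custom pipeline looks up its processor via `self.processor`.
    pipe.processor = transformers.AutoProcessor.from_pretrained(
        MODEL_ID, trust_remote_code=True
    )

If the error persists even with the processor attached, the cause is likely elsewhere in the remote code.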
