How can I use this model in LlamaIndex for function calling?

#3 by argool - opened

Question
How can I use function calling with a model from Hugging Face?

This code works.

from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.llms.openai import OpenAI

llm_model = OpenAI(
    api_key=OPEN_AI_KEY,
    model="gpt-3.5-turbo",
    temperature=0,
)

def setup_agent(self):
    tools = self.registry.get_tools_for_analysis()
    self.agent_worker = FunctionCallingAgentWorker.from_tools(tools, llm=llm_model, verbose=True)
    self.agent = self.agent_worker.as_agent()
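For context, once setup_agent has run I call the agent roughly like this (just a minimal sketch of my usage; run_analysis is only an illustrative name, and chat is the standard AgentRunner method):

def run_analysis(self, query: str):
    # the agent sends the query to gpt-3.5-turbo, which decides which registered tools to call
    response = self.agent.chat(query)
    return str(response)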

But this does not work.

from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI

llm_model = HuggingFaceInferenceAPI(
    model_name='NousResearch/Hermes-3-Llama-3.2-3B',
    api_key=HF_TOKEN,
)

def setup_agent(self):
    tools = self.registry.get_tools_for_analysis()
    self.agent_worker = FunctionCallingAgentWorker.from_tools(tools, llm=llm_model, verbose=True)
    self.agent = self.agent_worker.as_agent()

I also tried wrapping HuggingFaceInferenceAPI in FunctionCallingLLM, but it still does not work as I expected.
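From reading the LlamaIndex source, my understanding is that FunctionCallingAgentWorker rejects any LLM whose metadata does not advertise function-calling support, so a quick sanity check looks something like this (a sketch based on my understanding, not an official recipe):

# assuming llm_model is the HuggingFaceInferenceAPI instance defined above;
# FunctionCallingAgentWorker refuses the LLM when this flag is False
print(llm_model.metadata.is_function_calling_model)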

So, my questions are:

Does LlamaIndex support function calling with models other than OpenAI?
If so, how do I implement it?

Thank you.
