Ollama tool calls not working via openai proxy only when using langgraph #1153
Comments
Oh interesting. How many times did you run both versions? Is it reliably different in both contexts? And then you've confirmed the ollama versions are the same in both scenarios?
Hi @hinthornw , I ran them multiple times (and once more just now, just to be sure) and the behaviour is consistent. Tested both on ollama v0.3.0.
It seems to be caused by ...
```python
import asyncio

from langchain_core.messages import HumanMessage

# `app` is the compiled langgraph agent from the snippet above.
async def test():
    async for event in app.astream(
        {"messages": [HumanMessage(content="what is the weather in sf")]},
        config={"configurable": {"thread_id": 42}},
    ):
        print(event)

asyncio.run(test())
```

Output:

```
{'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_rs3ykbgl', 'function': {'arguments': '{"query":"sf weather"}', 'name': 'search'}, 'type': 'function'}]}, response_metadata={'token_usage': {'completion_tokens': 17, 'prompt_tokens': 147, 'total_tokens': 164}, 'model_name': 'llama3.1', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'stop', 'logprobs': None}, id='run-3175464b-50ce-4fa7-afc7-73fe14cf92ee-0', tool_calls=[{'name': 'search', 'args': {'query': 'sf weather'}, 'id': 'call_rs3ykbgl', 'type': 'tool_call'}], usage_metadata={'input_tokens': 147, 'output_tokens': 17, 'total_tokens': 164})]}}
{'tools': {'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='search', tool_call_id='call_rs3ykbgl')]}}
{'agent': {'messages': [AIMessage(content='Based on the tool call response, I can format an answer to your original question:\n\nThe current weather in San Francisco (SF) is 60 degrees with fog.', response_metadata={'token_usage': {'completion_tokens': 34, 'prompt_tokens': 86, 'total_tokens': 120}, 'model_name': 'llama3.1', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'stop', 'logprobs': None}, id='run-0c048f94-63ae-45d1-9808-4083dd65ec0d-0', usage_metadata={'input_tokens': 86, 'output_tokens': 34, 'total_tokens': 120})]}}
```
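For anyone who wants to inspect the tool calls programmatically rather than eyeball the printed events, the stream events above are plain dicts of node name to message list. A minimal sketch (the helper and its event shape are taken from the output pasted above; it assumes nothing beyond that):

```python
def extract_tool_calls(event):
    """Collect tool calls from an {'agent': {'messages': [...]}} stream event."""
    calls = []
    for message in event.get("agent", {}).get("messages", []):
        if isinstance(message, dict):
            calls.extend(message.get("tool_calls", []))
        else:
            # AIMessage objects expose a parsed .tool_calls list.
            calls.extend(getattr(message, "tool_calls", []) or [])
    return calls
```

Feeding the first event above into this helper yields the single `search` tool call; the `{'tools': ...}` event yields nothing, since only agent messages carry `tool_calls`.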
I encountered the same issue. How can I get tool_calls events when using astream_events? Should I set a specific config, or is it not supported in langgraph?
It's not working for me... I still cannot call functions:

```
{'messages': ['what is the weather in sf',
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_wurji20e', 'function': {'arguments': '{"query":"weather in sf"}', 'name': 'search'}, 'type': 'function'}], 'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 18, 'prompt_tokens': 209, 'total_tokens': 227, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'llama3.1', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-8834569e-3531-4c0e-a47b-bca59c600db1-0', tool_calls=[{'name': 'search', 'args': {'query': 'weather in sf'}, 'id': 'call_wurji20e', 'type': 'tool_call'}], usage_metadata={'input_tokens': 209, 'output_tokens': 18, 'total_tokens': 227, 'input_token_details': {}, 'output_token_details': {}})]}
```
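One detail worth noticing when comparing the outputs in this thread: the working run reports `finish_reason: 'tool_calls'` while the earlier one reports `'stop'`, even though both carry a parsed `tool_calls` list. A router keyed on `finish_reason` alone would therefore behave inconsistently; checking the parsed tool calls directly is more robust. A hedged sketch of such a routing condition (the helper is illustrative, not langgraph's actual tools-condition):

```python
def should_call_tools(response_metadata, tool_calls):
    """Route to the tools node when the model actually requested a tool."""
    # Prefer the parsed tool_calls list; fall back to finish_reason as a hint.
    if tool_calls:
        return True
    return response_metadata.get("finish_reason") == "tool_calls"
```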
Thanks, it works.
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
No response
Description
I want to invoke a tool-calling-compatible Ollama model through a `ChatOpenAI` proxy. However, using the code above, the model does not properly call tools. The behaviour is different when using just langchain: that way, the model correctly makes a tool call.
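The "Example Code" section above is empty, so the exact setup is unknown. As a hedged sketch of the mechanics involved: pointing an OpenAI-style client at Ollama's OpenAI-compatible endpoint (conventionally `http://localhost:11434/v1`) sends a `chat/completions` request whose body looks like the one built below. Only the JSON payload is constructed here — no request is sent, and the endpoint URL, model name, and tool schema are assumptions for illustration:

```python
def build_tool_call_payload(model, user_message, tools):
    """Build an OpenAI-style chat/completions body with bound tools.

    `tools` is a list of (name, description, json_schema_parameters) triples.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        # OpenAI-style tool schema: each entry wraps a function definition.
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": name,
                    "description": description,
                    "parameters": parameters,
                },
            }
            for name, description, parameters in tools
        ],
    }
```

A client like `ChatOpenAI(base_url="http://localhost:11434/v1", model="llama3.1")` with tools bound would POST a body of this shape, which is why the responses in this thread carry OpenAI-style `tool_calls` entries.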
System Info
langchain==0.2.7
langchain-anthropic==0.1.20
langchain-cohere==0.1.5
langchain-community==0.2.7
langchain-core==0.2.21
langchain-google-genai==1.0.5
langchain-ollama==0.1.0
langchain-openai==0.1.17
langchain-qdrant==0.1.1
langchain-text-splitters==0.2.0
langchain-weaviate==0.0.1.post1
platform: mac silicon
python version: Python 3.12.2