chatml-function-calling chat format fails to generate multi calls to the same tool #1756

Open
4 tasks done
jeffmaury opened this issue Sep 23, 2024 · 1 comment · May be fixed by #1758

@jeffmaury

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

The response should contain multiple tool calls to the get_current_weather function, one per city in the question.

Current Behavior

Only a single tool call is generated.

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.

  • Physical (or virtual) hardware you are using, e.g. for Linux: Win11Pro on laptop

Failure Information (for bugs)

Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.

Steps to Reproduce

Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.

Download model file from https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF/resolve/main/Mistral-7B-Instruct-v0.3.Q4_K_M.gguf


from llama_cpp.llama import Llama

# Load the model with the chatml-function-calling chat handler
llm = Llama(
    model_path="Mistral-7B-Instruct-v0.3.Q4_K_M.gguf",
    chat_format="chatml-function-calling",
)

SYSTEM_MESSAGE="""
You are a helpful assistant.
You can call functions with appropriate input when necessary.
You can call the same function several times.
"""

weather = {
  "type": "function",
  "function": {
    "name": "get_current_weather",
    "description": "Get the current weather in a given latitude and longitude",
    "parameters": {
      "type": "object",
      "properties": {
        "latitude": {
          "type": "number",
          "description": "The latitude of a place",
        },
        "longitude": {
          "type": "number",
          "description": "The longitude of a place",
        },
      },
      "required": ["latitude", "longitude"],
    },
  },
}

question = "What's the weather like in the following cities: Sydney and Paris?"
messages = [
  {"role": "system", "content": SYSTEM_MESSAGE},
  {"role": "user", "content": question}
]
response = llm.create_chat_completion_openai_v1(messages, tools=[weather], tool_choice="auto")
print(response)
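For reference, a response carrying the expected multiple tool calls would be consumed roughly like this (a sketch; the attribute layout follows the OpenAI-v1-style response objects that create_chat_completion_openai_v1 returns, and the mock below merely illustrates the expected shape — the real response comes from the llm call above):

```python
import json
from types import SimpleNamespace


def extract_calls(response):
    """Return (function_name, arguments_dict) pairs from a chat completion response."""
    message = response.choices[0].message
    return [
        (call.function.name, json.loads(call.function.arguments))
        for call in (message.tool_calls or [])
    ]


# Mock object standing in for the *expected* response: two calls to the same tool.
mock = SimpleNamespace(choices=[SimpleNamespace(message=SimpleNamespace(tool_calls=[
    SimpleNamespace(function=SimpleNamespace(
        name="get_current_weather",
        arguments='{"latitude": -33.87, "longitude": 151.21}')),
    SimpleNamespace(function=SimpleNamespace(
        name="get_current_weather",
        arguments='{"latitude": 48.85, "longitude": 2.35}')),
]))])

for name, args in extract_calls(mock):
    print(name, args)
```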

Failure Logs

N/A

@CISC
Contributor

CISC commented Sep 24, 2024

See #1503; although it was done mainly for built-in chat templates, it should also work for the chatml-function-calling format, albeit currently only when selecting the tool explicitly, not with auto, though I'm sure that can be made to work as well.
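Selecting the tool explicitly, as the comment above suggests, means passing an OpenAI-style tool_choice dict that names the function instead of "auto" (a sketch, reusing the llm, messages, and weather objects from the report above; the commented-out call shows where the dict would be used):

```python
# Name the tool explicitly instead of passing tool_choice="auto".
forced_tool_choice = {
    "type": "function",
    "function": {"name": "get_current_weather"},
}

# response = llm.create_chat_completion_openai_v1(
#     messages,
#     tools=[weather],
#     tool_choice=forced_tool_choice,
# )
print(forced_tool_choice["function"]["name"])
```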

jeffmaury added a commit to jeffmaury/llama-cpp-python that referenced this issue Sep 24, 2024
@jeffmaury jeffmaury linked a pull request Sep 24, 2024 that will close this issue