
Fix issue #4912: [Bug]: BedrockException: "The number of toolResult blocks at messages.2.content exceeds the number of toolUse blocks of previous turn.". #4937

Merged · 9 commits · Nov 12, 2024

Conversation

openhands-agent (Contributor) commented Nov 12, 2024:

This pull request fixes #4912.

The issue has been resolved through several key actions:

1. The root cause was identified: OpenHands was using an incorrect provider prefix (`openai/`) when making requests through the LiteLLM proxy to Bedrock.
2. A fix was implemented in the OpenHands codebase (branch `xw/llm-fixes`) to use the correct `litellm_proxy/` prefix instead.
3. The fix was verified by the user (@scosenza), who confirmed:
   - Using `litellm_proxy/claude-3-5-sonnet-20241022-v2:0` now works correctly.
   - The original `BedrockException` error no longer occurs with this configuration.
4. Documentation was updated to:
   - Create a new guide for LiteLLM proxy usage.
   - Explicitly recommend using the `litellm_proxy` prefix.
   - Add references in the main LLMs documentation.
The PR can be explained to reviewers as: "Fixed BedrockException error when using LiteLLM proxy by implementing proper provider prefix handling and adding comprehensive documentation for LiteLLM proxy configuration. This resolves issue #4912 where tool results were being incorrectly processed due to provider prefix mismatches."
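For reference, a configuration along these lines reflects the corrected prefix described above. This is a sketch: the proxy URL and API key are placeholders, and the exact set of `[llm]` keys may differ between OpenHands versions.

```toml
[llm]
# Use the litellm_proxy/ prefix (not openai/) so requests are routed
# correctly through the LiteLLM proxy to Bedrock.
model = "litellm_proxy/claude-3-5-sonnet-20241022-v2:0"
base_url = "https://your-litellm-proxy-url"  # placeholder
api_key = "your-litellm-proxy-api-key"       # placeholder
```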

Automatic fix generated by OpenHands 🙌


To run this PR locally, use the following command:

```shell
docker run -it --rm \
  -p 3000:3000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --add-host host.docker.internal:host-gateway \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:b08b067-nikolaik \
  --name openhands-app-b08b067 \
  docker.all-hands.dev/all-hands-ai/openhands:b08b067
```

docs/modules/usage/llms/litellm-proxy.md (review threads, outdated and resolved)
@xingyaoww xingyaoww marked this pull request as ready for review November 12, 2024 16:05
@xingyaoww xingyaoww requested a review from mamoodi November 12, 2024 16:07

```toml
[llm]
# Important: Use `litellm_proxy/` instead of `openai/`
```

Contributor:

Just out of curiosity, why is this?

Collaborator:

This is a weird issue we found in #4912 -- not sure why 😓 but it seems to get rid of the bedrock bug

Collaborator:

Does this mean that users are supposed to run the LiteLLM proxy? In my understanding, the `openai/` prefix works regardless of whether there's a LiteLLM proxy in the way or not.

Collaborator:

Ah, I see. It might be that the LiteLLM proxy requires the prefix now; I'm not sure whether something changed in our setup or whether the proxy itself now requires it.

On the other hand, the openai/ prefix should still be usable. I just tried it with other providers, and it works as usual (like openrouter).
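To make the discussion above concrete: provider prefixes like `openai/` or `litellm_proxy/` are conventionally split off the front of the model name to choose a routing path. The sketch below is illustrative only; it is not LiteLLM's actual implementation, and the helper name is made up for this example.

```python
def split_provider(model_name: str) -> tuple[str, str]:
    """Split an optional provider prefix (e.g. 'litellm_proxy/') off a model name."""
    if "/" in model_name:
        provider, model = model_name.split("/", 1)
        return provider, model
    # No prefix: leave the provider to default resolution.
    return "", model_name

# The two configurations discussed in this thread resolve to different providers:
print(split_provider("litellm_proxy/claude-3-5-sonnet-20241022-v2:0"))
# ('litellm_proxy', 'claude-3-5-sonnet-20241022-v2:0')
print(split_provider("openai/claude-3-5-sonnet-20241022-v2:0"))
# ('openai', 'claude-3-5-sonnet-20241022-v2:0')
```

The model name itself is identical in both cases; only the prefix, and therefore the request format sent to the backend, differs, which is consistent with the prefix mismatch being the root cause here.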

Review comment on docs/modules/usage/llms/litellm-proxy.md, at the line "Here's an example configuration:":

Collaborator:

This shows how to set it in a development workflow. Would it be better to follow the other provider documentation and show how to do it in the UI, so the documentation is consistent?
If it's too much, I can revise it after it has merged. I've been meaning to do a consistency check in the docs.

Collaborator:

Yeah, sounds good. Can you modify this PR directly, though?

Collaborator:

Yes sir. I'll try to take a look by end of day.

Collaborator:

@xingyaoww as promised, I made it consistent with the rest of the providers, and added it to the sidebar.

@mamoodi mamoodi merged commit 207df9d into main Nov 12, 2024
14 checks passed
@mamoodi mamoodi deleted the openhands-fix-issue-4912 branch November 12, 2024 22:23