Hi! Is aider's prompt caching support hardcoded to work only with the direct Anthropic API?

I'm using Claude via Vertex AI, and the `prompt cache` label isn't showing up even though Vertex AI does support prompt caching for Anthropic models: https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/claude-prompt-caching
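As a sanity check, prompt caching can be exercised directly through litellm (the client library aider uses), outside of aider itself. Below is a minimal sketch, assuming the same VERTEXAI_* environment variables as the command further down; the padded system prompt is just a placeholder to get past Anthropic's minimum cacheable prompt length:

```python
# Minimal sketch (not aider's code path): request a completion from the
# Vertex-hosted Claude model with an explicitly cache-marked system prompt.
import litellm

response = litellm.completion(
    model="vertex_ai/claude-3-5-sonnet-v2@20241022",
    messages=[
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    # Placeholder prompt, repeated to exceed the ~1024-token
                    # minimum Anthropic requires for a cacheable block.
                    "text": "You are a careful coding assistant. " * 300,
                    "cache_control": {"type": "ephemeral"},  # mark block cacheable
                }
            ],
        },
        {"role": "user", "content": "Hello"},
    ],
)

# If caching took effect, the usage block should report cache creation/read
# token counts (cache_creation_input_tokens / cache_read_input_tokens).
print(response.usage)
```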
The command I'm running:

```
VERTEXAI_PROJECT=redacted VERTEXAI_LOCATION=redacted aider --model vertex_ai/claude-3-5-sonnet-v2@20241022 --weak-model vertex_ai/gemini-1.5-flash-002 --dark-mode --cache-prompts --no-auto-commits
```

Version and model info:

```
Aider v0.69.1
Main model: vertex_ai/claude-3-5-sonnet-v2@20241022 with diff edit format, infinite output
Weak model: vertex_ai/gemini-1.5-flash-002
Git repo: .git with 68 files
Repo-map: using 1024 tokens, files refresh
```
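If the gate turns out to be aider's per-model settings rather than the provider itself, a possible (untested) workaround would be overriding those settings with a `.aider.model.settings.yml` file. This is a hedged sketch; `cache_control` is my guess at the flag that enables the cache markers for this model:

```yaml
# .aider.model.settings.yml — untested sketch; assumes cache_control is the
# setting that gates prompt caching for this model in aider.
- name: vertex_ai/claude-3-5-sonnet-v2@20241022
  edit_format: diff
  cache_control: true
```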