
LiteLLM support and reference config #480

Open
rcarmo opened this issue Nov 22, 2024 · 1 comment
Labels
enhancement New feature or request

Comments


rcarmo commented Nov 22, 2024

Is your feature request related to a problem? Please describe.
I would like to be able to use LiteLLM instead of direct OpenAI or Ollama configs.

Describe the solution you'd like
I would like an easy way to set a LiteLLM endpoint and API key, and to pick which models to use.
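
For illustration, a minimal sketch of what that could look like if the app called LiteLLM's Python client directly. The variable names are hypothetical, not an existing config interface in this project:

```python
import litellm

# Hypothetical settings a LiteLLM config section could expose; the names
# are illustrative only, not an existing interface in this project.
LITELLM_API_BASE = "http://localhost:4000"  # LiteLLM proxy endpoint
LITELLM_API_KEY = "sk-example"              # key issued by the proxy
LITELLM_MODEL = "openai/gpt-4o"             # provider-prefixed model name

# litellm.completion mirrors the OpenAI chat completions interface,
# so existing call sites would need few changes.
response = litellm.completion(
    model=LITELLM_MODEL,
    messages=[{"role": "user", "content": "Hello"}],
    api_base=LITELLM_API_BASE,
    api_key=LITELLM_API_KEY,
)
print(response.choices[0].message.content)
```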

Describe alternatives you've considered
Hacking this myself by overriding the OpenAI endpoint config, which is confusing because the app cannot be configured solely via environment variables and the settings are inconsistent.
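
For reference, the workaround amounts to something like the sketch below, since a LiteLLM proxy exposes an OpenAI-compatible /v1 API. The URL and key are placeholders:

```python
from openai import OpenAI

# Workaround sketch: reuse the existing OpenAI client, but point it at a
# LiteLLM proxy, which speaks the OpenAI-compatible /v1 API.
client = OpenAI(
    base_url="http://localhost:4000/v1",  # placeholder proxy address
    api_key="sk-litellm-example",         # placeholder proxy key
)

response = client.chat.completions.create(
    model="gpt-4o",  # any model name the proxy is configured to route
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```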

rcarmo added the enhancement label on Nov 22, 2024
satonotdead commented

I like it; this should enable OpenRouter and more options as well.
