Get custom headers in ChatResponse #377
Comments
Hey! You should be able to pass in httpx client options directly into the …
Thanks! Maybe I wasn't clear about the problem – I have tried to pass options into the …
Ahhh, I see what you mean – sorry about that. Let me dig into this a bit and see if there's a good pattern we can expose. We definitely don't support this now. For my knowledge: do you need to know what nginx returns? Is this throwing an error, or is it a "good to know/nice to have"?
I would love to know the return! In this case it's a string, more precisely an internal IP and port number (one of the running Ollama servers), but I imagine there are other use cases as well. And it's not throwing an error, as I understand the …
Okay! Appreciate the explanation. Will leave this issue open for now and think about how we can possibly do this. Thank you!
I'm using an Nginx server to direct requests made with the Ollama Python library to different running Ollama servers. The Nginx server selects an Ollama server based on availability, among other factors, and sends that information in the X-Choosen-Server header. Using Python's requests library, I can receive this header in the response. However, when using Ollama, it seems like I can't. Have I overlooked something, or would it be possible to implement custom headers in the ChatResponse?