Custom LLM integration with Voice Agent #1034
Replies: 3 comments
-
Thanks for asking your question. Please be sure to reply with as much detail as possible so the community can assist you efficiently.
-
Hey there! It looks like you haven't connected your GitHub account to your Deepgram account. You can do this at https://community.deepgram.com - being verified through this process will allow our team to help you in a much more streamlined fashion.
-
It looks like we're missing some important information to help debug your issue. Would you mind providing us with the following details in a reply?
-
I'm trying to integrate my custom LLM with the Deepgram Voice Agent. When using the built-in providers (OpenAI/Anthropic), everything works fine. With OpenAI, I see this response format in the WebSocket messages:
I've set up my custom LLM REST endpoint at /chat on localhost, exposed it publicly using ngrok, and configured the agent:
My LLM receives and processes requests successfully (confirmed via logs), but the responses never appear in the WebSocket messages. What's the correct response format for custom LLM integration?
My server.js file looks like this:
Sample config for the Deepgram Agent:
I am using this repository as an interface: https://github.com/deepgram-devs/deepgram-voice-agent-demo.
Environment: