AutoGen with Mistral/Pixtral 12B model hosted using vLLM #4790
ajithkumarkaliyappan asked this question in Q&A (unanswered)
I'm trying to run `user.a_initiate_chats()` with an LLM (the Mistral/Pixtral 12B model) hosted using vLLM. I'm getting an error even though I'm using a chat template. The following is the error message:
```
raise jinja2.exceptions.TemplateError(message)
| jinja2.exceptions.TemplateError: Conversation roles must alternate user/assistant/user/assistant/...
```
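For context, Mistral-family chat templates enforce strict user/assistant alternation, and AutoGen conversations can emit consecutive same-role turns (or a mid-chat system message), which is what trips this check. Below is a minimal sketch of one possible workaround: a message hook that merges consecutive same-role turns before the request reaches vLLM. It assumes pyautogen 0.2 (the `register_hook` API and the `process_all_messages_before_reply` hook name), vLLM's OpenAI-compatible server at `http://localhost:8000/v1`, and a hypothetical served model name `mistralai/Pixtral-12B-2409` — adjust all three to your setup.

```python
# A sketch of a possible workaround, not a confirmed fix. Assumptions:
# pyautogen 0.2, vLLM's OpenAI-compatible server at http://localhost:8000/v1,
# and a hypothetical served model name "mistralai/Pixtral-12B-2409".
import autogen


def enforce_alternation(messages):
    """Merge consecutive same-role messages (and downgrade stray system
    turns to user turns) so the Mistral template's strict
    user/assistant alternation check passes."""
    merged = []
    for msg in messages:
        role = msg.get("role", "user")
        if role == "system":
            role = "user"  # Mistral templates often reject mid-chat system turns
        content = msg.get("content") or ""
        # Note: this assumes plain-string content; Pixtral image inputs
        # arrive as lists of parts and would need list-aware merging.
        if merged and merged[-1]["role"] == role:
            merged[-1]["content"] += "\n" + content  # collapse the repeated role
        else:
            merged.append({"role": role, "content": content})
    return merged


llm_config = {
    "config_list": [
        {
            "model": "mistralai/Pixtral-12B-2409",   # assumed served model name
            "base_url": "http://localhost:8000/v1",  # vLLM OpenAI-compatible API
            "api_key": "not-needed",                 # vLLM ignores the key by default
        }
    ]
}

assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
user = autogen.UserProxyAgent(
    "user", human_input_mode="NEVER", code_execution_config=False
)

# Hook name taken from pyautogen 0.2's ConversableAgent hookable methods;
# verify it exists in your installed version.
assistant.register_hook("process_all_messages_before_reply", enforce_alternation)
```

Alternatively, vLLM accepts a custom `--chat-template` at server startup, so another option is to serve the model with a more permissive template that merges consecutive roles instead of raising.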