I am testing SagemakerEndpoint (Llama 3 Instruct) with RunnableWithMessageHistory. This endpoint uses a specific prompt template, and the prompt generated by RunnableWithMessageHistory is broken because the messages injected from the history don't follow that formatting.
Code example:
prompt_data = """<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system}<|eot_id|>{chat_history}<|start_header_id|>user<|end_header_id|>
{input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
"""
prompt_template = PromptTemplate(template=prompt_data)

history = StreamlitChatMessageHistory(key="chat_history")

class ContentHandler(LLMContentHandler):
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: Dict) -> bytes:
        input_str = json.dumps({"inputs": prompt, "parameters": model_kwargs})
        return input_str.encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        response_json = json.loads(output.read().decode("utf-8"))
        return response_json[0]["generated_text"]

llm = SagemakerEndpoint(...)

chain = prompt_template | llm
wrapped_chain = RunnableWithMessageHistory(
    chain,
    lambda session_id: history,
    history_messages_key="chat_history",
)

response = wrapped_chain.invoke(
    {"input": "What can you do?", "system": "You are a good assistant."},
    config,
)
The output of the call when invoked is:
<|begin_of_text|>
<|start_header_id|>system<|end_header_id|>
You are a good assistant.
<|eot_id|>
[AIMessage(content='How can I help you?')]
<|start_header_id|>user<|end_header_id|>
What can you do?
<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
As you can see, the issue is that the history messages are not formatted: they are inserted as the raw Python repr ([AIMessage(content='How can I help you?')]) instead of as Llama 3 turns.
Is there a way to format the history messages?
I tried:
1. Doing the formatting in the function that provides the history — I was not able to.
2. Doing the formatting in ContentHandler.transform_input — but by that point the prompt is already a single string, which is a mixed bag as you can see.
3. Building the template with ChatPromptTemplate.from_messages and trying again — again I end up with a string, which is not the best to handle as it can contain a lot of bad things.
4. Reusing langchain_aws.chat_models.bedrock.convert_messages_to_prompt_llama3 — this failed.
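For reference, the formatting I am after would look roughly like this. This is only a minimal sketch: format_llama3_history is my own helper name, and Message is a stand-in for LangChain's HumanMessage/AIMessage, which expose the same .type/.content attributes.

```python
from dataclasses import dataclass

# Stand-in for LangChain's BaseMessage; real code would receive
# HumanMessage ("human") / AIMessage ("ai") instances instead.
@dataclass
class Message:
    type: str      # "human" or "ai"
    content: str

def format_llama3_history(messages):
    """Render history messages as Llama 3 Instruct turns (a sketch)."""
    parts = []
    for m in messages:
        role = "user" if m.type == "human" else "assistant"
        parts.append(
            f"<|start_header_id|>{role}<|end_header_id|>\n{m.content}\n<|eot_id|>"
        )
    return "\n".join(parts)

print(format_llama3_history([Message("ai", "How can I help you?")]))
```

If something like this is the intended approach, I could apply it to the chat_history value before it reaches the PromptTemplate (e.g. in a small runnable inserted ahead of the prompt), but I don't know whether that is the supported way.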
Any suggestion or explanation about this issue is more than welcome.
All best
Bojan