Commit
Fixes token logging in callbacks when streaming=True is used. (#241)
Fixes #240
Fixes #217

### Code to verify

```python
from langchain_aws import ChatBedrock
from langchain.callbacks.base import BaseCallbackHandler
from langchain_core.prompts import ChatPromptTemplate

streaming = True


class MyCustomHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs) -> None:
        print(f"My custom handler, token: {token}")


prompt = ChatPromptTemplate.from_messages(
    ["Tell me a joke about {animal} in a few words."]
)

model = ChatBedrock(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",
    streaming=streaming,
    callbacks=[MyCustomHandler()],
)

chain = prompt | model

response = chain.invoke({"animal": "bears"})
```

### Output

```
My custom handler, token: 
My custom handler, token: Bear
My custom handler, token: -
My custom handler, token: ly funny
My custom handler, token: .
My custom handler, token: 
My custom handler, token: 
```
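For reference, the handler can also be supplied at invocation time through the runnable config instead of on the model constructor. This is a minimal sketch (assuming standard `RunnableConfig` callback propagation; it is not part of the fix itself) and should produce the same per-token callbacks once this change is in:

```python
from langchain_aws import ChatBedrock
from langchain.callbacks.base import BaseCallbackHandler
from langchain_core.prompts import ChatPromptTemplate


class MyCustomHandler(BaseCallbackHandler):
    def on_llm_new_token(self, token: str, **kwargs) -> None:
        print(f"My custom handler, token: {token}")


prompt = ChatPromptTemplate.from_messages(
    ["Tell me a joke about {animal} in a few words."]
)
model = ChatBedrock(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",
    streaming=True,
)
chain = prompt | model

# Callbacks passed via config are propagated down to the model call,
# so on_llm_new_token fires for each streamed chunk.
response = chain.invoke(
    {"animal": "bears"},
    config={"callbacks": [MyCustomHandler()]},
)
```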