Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
0.14.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
As shown in the image, when many nodes are configured upstream (for example, knowledge base retrieval and code execution), the output of the last LLM node switches from streaming to blocking mode by the time the process reaches it. The same applies to the API output.
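To make the API side of this reproducible, a minimal sketch like the one below can show whether chunks arrive incrementally or all at once. It assumes the documented /v1/chat-messages endpoint with response_mode set to streaming; the base URL, API key, and query are placeholders:

```python
import json
import time

import requests

# Placeholders: substitute your own self-hosted base URL and app API key.
BASE_URL = "http://localhost/v1"
API_KEY = "app-xxxxxxxx"

resp = requests.post(
    f"{BASE_URL}/chat-messages",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "inputs": {},
        "query": "Hello, can you stream this answer?",
        "response_mode": "streaming",  # explicitly request SSE streaming
        "user": "debug-user",
    },
    stream=True,  # let requests yield the SSE body incrementally
)

start = time.time()
for line in resp.iter_lines(decode_unicode=True):
    if not line or not line.startswith("data:"):
        continue
    event = json.loads(line[len("data:"):])
    # With working streaming, many `message` chunks print over time;
    # in the buggy blocking mode, everything arrives at once at the end.
    print(f"{time.time() - start:6.2f}s  {event.get('event')}")
```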
✔️ Expected Behavior
The last LLM node should keep streaming its output.
❌ Actual Behavior
The output is now in blocking mode instead of streaming.
Hey @ITMeow! 👋 I'm here to help you with any bugs, questions, or contributions you may have while waiting for a human maintainer. Let's squash those bugs together!
To maintain the streaming output format in a self-hosted Dify workflow when the configuration becomes long and reaches the last LLM node, you should add a Direct Reply Node inside the iteration node. This setup allows streaming output after each iteration, so the output stays in streaming form even when the configuration is extensive [1].
I think you misunderstood me. I said that there are many nodes in the chatflow, not long text, and I did not use iteration nodes. Up until the 0.14.0 update, the LLM output displayed well in streaming form.
After I removed the variable aggregator, the output is still blocking rather than streaming. Could you please guide me on how to modify the yml file to resolve this issue?