
When the workflow configuration becomes long, the stream output format is lost #11985

Open
ITMeow opened this issue Dec 23, 2024 · 3 comments
Labels
🐞 bug Something isn't working 🌊 feat:workflow Workflow related stuff.

Comments


ITMeow commented Dec 23, 2024

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.14.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

As the screenshot shows, when many nodes are configured upstream (for example knowledge base retrieval and code execution), the final LLM node no longer streams its output: it falls back to blocking-mode output, including in the API response.
[screenshot attached]

✔️ Expected Behavior

The last LLM node should keep streaming its output.

❌ Actual Behavior

The output is now returned in blocking mode.

@dosubot dosubot bot added 🌊 feat:workflow Workflow related stuff. 🐞 bug Something isn't working labels Dec 23, 2024
dosubot bot commented Dec 23, 2024

Hey @ITMeow! 👋 I'm here to help you with any bugs, questions, or contributions you may have while waiting for a human maintainer. Let's squash those bugs together!

To maintain the streaming output format in a self-hosted Dify workflow when the configuration becomes long and reaches the last LLM node, you should add a Direct Reply Node inside the iteration node. This setup allows for streaming output after each iteration, ensuring that the output is maintained in a streaming format even when the configuration is extensive [1].
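The bot's suggestion can be sketched in Dify's workflow DSL. This is a hypothetical fragment, not an actual export: the node ids and the exact key names (`nodes`, `data`, `type: answer`, the `{{#…#}}` variable template) follow the general shape of exported Dify YAML, but they are assumptions and should be checked against a real workflow export.

```yaml
# Hypothetical fragment of an exported Dify workflow (.yml).
# Key names are assumptions; compare against a real DSL export.
nodes:
  - id: llm_node
    data:
      type: llm
      title: LLM
  - id: answer_node
    data:
      type: answer            # the "Direct Reply" / Answer node
      title: Direct Reply
      # Reference the LLM node's text output so it is relayed
      # to the client as it streams in:
      answer: '{{#llm_node.text#}}'
```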

To continue talking to Dosu, mention @dosu.



ITMeow commented Dec 23, 2024


I think you misunderstood me. I meant that the chatflow contains many nodes, not long text, and I did not use any iteration nodes. Up until version 0.14.0, the LLM output streamed correctly.

@SingleKey

After I removed the variable aggregator, the output is still blocking rather than streaming. Could you please guide me on how to modify the workflow YAML file to resolve this issue?
