Allow using/configuring stop sequences #366
alexcontor
started this conversation in
Feature Requests
Stop sequences are used to make the model stop generating tokens at a desired point, such as the end of a sentence or a list. It would be very useful if we could define an array of possible stop sequences to force the model to stop generating. This is especially useful when using Chat Prompt Templates, because some LLMs tend to continue the conversation beyond their assigned role, appending extra, undesired messages.
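A minimal, self-contained sketch of the behavior being requested, assuming nothing about this project's API: it truncates generated text at the earliest occurrence of any stop sequence. The function name `apply_stop_sequences` is illustrative only; a real implementation would ideally stop generation at the token level on the backend rather than trimming the output afterwards.

```python
def apply_stop_sequences(text: str, stop_sequences: list[str]) -> str:
    """Return `text` cut off at the first occurrence of any stop sequence."""
    cut = len(text)
    for stop in stop_sequences:
        idx = text.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

# Example: the model keeps role-playing the next turn of the chat;
# the stop sequence trims the extra, undesired message.
generated = "Paris is the capital of France.\nUser: What about Spain?"
print(apply_stop_sequences(generated, ["\nUser:", "\nAssistant:"]))
# -> "Paris is the capital of France."
```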