
Feature: Prompt Template Registry #334

Open
massi-ang opened this issue Jan 25, 2024 · 5 comments

Labels
enhancement New feature or request

Comments

massi-ang (Collaborator) commented Jan 25, 2024

Description:
The current solution supports fixed prompt templates defined in the LLM adapters' Lambda functions. While this provides flexibility to change the prompts, it does not easily enable experimentation.
We propose adding a separate Prompt Template Registry to store multiple versions of prompts linked to specific model adapters.
The prompt registry service needs to:

  1. store up to X versions of a prompt template set per model adapter. A prompt set is defined as one or more prompts required by the model interface to operate (idefics requires 1 prompt, langchain requires 3 prompts, etc.)
  2. maintain an editable DRAFT version
  3. OPTIONAL: create a read-only version from the current DRAFT
  4. provide backend validation of prompts based on the interface. For example, Langchain prompt templates require specific placeholders depending on the prompt type (std, qa, condense)
  5. allow users to quickly experiment with variations of prompts in a set via the UI
  6. provide validation/warnings in the UI when the entered prompt does not match what the model requires, e.g. the Human/Assistant pattern for Claude, or the [INST] pattern for Llama 2 and derivatives
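
Point 6 could start out as a set of per-model regex checks surfaced as UI warnings. A minimal sketch, assuming illustrative model IDs and patterns (the real chatbot's model identifiers and prompt conventions may differ):

```python
import re

# Illustrative model-family patterns; real model IDs and formats may differ.
MODEL_PATTERNS = {
    "anthropic.claude": re.compile(r"\bHuman:[\s\S]*\bAssistant:"),
    "meta.llama2": re.compile(r"\[INST\][\s\S]*\[/INST\]"),
}

def prompt_warnings(model_id: str, template: str) -> list:
    """Return UI warnings when a template lacks the model's expected pattern."""
    pattern = MODEL_PATTERNS.get(model_id)
    if pattern and not pattern.search(template):
        return [f"Prompt does not match the expected format for {model_id}"]
    return []
```

Models with no registered pattern would simply produce no warning, so the check stays advisory rather than blocking.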

The current functionality needs to be modified to make use of the prompt registry; in particular, the chatbot playground MUST allow the user to select a version of the prompt set compatible with the selected model. This choice MUST be persisted locally for each combination used (the default is the DRAFT prompt set).

The initial DRAFT prompt set for each model corresponds to the current prompt templates defined in the model adapters.
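
To make requirements 1 to 4 concrete, here is a rough in-memory sketch. All names are illustrative, the `required` placeholder map is a hypothetical stand-in for per-interface rules, and a real implementation would presumably back the store with a database rather than a dict:

```python
import re
from dataclasses import dataclass, field

DRAFT = "DRAFT"

@dataclass
class PromptRegistry:
    # adapter -> version -> {prompt_name: template}; in-memory stand-in
    # for whatever persistent store the real registry would use
    _store: dict = field(default_factory=dict)
    # hypothetical per-prompt placeholder requirements (requirement 4)
    required: dict = field(default_factory=lambda: {
        "qa": {"context", "question"},
        "condense": {"chat_history", "question"},
        "std": {"input", "chat_history"},
    })

    def save_draft(self, adapter, prompt_set):
        """Requirement 2: keep an editable DRAFT version per adapter."""
        for name, template in prompt_set.items():
            self.validate(name, template)
        self._store.setdefault(adapter, {})[DRAFT] = dict(prompt_set)

    def publish(self, adapter, version):
        """Requirement 3 (optional): freeze the current DRAFT as read-only."""
        self._store[adapter][version] = dict(self._store[adapter][DRAFT])

    def get(self, adapter, version=DRAFT):
        """Requirement 1: fetch a specific version of an adapter's prompt set."""
        return self._store[adapter][version]

    def validate(self, name, template):
        """Requirement 4: check interface-specific placeholders are present."""
        found = set(re.findall(r"\{(\w+)\}", template))
        missing = self.required.get(name, set()) - found
        if missing:
            raise ValueError(f"prompt '{name}' missing placeholders: {missing}")
```

Capping the number of stored versions per adapter (the "up to X" in requirement 1) would be an eviction policy on top of `publish`.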

massi-ang (Collaborator, Author) commented Jan 25, 2024

Replaces #133 and #72

massi-ang added the enhancement label on Jan 25, 2024
Rob-Powell (Contributor) commented:

This is sorely needed. @massi-ang, have you started on this yet? If not, I will take it on.

massi-ang (Collaborator, Author) commented:

We are considering integrating Promptus (https://github.com/aws-samples/promptus). Can you provide your feedback on that tool if you have time? Could you work on the integration?

Rob-Powell (Contributor) commented:

That looks great. Yes, I will take it on and start looking at pulling its functionality in here.

scottmraz commented Sep 12, 2024

Any update on Promptus support?

A quick fix would be to just use ReAct in the base adapter to start. I haven't tested it, but would this work to at least get better responses universally?

https://github.com/aws-samples/aws-genai-llm-chatbot/blob/3e3d2838385db22bf9f09b9ddefb567840a40991/lib/model-interfaces/langchain/functions/request-handler/adapters/base/base.py

    def get_prompt(self):
        # Untested sketch: the {input} placeholder and the return statement
        # below are additions so the method actually yields a usable template.
        REACT_PROMPT_TEMPLATE = """You are an AI assistant using the ReAct (Reason + Act) framework to solve tasks step by step. Follow these guidelines:

1. Thought: Analyze the task and think about how to approach it.
2. Action: Decide on an action to take based on your thought. Actions can include:
   - Search: Look up information
   - Calculate: Perform calculations
   - Ask: Request clarification from the user
3. Observation: Describe the result of your action.
4. Repeat steps 1-3 until you have enough information to provide a final answer.
5. Final Answer: Provide the solution to the task.

Always maintain a friendly and helpful demeanor. If you don't know the answer to a question, honestly admit it and explain what information you would need to provide an answer.

Current conversation:
{chat_history}

Human: {input}

Assistant:"""
        return REACT_PROMPT_TEMPLATE
