feat(construct): lambda layer #20
Conversation
Remove backup drawing
LGTM, have some questions and feedback but nothing MAJOR.
Why are we calling this a base class? It seems specific to Q&A base adapters. Is this class expected to add a summary base function as well, or should we rename it to something like ConversationalChatBase (specific to chat Q&A)?
This implementation comes almost as-is from the gen AI chatbot. It supports only chat for now, but we will expand it to other workflows, such as summarization, in a follow-up PR, where those parts might be redesigned.
The package name seems too lengthy. Maybe we can drop python/ and genai_core from it: by default all of our Lambda code is in Python, so we can skip that from the package name, and genai_core is confusing alongside common/helpers. In my view, layers/langchain-common/adapters and layers/langchain-common/utils make more sense.
The package is built as a Lambda layer, which requires the code to live under python/. The code then sits in its own directory to keep imports clean.
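As a sketch of the constraint described above: Lambda mounts layer contents under /opt and adds /opt/python to the interpreter's import path, so Python packages must sit below a top-level python/ directory in the layer archive. The subdirectory names below are taken from this discussion; the exact contents are illustrative.

```
layer.zip
└── python/                  # required prefix for Python Lambda layers
    └── genai_core/          # package directory, keeps imports namespaced
        ├── adapters/        # model adapter glue code
        └── utils/           # shared helpers
```

With this layout, functions using the layer can write `from genai_core.adapters import ...` rather than importing loose top-level modules.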
callbacks=[self.callback_handler],
)

def get_prompt(self):
To keep this open to prompts for other modalities, should we rename this method get_chat_prompt()? In the future we may add other prompts for this LLM as well. Thoughts?
Yes, in the next version this will be redesigned to address other workflows. One piece of feedback was also to support custom prompts; see the comment above.
Are we planning to implement the use cases for all the LLMs (Cohere, AI21, Claude, Llama, Titan) as well, or is this just a reference so that anyone who wants to can use the adapter? Does this have any impact on the layer size? We have a limit on Lambda size including layers.
Those adapters are the glue code for interacting with the models; they are lightweight and don't really affect the Lambda layer size. Users can inherit from the base adapter and build their own if needed. We might add a few more depending on feedback.
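A minimal sketch of the inheritance pattern described above: a lightweight base adapter holds the shared glue code, and a model-specific subclass overrides only what differs (here, the prompt template). The class and method names are illustrative, not the actual layer API.

```python
class ModelAdapter:
    """Hypothetical base adapter: shared glue for invoking a model."""

    def __init__(self, model_id: str):
        self.model_id = model_id

    def get_prompt(self) -> str:
        # Default chat prompt template; subclasses may override.
        return "Human: {input}\nAssistant:"

    def invoke(self, text: str) -> str:
        # A real adapter would call the model endpoint here; this sketch
        # only formats the prompt so the example stays self-contained.
        return self.get_prompt().format(input=text)


class ClaudeAdapter(ModelAdapter):
    """Example custom adapter overriding the prompt template."""

    def get_prompt(self) -> str:
        return "\n\nHuman: {input}\n\nAssistant:"


adapter = ClaudeAdapter(model_id="anthropic.claude-v2")
print(adapter.invoke("Hello"))
```

Because the subclass only carries a prompt template and some formatting logic, adding another adapter contributes a few kilobytes at most to the layer.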
LGTM
Fixes #
Merging lambda layer construct to main
Tested end to end in a dev account
What is this?
This merge provides:
Why does it matter?
This construct will be used by other constructs with lambda functions
What is next?
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of the project license.