feat(construct): lambda layer #20

Merged 9 commits into main from feature/lambda_layer on Oct 11, 2023
Conversation

@krokoko (Collaborator) commented Oct 10, 2023

Fixes #

Merging the Lambda layer construct to main.
Tested in a dev account (end-to-end test).

What is this?

This merge provides:

  • The Lambda layer construct (dependency + library)
  • Updates to the documentation

Why does it matter?

This construct will be used by other constructs that include Lambda functions.
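
As a rough sketch of that composition (hypothetical names throughout; this is not the merged construct's API), a consuming construct written in CDK Python could attach the shared layer like this:

    from aws_cdk import aws_lambda as lambda_
    from constructs import Construct

    class ChatFunction(Construct):
        """Hypothetical consuming construct that reuses the shared layer."""

        def __init__(self, scope: Construct, construct_id: str,
                     layer: lambda_.ILayerVersion) -> None:
            super().__init__(scope, construct_id)
            # Attach the shared layer instead of bundling the langchain
            # dependencies into every function's deployment package.
            self.handler = lambda_.Function(
                self, "Handler",
                runtime=lambda_.Runtime.PYTHON_3_11,
                handler="index.handler",
                code=lambda_.Code.from_asset("src/chat"),  # illustrative path
                layers=[layer],
            )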

What is next?

  • Adding tests, fixing bugs, updating the documentation, a security review of permissions, and a demo stack

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of the project license.

@scottschreckengaust (Collaborator) commented on $emerging_tech_cdk_constructs.drawio.bkp:

Remove the backup drawing.

@dineshSajwan (Contributor) left a comment:

LGTM. I have some questions and feedback, but nothing major.

@dineshSajwan (Contributor) commented:

Why are we calling this a base class? It seems to be specific to Q&A base adapters. Is this class expected to add a summarization base function as well, or should we rename it to something like conversationalChatBase (specific to chat Q&A)?

@krokoko (Collaborator, Author) replied:

This implementation comes almost as-is from the gen AI chatbot. It only supports chat for now, but we will expand it to other workflows such as summarization. That will happen in a follow-up PR, where these parts might be redesigned.

@dineshSajwan (Contributor) commented:

The package name seems too lengthy; maybe we can remove python/genai_core from it. By default all of our Lambdas are in Python, so we could drop that from the package name, and genai_core is easy to confuse with common/helpers. In my view, layers/langchain-common/adapters and layers/langchain-common/utils make more sense.

@krokoko (Collaborator, Author) replied:

The package is built as a Lambda layer, so the code structure has to live under python/. Below that, the code sits in its own directory to keep imports clean.
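
For background (standard AWS Lambda behavior, not specific to this PR): a layer is extracted under /opt at runtime, and the Python runtime only adds /opt/python and /opt/python/lib/pythonX.Y/site-packages to sys.path, so the top-level python/ directory is required by Lambda itself. A sketch of the resulting layout, using the package name under discussion (file names are illustrative):

    layer.zip
    └── python/
        └── genai_core/
            ├── adapters/
            │   ├── __init__.py
            │   └── base.py
            └── utils/
                └── __init__.py

A function using the layer then imports from the namespaced package, e.g. from genai_core.adapters import ..., which keeps it distinct from the function's own common/helpers code.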

callbacks=[self.callback_handler],
)

def get_prompt(self):
@dineshSajwan (Contributor) commented:

To keep this open to prompts for other modalities, should we rename this method to get_chat_prompt()? In the future we may add other prompts for this LLM as well. Thoughts?

@krokoko (Collaborator, Author) replied:

Yes, in the next version this will be redesigned to address other workflows. One piece of feedback was also to support custom prompts; see the comment above.
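
As an illustration of the custom-prompt direction (a minimal sketch only: the BedrockChatAdapter base class name and the pattern of overriding get_prompt in a subclass are assumptions, not the merged API):

    from langchain.prompts import PromptTemplate

    class CustomPromptAdapter(BedrockChatAdapter):  # hypothetical base adapter
        def get_prompt(self):
            # Replace the default chat prompt with a domain-specific template
            template = (
                "You are an assistant for internal support questions.\n\n"
                "Conversation so far:\n{chat_history}\n\n"
                "Question: {input}\n"
                "Answer:"
            )
            return PromptTemplate(
                input_variables=["chat_history", "input"],
                template=template,
            )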

@dineshSajwan (Contributor) commented:

Are we planning to implement all the LLM use cases (Cohere, AI21, Claude, Llama, Titan) as well, or is this just a reference, with the adapter available to anyone who wants to use it? Does this have any impact on layer size? We have a limit on Lambda size including layers.

@krokoko (Collaborator, Author) replied:

These adapters are the glue code for interacting with a model; they are lightweight and don't really affect the Lambda layer size. Users can inherit from the base adapter and build their own if needed. We might add a few more depending on feedback.
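
For example, a user-defined adapter could look roughly like this (a sketch assuming a ModelAdapter base class and a regex-keyed registry similar to the gen AI chatbot's; the module paths and names are assumptions):

    from genai_core.adapters.base import ModelAdapter  # assumed module path
    from genai_core.registry import registry           # assumed registry helper

    class MyEndpointAdapter(ModelAdapter):
        """Glue code only: wires a custom model endpoint into the chat flow."""

        def get_llm(self, model_kwargs=None):
            # Return the LangChain LLM wrapper for the custom endpoint here.
            raise NotImplementedError

    # Resolve this adapter whenever the requested model name matches.
    registry.register(r"^custom\.my-endpoint.*", MyEndpointAdapter)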

@scottschreckengaust (Collaborator) left a comment:

LGTM

@krokoko merged commit fda5e13 into main on Oct 11, 2023. 7 checks passed.
@krokoko deleted the feature/lambda_layer branch on October 11, 2023 at 18:11.