
Fixed LLMCustom setup #418

Merged

Conversation

nicola-corbellini
Member

Description

This PR fixes a couple of issues related to using a custom LLM.

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas

@@ -46,7 +46,7 @@ class LLMCustomConfig(LLMSettings):
     @classmethod
     def get_llm_from_config(cls, config):
         # options are inserted as a string in the admin
-        if type(config["options"]) == str:
+        if isinstance(config["options"], dict):
Member

Here we were checking for a string because there is a free-text input box in the admin (where people can write a dictionary). That dictionary is later parsed with json.loads.
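A minimal sketch of the flow the reviewer describes: the admin UI submits "options" as a free-text string, so the config code must check for a string and parse it with json.loads before use. The function name and config shape below are illustrative assumptions, not the actual cheshire-cat-ai code.

```python
import json

def normalize_options(config: dict) -> dict:
    # The admin's free-text box yields a JSON string, e.g. '{"temperature": 0.7}'.
    # Only parse when it is still a string; a dict means it was already parsed.
    if isinstance(config["options"], str):
        config["options"] = json.loads(config["options"])
    return config

# Example: string coming from the admin input box
config = normalize_options({"options": '{"temperature": 0.7}'})
```

This is why the original check for `str` was intentional: replacing it with an `isinstance(..., dict)` check would skip the parsing step entirely.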

Member Author

Ah, I had misunderstood that; I'm correcting it.

Also, I think I slipped and branched off main rather than develop. It's better if I open a new PR, isn't it?

Member

They should be aligned apart from the README; don't worry, in this case nothing happened.

@nicola-corbellini nicola-corbellini marked this pull request as draft August 18, 2023 10:18
@pieroit pieroit marked this pull request as ready for review August 18, 2023 11:26
@pieroit pieroit merged commit 78ec23e into cheshire-cat-ai:develop Aug 18, 2023
2 checks passed
@nicola-corbellini nicola-corbellini deleted the feature/custom_llm_config branch August 18, 2023 13:35
2 participants