Python runtime does not work with OpenAI compatible endpoints #129

Open
injeniero opened this issue Nov 11, 2024 · 6 comments

@injeniero

I have been debugging it and it seems to be broken. These are the issues I found:

@injeniero
Author

cc @sethjuarez. We are using prompty pointed at Groq, and it works fine in the VS Code extension, but it doesn't work with the Python runtime.
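
For context, a minimal sketch of how we call the runtime; the file name and question are illustrative, and I may be misremembering the exact execute keyword arguments, so treat this as a sketch rather than our exact code:

import prompty
import prompty.openai  # registers OpenAIExecutor / OpenAIProcessor

# basic.prompty declares model.configuration with type: openai and an
# OpenAI-compatible base_url (Groq in our case)
response = prompty.execute("./basic.prompty", inputs={"question": "why is the sky blue?"})
print(response)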

@injeniero
Author

I got it to work with all these changes:

diff --git a/runtime/prompty/prompty/cli.py b/runtime/prompty/prompty/cli.py
index 3f449d8..10e7862 100644
--- a/runtime/prompty/prompty/cli.py
+++ b/runtime/prompty/prompty/cli.py
@@ -32,6 +32,8 @@ def dynamic_import(module: str):
         t = "prompty.azure"
     elif module == "serverless":
         t = "prompty.serverless"
+    elif module == "openai":
+        t = "prompty.openai"
     else:
         t = module
diff --git a/runtime/prompty/prompty/openai/__init__.py b/runtime/prompty/prompty/openai/__init__.py
index 57607a4..d7cd6cf 100644
--- a/runtime/prompty/prompty/openai/__init__.py
+++ b/runtime/prompty/prompty/openai/__init__.py
@@ -4,7 +4,7 @@ from prompty.invoker import InvokerException
 try:
     from .executor import OpenAIExecutor
     from .processor import OpenAIProcessor
-except ImportError:
+except ImportError as e:
     raise InvokerException(
-        "Error registering OpenAIExecutor and OpenAIProcessor", "openai"
+        f"Error registering OpenAIExecutor and OpenAIProcessor: {e}", "openai"
     )
diff --git a/runtime/prompty/prompty/openai/executor.py b/runtime/prompty/prompty/openai/executor.py
index 1b8f79a..003472c 100644
--- a/runtime/prompty/prompty/openai/executor.py
+++ b/runtime/prompty/prompty/openai/executor.py
@@ -18,12 +18,12 @@ class OpenAIExecutor(Invoker):
         self.kwargs = {
             key: value
             for key, value in self.prompty.model.configuration.items()
-            if key != "type"
+            if key != "type" and key != "name"
         }
 
         self.api = self.prompty.model.api
-        self.deployment = self.prompty.model.configuration["azure_deployment"]
         self.parameters = self.prompty.model.parameters
+        self.model = self.prompty.model.configuration["name"]
 
     def invoke(self, data: any) -> any:
         """Invoke the OpenAI API
@@ -59,7 +59,7 @@ class OpenAIExecutor(Invoker):
             if self.api == "chat":
                 trace("signature", "OpenAI.chat.completions.create")
                 args = {
-                    "model": self.deployment,
+                    "model": self.model,
                     "messages": data if isinstance(data, list) else [data],
                     **self.parameters,
                 }
diff --git a/runtime/prompty/prompty/utils.py b/runtime/prompty/prompty/utils.py
index 2935b87..8477329 100644
--- a/runtime/prompty/prompty/utils.py
+++ b/runtime/prompty/prompty/utils.py
@@ -29,7 +29,7 @@ async def load_json_async(file_path, encoding='utf-8'):
     return json.loads(content)
 
 def _find_global_config(prompty_path: Path = Path.cwd()) -> Path:
-    prompty_config = list(Path.cwd().glob("**/prompty.json"))
+    prompty_config = list(prompty_path.glob("**/prompty.json"))
 
     if len(prompty_config) > 0:
         return sorted(
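
For reviewers, here is roughly what the patched executor ends up doing for an OpenAI-compatible endpoint. This is only a sketch: the base_url, api_key, and model name below are placeholders, not values from the issue.

from openai import OpenAI

configuration = {
    "type": "openai",
    "name": "llama-3.1-70b-versatile",             # read into self.model by the patch
    "base_url": "https://api.groq.com/openai/v1",  # any OpenAI-compatible endpoint
    "api_key": "...",
}

# the patched filter drops both "type" and "name" before building the client
kwargs = {k: v for k, v in configuration.items() if k not in ("type", "name")}
client = OpenAI(**kwargs)

response = client.chat.completions.create(
    model=configuration["name"],  # previously configuration["azure_deployment"]
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)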

@sethjuarez
Member

OOF - I need to fix this!

@sethjuarez sethjuarez self-assigned this Nov 13, 2024
@sethjuarez sethjuarez added the bug Something isn't working label Nov 13, 2024
@InTheCloudDan

InTheCloudDan commented Nov 17, 2024

I'm running into a similar problem trying to use headless with an OpenAI endpoint. I tried to make some of the changes suggested here, but I'm running into:

  File "/Users/daniel/tempProj/prompty-example/main.py", line 13, in <module>
    prompt = prompty.headless(
  File "/Users/daniel/Library/Python/3.10/lib/python/site-packages/prompty/tracer.py", line 155, in wrapper
    inputs = _inputs(func, args, kwargs)
  File "/Users/daniel/Library/Python/3.10/lib/python/site-packages/prompty/tracer.py", line 118, in _inputs
    ba = inspect.signature(func).bind(*args, **kwargs)
  File "/Users/daniel/.pyenv/versions/3.10.13/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/inspect.py", line 3186, in bind
    return self._bind(args, kwargs)
  File "/Users/daniel/.pyenv/versions/3.10.13/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/inspect.py", line 3175, in _bind
    raise TypeError(
TypeError: got an unexpected keyword argument 'model'

@injeniero
Author

I saw that error as well; it was fixed by the changes I made to prompty/openai/executor.py. Please double-check.
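
For reference, the headless call shape that works for me after the executor patch looks roughly like this. I'm going from memory of prompty.headless's keyword arguments, and the endpoint and model values are placeholders, so treat it as a sketch; the point is that the model name goes inside configuration as "name" rather than being passed as a top-level model= keyword:

import prompty
import prompty.openai

p = prompty.headless(
    api="chat",
    configuration={
        "type": "openai",
        "name": "llama-3.1-70b-versatile",  # the patched executor reads configuration["name"]
        "base_url": "https://api.groq.com/openai/v1",
        "api_key": "...",
    },
    content="why is the sky blue?",
)
response = prompty.execute(p)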

@InTheCloudDan

@injeniero thanks, going through it again and fixing it up did solve it!
