Replies: 3 comments 1 reply
-
Hey there @lalyko, have a look at the internals of hayhooks for inspiration. I'm not 100% sure what exactly you want to do, but have a look at https://github.com/deepset-ai/hayhooks/blob/main/src/hayhooks/server/app.py and how it uses the deploy_pipeline_def method to deploy your pipeline, and hook your run invocation in there.
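A rough sketch of what calling that deploy endpoint could look like from a script. The endpoint path, the default port, and the JSON field names (name, source_code) are my assumptions from reading app.py, so double-check them against the current code:

```python
import json
import urllib.request

HAYHOOKS_URL = "http://localhost:1416"  # default hayhooks address (assumption)

def build_deploy_request(name: str, yaml_source: str) -> dict:
    # The deploy endpoint appears to take the pipeline name plus the
    # serialized pipeline YAML (the output of Pipeline.dumps()).
    # Field names are assumptions -- verify against server/app.py.
    return {"name": name, "source_code": yaml_source}

def deploy(name: str, yaml_source: str) -> None:
    payload = json.dumps(build_deploy_request(name, yaml_source)).encode()
    req = urllib.request.Request(
        f"{HAYHOOKS_URL}/deploy",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)

# Example (commented out so it doesn't require a running server):
# deploy("rag_pipeline", open("rag_pipeline.yml").read())
```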
-
Hey @vblagoje, do you have an example of hayhooks with realistic invocation parameters, like the question and messages from my pipeline run example?
-
Hey @vblagoje, thanks for your response. The notebook uses NIMs, but not hayhooks ... that wasn't my question. I asked for hayhooks examples, not Haystack examples in general. I know how to build pipelines with Python scripts. But when I convert such a script with dump(), the content of the run() method is not represented in the YAML file (of course not). So have a look at apify_haystack_rag; there you find the run invocation:
For hayhooks, I know the 'question' has to be a request parameter and the pipeline run() method will be executed inside the hayhooks service. But what about the instructions inside the run() method:
How does hayhooks handle this? How can I instruct the service (hayhooks) to handle my request as I defined it in my Python script (.py)? And how can I provide more than one parameter? Is there an example for this?
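To make the question concrete, here is roughly what I imagine the server-side handler would have to do. The wrapper class below is just a stand-in I wrote for illustration, not the real hayhooks API; the fake pipeline lets the sketch run without Haystack installed:

```python
from typing import Any

class PipelineWrapperSketch:
    """Stand-in for illustration only -- not a real hayhooks base class."""

    def __init__(self, pipeline: Any) -> None:
        self.pipeline = pipeline

    def run_api(self, question: str, messages: list) -> Any:
        # This is the part that gets lost when dumping to YAML:
        # mapping the request parameters onto each component's inputs.
        payload = {
            "retriever": {"query": question},
            "prompt_builder": {
                "template_variables": {},
                "prompt_source": messages,
                "query": question,
            },
        }
        return self.pipeline.run(payload)

class FakePipeline:
    # Tiny fake so the sketch runs standalone; it just echoes its input.
    def run(self, data: dict) -> dict:
        return data

wrapper = PipelineWrapperSketch(FakePipeline())
result = wrapper.run_api("What is Haystack?", ["msg"])
```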
-
I am trying to provide a RAG pipeline as a REST service via hayhooks. Therefore I have to dump my Python file to YAML. But if I do so, the pipeline run invocation is missing. How can I provide a custom run invocation with hayhooks, like this:
rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {
            "template_variables": {},
            "prompt_source": messages,
            "query": question,
        },
    }
)
The prompt message templates are also missing!
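For context, this is the kind of setup that lives in my Python script before run() and does not survive dump(). A simplified, stdlib-only sketch (my real pipeline uses Haystack's prompt builder and message objects, which I have replaced with plain strings and dicts here):

```python
# Simplified sketch of the script-side code that produces the run()
# arguments -- none of this ends up in the YAML produced by dump().
TEMPLATE = (
    "Answer the question using the retrieved documents.\n"
    "Question: {question}"
)

def build_run_payload(question: str) -> dict:
    # The messages list stands in for the prompt message templates;
    # in the real script these are Haystack message objects.
    messages = [{"role": "user", "content": TEMPLATE}]
    return {
        "retriever": {"query": question},
        "prompt_builder": {
            "template_variables": {},
            "prompt_source": messages,
            "query": question,
        },
    }

payload = build_run_payload("What is hayhooks?")
```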