Could one endpoint provide multiple ML services? #4277
Replies: 2 comments 1 reply
-
Closed this because of duplication.
-
Hey, any updates on this?
-
Hello!
I want to deploy multiple machine learning models with BentoML, but I want to serve them from the same endpoint. That is, one endpoint should provide multiple independent ML services.
For example, I want to deploy model1 and model2 on the same server, with model1 served at "localhost:5000/model1" and model2 at "localhost:5000/model2".
As far as I know, I could make model1 use "localhost:5000/model1" and model2 use "localhost:5001/model2", but that's not what I want.
Can BentoML do this?
Thanks for the answer!
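For readers landing here, this is a minimal, framework-agnostic sketch of the pattern being asked about: one server process on one port, with a route table mapping paths to independent models. The model classes and route names below are invented for illustration; they are not BentoML APIs.

```python
# Sketch: one "endpoint" (one host:port), multiple routes,
# each route backed by its own independent model.
# Model1/Model2 and the ROUTES table are hypothetical stand-ins.

class Model1:
    def predict(self, x):
        return {"model": "model1", "result": x * 2}

class Model2:
    def predict(self, x):
        return {"model": "model2", "result": x + 100}

# Route table: URL path -> model instance. A serving framework
# builds an equivalent mapping when you register multiple APIs.
ROUTES = {
    "/model1": Model1(),
    "/model2": Model2(),
}

def handle(path, payload):
    """Dispatch a request arriving on a single port to the right model."""
    model = ROUTES.get(path)
    if model is None:
        return {"error": f"no service at {path}"}
    return model.predict(payload)

print(handle("/model1", 3))  # -> {'model': 'model1', 'result': 6}
print(handle("/model2", 3))  # -> {'model': 'model2', 'result': 103}
```

As far as I understand, BentoML 1.x supports this pattern directly: you can define several `@svc.api`-decorated functions on a single `bentoml.Service`, each with its own `route`, so both models are served from one port. Check the BentoML docs for the exact signatures, since this may vary by version.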