This repository has been archived by the owner on Sep 19, 2023. It is now read-only.
Support for serving multiple models with a single instance #108
Labels: enhancement (New feature or request)
This would provide the ability to serve multiple models, and multiple versions of each model, with a single serving instance.
Details can be seen here: https://www.tensorflow.org/tfx/serving/serving_config#model_server_configuration
In practice, this could be achieved by adding a `--model-config` option that could be used instead of the `--model` argument. Then, when calling `tensorflow_model_server` in the opennmt-tf Docker image's `entrypoint.py`, the `--model-config-file` argument could be passed instead of `--model-name` and `--model-base-path`.
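For reference, the TensorFlow Serving configuration linked above expects a `ModelServerConfig` in protobuf text format. A minimal sketch of such a file (the model names and base paths here are hypothetical):

```
model_config_list {
  config {
    name: "ende"
    base_path: "/models/ende"
    model_platform: "tensorflow"
  }
  config {
    name: "enfr"
    base_path: "/models/enfr"
    model_platform: "tensorflow"
  }
}
```

Per the serving_config documentation, `tensorflow_model_server` would then be started with `--model_config_file=<path to this file>` in place of the `--model_name` and `--model_base_path` pair, and each model could be queried by its configured name.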