ChatTTS is a text-to-speech model designed specifically for dialogue scenarios, such as conversations with an LLM assistant. This example demonstrates how to serve ChatTTS with BentoML.
See here for a full list of BentoML example projects.
git clone https://github.com/bentoml/BentoChatTTS.git
cd BentoChatTTS
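Optionally, create a virtual environment first so the project's dependencies stay isolated. The commands below are a sketch using the standard venv module; they assume a python3.11 interpreter is available on your system.
# Optional: isolate dependencies in a virtual environment (adjust the interpreter name if needed)
python3.11 -m venv venv
source venv/bin/activate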
# Python 3.11 is recommended
pip install bentoml
pip install -r requirements.txt
If it is not already present, install libsox-dev with your system package manager first.
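For example, on Debian or Ubuntu systems (package names and managers differ on other distributions):
# Debian/Ubuntu example; adjust for your distribution
sudo apt-get update
sudo apt-get install -y libsox-dev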
Start a Bento server with ChatTTS.
export CHAT_TTS_REPO=https://github.com/2noise/ChatTTS.git
bentoml serve
The server is now active at http://localhost:3000. You can interact with it using the Swagger UI.
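You can also call the service from the command line. The request below is a sketch: it assumes an endpoint named /tts that accepts a JSON body with a text field and returns WAV audio. Check the Swagger UI for the actual endpoint name and schema exposed by this service.
# Hypothetical endpoint and payload; verify against the Swagger UI at http://localhost:3000
curl -s -X POST http://localhost:3000/tts \
  -H 'Content-Type: application/json' \
  -d '{"text": "Hello from ChatTTS"}' \
  -o output.wav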
You can deploy the ChatTTS Bento service to BentoCloud.
Sign up for a BentoCloud account if you don't already have one.
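Next, log in from the BentoML CLI. The command below is the standard BentoML login flow; it authenticates your terminal with an API token created in the BentoCloud console.
# Authenticate the BentoML CLI with your BentoCloud account
bentoml cloud login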
Once you are logged in, run the following command to deploy the project.
bentoml deploy .
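After the deployment is created, you can check its status from the same CLI; the subcommand below lists the deployments in your BentoCloud account.
# List deployments and their current status
bentoml deployment list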