
TT-Inference-Server

Tenstorrent Inference Server (tt-inference-server) provides model APIs for deployment on Tenstorrent hardware.

Official Repository

https://github.com/tenstorrent/tt-inference-server

Getting Started

Please follow the setup instructions in each model folder's README.md.


Model Implementations

| Model         | Hardware                 |
|---------------|--------------------------|
| LLaMa 3.1 70B | TT-QuietBox & TT-LoudBox |
| Mistral 7B    | n150 and n300            |
