ROCm and Vulkan don't seem to work #2810
Comments
The Docker container with ROCm works, but that is version 0.11.1.
Hi, could you try the binary distribution (Vulkan) available at https://github.com/TabbyML/tabby/releases/tag/v0.14.0 and let me know if it works for you? If not, could you please provide the following information:
Thank you!
Without `RUST_LOG=debug` it spams these:

With `RUST_LOG=debug`:
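For anyone following along, debug logging is enabled by setting the environment variable when launching the server; a minimal sketch of the invocation (the model name here is an assumption, substitute your own):

```shell
# Run Tabby with verbose logging to see which device the
# llama-cpp-server subprocess is actually launched with.
RUST_LOG=debug tabby serve --model StarCoder-1B --device rocm
```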
I fixed this for ROCm, and it's probably the same for Vulkan: #2835. It was caused by a change to the flag names in llama-cpp-server. Try changing:

to

in
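The snippets above didn't survive extraction, but the shape of the fix described in #2835 is updating the CLI argument names Tabby passes when spawning llama-cpp-server. A hypothetical sketch of what such argument-building code might look like (the function and flag choices here are illustrative assumptions, not the actual code or flags from the PR):

```rust
// Hypothetical sketch: builds the argument list that spawning code
// might pass to a llama.cpp server binary. If upstream renames a
// flag, the string here must be updated to match, otherwise the
// option is rejected or ignored and inference falls back to CPU.
fn llama_server_args(model_path: &str, gpu_layers: u32) -> Vec<String> {
    vec![
        "--model".to_string(),
        model_path.to_string(),
        // The GPU-offload flag is the kind of name that changed;
        // "--n-gpu-layers" is used here purely for illustration.
        "--n-gpu-layers".to_string(),
        gpu_layers.to_string(),
    ]
}
```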
Describe the bug
Tabby ignores the `--device` setting and always runs on the CPU.
Information about your version
I am at tag 0.14.0 with a cherry-picked commit adding Elixir support.
Information about your GPU
Additional context
No errors, nothing. I tried running it on Vulkan as well, and the same thing happens.
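One way to confirm the CPU-fallback behavior described above is to watch GPU utilization while completions are being served; a sketch using ROCm's stock monitoring tool (assumes `rocm-smi` is on PATH, and the model name is a placeholder):

```shell
# In one terminal: start Tabby on the GPU device under test.
tabby serve --model StarCoder-1B --device rocm

# In another: watch GPU utilization. If it stays at 0% while
# completion requests are handled, inference is running on the CPU.
watch -n 1 rocm-smi --showuse
```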