
[Doc] FlashInfer now supports sm75 CUDA devices. Can MLC_LLM compile and install on a 2080 Ti with set(USE_FLASHINFER ON)? #2925

Open
ZanePoe opened this issue Sep 20, 2024 · 1 comment
Labels
documentation Improvements or additions to documentation

Comments


ZanePoe commented Sep 20, 2024

📚 Documentation

Suggestion

FlashInfer now supports sm75 CUDA devices, but your docs say: “If you are using CUDA and your compute capability is above 80, then it is required to build with set(USE_FLASHINFER ON).”

Can the latest version of MLC_LLM compile and install on a 2080 Ti with set(USE_FLASHINFER ON)?
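For reference, this is roughly the config.cmake fragment I would expect such a build to need. set(USE_FLASHINFER ON) comes from the docs; the FLASHINFER_CUDA_ARCHITECTURES option and the value 75 are my assumptions for a 2080 Ti (compute capability 7.5), not something the current docs confirm:

```cmake
# Hypothetical config.cmake fragment for building MLC-LLM with FlashInfer
# on an RTX 2080 Ti (sm75). USE_CUDA / USE_FLASHINFER are documented options;
# the architecture settings below are assumptions for sm75.
set(USE_CUDA ON)
set(USE_FLASHINFER ON)
set(FLASHINFER_CUDA_ARCHITECTURES 75)
set(CMAKE_CUDA_ARCHITECTURES 75)
```

If the docs are updated for sm75, it would help to state which of these architecture settings (if any) are actually required.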

@ZanePoe ZanePoe added the documentation Improvements or additions to documentation label Sep 20, 2024
@MasterJH5574 (Member)

Thank you @ZanePoe for the request. We will bump the FlashInfer version.
