
implement a non-Cuda flash attention module #33

Open
by321 opened this issue May 14, 2023 · 1 comment
Comments


by321 commented May 14, 2023

The current flash attention module by Hazy Research is CUDA-only, which limits this repo to CUDA as well. I suggest writing a separate flash attention module for computers without an Nvidia video card; the current module can still be used when an Nvidia card is present.

A couple of people have raised this with Hazy Research, but they said they are focused on CUDA only and are not interested in writing a non-CUDA version.
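For reference, the simplest fallback would just compute standard scaled dot-product attention in plain PyTorch, with no fused CUDA kernel. The sketch below assumes the repo's attention layers are PyTorch modules; `plain_attention` is a hypothetical name, not something that exists in the repo:

```python
import math
import torch
import torch.nn.functional as F

def plain_attention(q, k, v, causal=False):
    # q, k, v: (batch, heads, seq_len, head_dim)
    scale = 1.0 / math.sqrt(q.size(-1))
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    if causal:
        seq_len = q.size(-2)
        # mask out positions above the diagonal so each token only attends to the past
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=q.device), diagonal=1
        )
        scores = scores.masked_fill(mask, float("-inf"))
    attn = F.softmax(scores, dim=-1)
    return torch.matmul(attn, v)
```

This uses O(seq_len²) memory for the score matrix, unlike flash attention, but it runs on any device PyTorch supports.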

ayushtues (Contributor) commented

Added a PR #37 to avoid using flash attention.
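One possible way to make that switch at runtime (only a sketch, assuming flash-attn v2's `flash_attn_func` and the hypothetical `plain_attention` above; the exact import path varies across flash-attn versions):

```python
import torch

try:
    # flash-attn only builds against CUDA; treat a failed import as "unavailable"
    from flash_attn import flash_attn_func
    FLASH_AVAILABLE = torch.cuda.is_available()
except ImportError:
    FLASH_AVAILABLE = False

def attention(q, k, v, causal=False):
    # q, k, v: (batch, heads, seq_len, head_dim)
    if FLASH_AVAILABLE and q.is_cuda and q.dtype in (torch.float16, torch.bfloat16):
        # flash_attn_func expects (batch, seq_len, heads, head_dim), so transpose in and out
        out = flash_attn_func(
            q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2), causal=causal
        )
        return out.transpose(1, 2)
    # CPU / non-Nvidia path: fall back to the plain PyTorch attention above
    return plain_attention(q, k, v, causal=causal)
```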
