
[CUDA] Flash attention not supported on P100/V100 GPUs #420

Open
mgiessing opened this issue Jan 31, 2025 · 0 comments

Comments


mgiessing commented Jan 31, 2025

Flash attention is not supported on GPUs with compute capability <= 7.0, so it cannot be used on Pascal (P100, CC 6.0) or Volta (V100, CC 7.0) GPUs.

Would it be possible to either make this an optional dependency or replace it with an alternative such as xformers?

Thank you!
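
For illustration, a rough (untested) sketch of the kind of fallback I have in mind. It assumes the code currently calls `flash_attn_func` directly; the exact call site, tensor layout, and the capability threshold required by the pinned flash-attn version are assumptions on my part. On older GPUs it drops down to PyTorch's built-in `scaled_dot_product_attention`:

```python
import torch
import torch.nn.functional as F

# Gate the flash-attn import on compute capability. The >= (8, 0) threshold
# (Ampere) is an assumption; recent flash-attn releases target sm80+, while
# older 1.x releases also supported Turing (7.5). Adjust to whatever the
# pinned version actually requires.
flash_attn_func = None
if torch.cuda.is_available() and torch.cuda.get_device_capability() >= (8, 0):
    try:
        from flash_attn import flash_attn_func
    except ImportError:
        pass  # flash-attn not installed; use the fallback below


def attention(q, k, v, causal=False):
    """q, k, v: (batch, seqlen, nheads, headdim), fp16/bf16 on CUDA."""
    if flash_attn_func is not None:
        return flash_attn_func(q, k, v, causal=causal)
    # Fallback for Pascal/Volta: PyTorch's scaled_dot_product_attention,
    # which dispatches to a memory-efficient kernel where available.
    # SDPA expects (batch, nheads, seqlen, headdim), so transpose around it.
    q, k, v = (x.transpose(1, 2) for x in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v, is_causal=causal)
    return out.transpose(1, 2)
```

If the pure-PyTorch path is too slow on these cards, xformers' `memory_efficient_attention` could slot into the same fallback branch instead.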
