
[QST] Where is FlashAttention-2 CUTLASS kernel #1838

Closed
yoon5862 opened this issue Sep 25, 2024 · 6 comments

@yoon5862

Hello, I'm studying the fused_multi_head_attention example in CUTLASS.
The CUTLASS 3.5.1 README.md says the FlashAttention-2 kernel is in CUTLASS, but the fused_multi_head_attention example is based on Meta's xFormers.
I cannot find the FlashAttention-2 CUTLASS kernels.
Are fused_multi_head_attention and FlashAttention the same thing?

@thakkarV
Collaborator

thakkarV commented Sep 25, 2024

xFormers is different from FA2 and FA3. FA2 and FA3 are downstream of CUTLASS in @tridao's FA repo itself: https://github.com/Dao-AILab/flash-attention. Both FA2 and FA3 are written using CUTLASS.
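For anyone landing here, a minimal sketch of calling those FA2 kernels from Python via the `flash_attn` package published from that repo (assuming `pip install flash-attn` on a supported GPU; fp16/bf16 CUDA tensors are required, and shapes follow the package's `(batch, seqlen, nheads, headdim)` convention):

```python
# Minimal sketch: invoking the FlashAttention-2 kernels through the
# flash_attn package from https://github.com/Dao-AILab/flash-attention.
# Assumes `pip install flash-attn` on an Sm8x+ GPU.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 16, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# causal=True applies the usual autoregressive mask
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 16, 64])
```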

@yoon5862
Author

Thank you for the reply.
FlashAttention concentrates on A100 and H100 kernels.
I'm curious whether the FlashAttention kernels are efficient on Jetson AGX or RTX-series GPUs.
If they are not efficient, do they need to be tuned for efficient inference?

Thank you.

@thakkarV
Collaborator

FA2 should work quite well on all Sm8x GPUs, which includes the RTX 3000 and RTX 4000 series. I suspect it works well on Jetson Orin too, since that is Sm8x as well. YMMV, so you should benchmark to confirm. If it is not near peak utilization, it should be quite easy to tune. Although for inference, I suspect you want flash decode instead?
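A rough sketch of such a benchmark, comparing `flash_attn_func` against PyTorch's built-in `scaled_dot_product_attention` as a baseline (sizes and iteration counts here are illustrative, not tuned):

```python
# Rough benchmark sketch for an Sm8x GPU: times flash_attn_func against
# torch.nn.functional.scaled_dot_product_attention. Sizes are illustrative.
import torch
from flash_attn import flash_attn_func

def time_fn(fn, iters=100, warmup=10):
    # CUDA events measure GPU-side time, not just kernel-launch overhead
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    for _ in range(warmup):
        fn()
    torch.cuda.synchronize()
    start.record()
    for _ in range(iters):
        fn()
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters  # ms per call

b, s, h, d = 4, 2048, 16, 64
q = torch.randn(b, s, h, d, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

ms_fa = time_fn(lambda: flash_attn_func(q, k, v, causal=True))

# SDPA expects (batch, nheads, seqlen, headdim), so transpose first
qt, kt, vt = (x.transpose(1, 2) for x in (q, k, v))
ms_sdpa = time_fn(lambda: torch.nn.functional.scaled_dot_product_attention(
    qt, kt, vt, is_causal=True))

print(f"flash_attn: {ms_fa:.3f} ms/iter, torch SDPA: {ms_sdpa:.3f} ms/iter")
```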

@yoon5862
Author

Thank you for the reply.
I will benchmark it on Sm8x-series GPUs.


This issue has been labeled inactive-30d due to no recent activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed. This issue will be labeled inactive-90d if there is no activity in the next 60 days.


This issue has been labeled inactive-90d due to no recent activity in the past 90 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed.
