Attention details about the func "eq_scaled_dot_product_attention" #796
MaXiaoSIOM asked this question in Q&A · Unanswered · 0 replies
I am wondering what the difference is between the customized attention function "eq_scaled_dot_product_attention" and the PyTorch function "F.scaled_dot_product_attention". Can I simply use the latter everywhere instead?
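For reference, here is a minimal sketch (my own, not the code from this repo, so the details of "eq_scaled_dot_product_attention" may differ) of what a plain, written-out scaled dot-product attention looks like, compared against the built-in "F.scaled_dot_product_attention", which computes the same math but may dispatch to fused kernels:

```python
import math
import torch
import torch.nn.functional as F

def manual_scaled_dot_product_attention(q, k, v, attn_mask=None):
    # Plain eager-mode scaled dot-product attention (hypothetical sketch):
    # softmax(q @ k^T / sqrt(d)) @ v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if attn_mask is not None:
        scores = scores + attn_mask
    weights = torch.softmax(scores, dim=-1)
    return weights @ v

# Compare against the built-in version on random tensors.
q = torch.randn(2, 4, 16, 32)   # (batch, heads, seq_len, head_dim)
k = torch.randn(2, 4, 16, 32)
v = torch.randn(2, 4, 16, 32)

out_manual = manual_scaled_dot_product_attention(q, k, v)
out_builtin = F.scaled_dot_product_attention(q, k, v)
print(torch.allclose(out_manual, out_builtin, atol=1e-5))
```

Is the customized version just this written-out form (e.g. so the intermediate ops are visible for export/quantization), or does it do something different?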