Change default dropout value in documentation
The documentation says the default is 0.1, but the code sets the attention_dropout default to 0.0.
jamaliki authored Jan 13, 2023
1 parent d509832 commit 41cb909
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion flash_attn/flash_attention.py
@@ -16,7 +16,7 @@ class FlashAttention(nn.Module):
             (default: 1/sqrt(d_keys) where d_keys is computed at
             runtime)
         attention_dropout: The dropout rate to apply to the attention
-            (default: 0.1)
+            (default: 0.0)
     """
     def __init__(self, softmax_scale=None, attention_dropout=0.0, device=None, dtype=None):
         super().__init__()
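
For context, a minimal usage sketch of the default this commit documents. It assumes the flash_attn package is installed and that FlashAttention's constructor matches the signature shown in the diff above; nothing beyond that signature is taken from the source.

    from flash_attn.flash_attention import FlashAttention

    # With no arguments the module falls back to attention_dropout=0.0,
    # matching the corrected docstring (the docs previously claimed 0.1).
    attn = FlashAttention()

    # Dropout is only applied if requested explicitly, e.g. for training:
    attn_train = FlashAttention(attention_dropout=0.1)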
