
Commit

Update modeling_llama.py
deepcs233 authored Jul 7, 2024
1 parent ae0643d commit 43fc2e9
Showing 1 changed file with 2 additions and 0 deletions.
LAVIS/lavis/models/blip2_models/modeling_llama.py: 2 additions, 0 deletions
@@ -418,6 +418,8 @@ class LlamaFlashAttention2(LlamaAttention):
     untouched. The only required change would be on the forward pass where it needs to correctly call the public API of
     flash attention and deal with padding tokens in case the input contains any of them.
     """
+    def __init__(self, config: LlamaConfig):
+        super().__init__(config)

     def forward(
         self,
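
For context, the pattern the docstring describes looks roughly like the sketch below: the flash-attention subclass reuses the projection weights defined by its parent, which is what the added super().__init__(config) call forwards the config for, and only overrides forward() to call the public flash-attn API. This is a simplified illustration, not the LAVIS/transformers code; BaseAttention and FlashAttention2 are stand-in names, and flash_attn_func is the public entry point of the flash-attn package.

import torch
from flash_attn import flash_attn_func  # assumes the flash-attn package is installed


class BaseAttention(torch.nn.Module):
    """Stand-in for LlamaAttention: owns the q/k/v/o projection weights."""

    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.q_proj = torch.nn.Linear(hidden_size, hidden_size, bias=False)
        self.k_proj = torch.nn.Linear(hidden_size, hidden_size, bias=False)
        self.v_proj = torch.nn.Linear(hidden_size, hidden_size, bias=False)
        self.o_proj = torch.nn.Linear(hidden_size, hidden_size, bias=False)


class FlashAttention2(BaseAttention):
    """Keeps the parent's weights untouched (hence the super().__init__ call)
    and only swaps the attention kernel inside forward()."""

    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__(hidden_size, num_heads)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        bsz, seq_len, _ = hidden_states.shape
        # Project and reshape to (batch, seq_len, num_heads, head_dim),
        # the layout flash_attn_func expects; inputs must be fp16/bf16 on GPU.
        q = self.q_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim)
        k = self.k_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim)
        v = self.v_proj(hidden_states).view(bsz, seq_len, self.num_heads, self.head_dim)
        attn = flash_attn_func(q, k, v, causal=True)  # causal self-attention kernel
        return self.o_proj(attn.reshape(bsz, seq_len, -1))

The real forward in modeling_llama.py additionally deals with padding tokens (flash-attn's variable-length path when an attention mask contains padding); that part is omitted from this sketch for brevity.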
