
Commit 61cf102
bug fix for using return_layernorm_output=True (#1382)
The current implementation would release the LayerNorm output, leading to an error when setting `return_layernorm_output=True`.

Signed-off-by: Liyuan Liu <[email protected]>
Co-authored-by: Tim Moon <[email protected]>
LiyuanLucasLiu and timmoon10 authored Jan 8, 2025
1 parent b898cbe commit 61cf102
Showing 1 changed file with 1 addition and 1 deletion.
transformer_engine/pytorch/module/layernorm_mlp.py (1 addition, 1 deletion)

@@ -373,7 +373,7 @@ def forward(
             ub=ub_obj_lnout if ub_overlap_ag else None,
             extra_output_tensor=ln_out if ub_overlap_ag else None,
         )
-        if not is_grad_enabled:
+        if not is_grad_enabled and not return_layernorm_output:
             clear_tensor_data(ln_out_total)

         if bias_gelu_nvfusion:
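
For context, here is a minimal, self-contained sketch (not Transformer Engine's actual code) of the pattern this one-line change fixes: the LayerNorm output is freed to save activation memory, but the old guard freed it even when the caller had asked for it back. `layernorm_mlp_sketch` and the stand-in `clear_tensor_data` below are illustrative; only the guard condition mirrors the diff.

```python
import torch


def clear_tensor_data(*tensors):
    # Stand-in for the library's helper: drop each tensor's storage so the
    # activation memory can be reclaimed.
    for t in tensors:
        t.data = torch.Tensor()


def layernorm_mlp_sketch(x, *, is_grad_enabled, return_layernorm_output):
    # LayerNorm output that the rest of the block consumes.
    ln_out = torch.nn.functional.layer_norm(x, x.shape[-1:])
    mlp_out = torch.relu(ln_out)  # placeholder for the actual MLP math

    # Before the fix the guard was just `if not is_grad_enabled:`, so the
    # LayerNorm output was freed even with return_layernorm_output=True and
    # the caller got back an empty, invalid tensor.
    if not is_grad_enabled and not return_layernorm_output:
        clear_tensor_data(ln_out)

    return (mlp_out, ln_out) if return_layernorm_output else mlp_out


# Usage: with the corrected guard the returned LayerNorm output stays intact.
x = torch.randn(4, 16)
out, ln = layernorm_mlp_sketch(x, is_grad_enabled=False, return_layernorm_output=True)
assert ln.shape == x.shape
```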
