Perf: use fused Adam optimizer
caic99 authored Dec 9, 2024
1 parent d162d0b commit 1c8ccf6
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion deepmd/pt/train/training.py
@@ -579,7 +579,7 @@ def warm_up_linear(step, warmup_steps):
         # author: iProzd
         if self.opt_type == "Adam":
             self.optimizer = torch.optim.Adam(
-                self.wrapper.parameters(), lr=self.lr_exp.start_lr
+                self.wrapper.parameters(), lr=self.lr_exp.start_lr, fused=True
             )
             if optimizer_state_dict is not None and self.restart_training:
                 self.optimizer.load_state_dict(optimizer_state_dict)
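For context (not part of the commit): passing fused=True to torch.optim.Adam selects PyTorch's fused CUDA implementation, which collapses the per-parameter update math into far fewer kernel launches and typically lowers optimizer-step overhead on GPU. It requires the parameters to be floating-point CUDA tensors. Below is a minimal sketch of the same pattern with a hypothetical toy model and learning rate; the wrapper and lr_exp objects from the diff are not reproduced here.

    # Minimal sketch (assumption: toy model and lr, not the DeePMD-kit objects).
    # fused=True requires parameters on a CUDA device, so fall back to the
    # default implementation on CPU.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torch.nn.Linear(16, 1).to(device)

    optimizer = torch.optim.Adam(
        model.parameters(),
        lr=1e-3,
        fused=(device == "cuda"),  # fused kernels are CUDA-only
    )

    # A training step looks the same regardless of the implementation:
    x = torch.randn(8, 16, device=device)
    loss = model(x).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()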
