make distributed true for multiple processes (#997)
* make distributed true for multiple processes

* Update trl/trainer/ppo_trainer.py

a distributed setup should have more than one process

Co-authored-by: Younes Belkada <[email protected]>

allanj and younesbelkada authored Nov 15, 2023
1 parent e23a541 commit e140d22
Showing 1 changed file with 1 addition and 1 deletion.

trl/trainer/ppo_trainer.py
@@ -330,7 +330,7 @@ def __init__(
         # In a distributed setup, only logging needs to be performed on the main process
         # check: https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html
         # or: https://discuss.pytorch.org/t/use-distributed-data-parallel-correctly/82500/11
-        self.is_distributed = self.accelerator.distributed_type == "MULTI_GPU"
+        self.is_distributed = self.accelerator.num_processes > 1

         # init the current step
         self.current_step = 0
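The new check flags any run with more than one process, not only the plain multi-GPU (DDP) case, so other multi-process launches are also treated as distributed. Below is a minimal, illustrative sketch (not part of this commit or the TRL repo) contrasting the removed and added checks; it assumes accelerate is installed and the script is started with something like `accelerate launch --num_processes 2 contrast_checks.py`:

# contrast_checks.py -- illustrative sketch only, not from the repository.
from accelerate import Accelerator

accelerator = Accelerator()

# Old check (removed line): only true for the plain multi-GPU/DDP backend.
old_is_distributed = accelerator.distributed_type == "MULTI_GPU"

# New check (added line): true whenever more than one process is running,
# regardless of which distributed backend launched them.
new_is_distributed = accelerator.num_processes > 1

# Per the comment in PPOTrainer.__init__, log only on the main process.
if accelerator.is_main_process:
    print(f"old check: {old_is_distributed}, new check: {new_is_distributed}")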
