
Fixed HPC HP table bug which hid OC20 LR decay factor rule #495

Merged (1 commit), Jul 13, 2022
2 changes: 1 addition & 1 deletion hpc_training_rules.adoc
@@ -99,7 +99,7 @@ Allowed hyperparameter and optimizer settings are specified here. For anything n
 |OpenCatalyst |opt_base_learning_rate |`value > 0` |the base learning rate |config setting `lr_initial`
 |OpenCatalyst |opt_learning_rate_warmup_steps |`value >= 0` |the number of steps for learning rate to warm up to base value |`warmup_steps`
 |OpenCatalyst |opt_learning_rate_warmup_factor |`0 <= value <= 1` |the factor applied to the learning rate at the start of warmup |`warmup_factor`
-|OpenCatalyst |opt_learning_rate_decay_boundary_steps |list of positive integers |`lr_milestones`
+|OpenCatalyst |opt_learning_rate_decay_boundary_steps |list of positive integers |the steps at which learning rate is decayed |`lr_milestones`
 |OpenCatalyst |opt_learning_rate_decay_factor |`0 <= value <= 1` |the factor applied to decay the learning rate at each decay boundary step |`lr_gamma`
 |===
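Taken together, these hyperparameters describe a piecewise learning-rate schedule: a linear warmup from `warmup_factor * lr_initial` up to `lr_initial` over `warmup_steps` steps, followed by multiplying the rate by `lr_gamma` at each boundary in `lr_milestones`. The sketch below illustrates that schedule; it is a minimal illustration assuming the common linear-warmup/step-decay convention, and the function name and signature are hypothetical, not part of the rules or the OC20 config.

```python
def learning_rate(step, lr_initial, warmup_steps, warmup_factor, lr_milestones, lr_gamma):
    """Illustrative LR schedule: linear warmup, then stepwise decay at milestones.

    Parameter names mirror the config settings in the table above
    (`lr_initial`, `warmup_steps`, `warmup_factor`, `lr_milestones`, `lr_gamma`).
    """
    if warmup_steps > 0 and step < warmup_steps:
        # Ramp linearly from warmup_factor * lr_initial to lr_initial.
        alpha = step / warmup_steps
        return lr_initial * (warmup_factor * (1 - alpha) + alpha)
    # After warmup, apply lr_gamma once for each milestone already passed.
    passed = sum(1 for m in lr_milestones if step >= m)
    return lr_initial * (lr_gamma ** passed)
```

For example, with `lr_initial=0.001`, `warmup_factor=0.2`, `warmup_steps=100`, `lr_milestones=[1000, 2000]`, and `lr_gamma=0.1`, the rate starts at 0.0002, reaches 0.001 at step 100, and drops to 0.0001 after step 1000.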
