From 94ceb0936cb090db3eabc8ad7e456f6b0624b6a4 Mon Sep 17 00:00:00 2001
From: Steve Farrell
Date: Mon, 27 Jun 2022 13:40:43 -0700
Subject: [PATCH] Fixed HPC HP table bug

A bug in the AsciiDoc table markup from a missing cell resulted in the
oc20 `opt_learning_rate_decay_factor` rule being hidden when rendered.
This adds the cell which was missing on the previous row: the text
description of the `opt_learning_rate_decay_boundary_steps` parameter.
---
 hpc_training_rules.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/hpc_training_rules.adoc b/hpc_training_rules.adoc
index 434aed4..0d7414d 100644
--- a/hpc_training_rules.adoc
+++ b/hpc_training_rules.adoc
@@ -99,7 +99,7 @@ Allowed hyperparameter and optimizer settings are specified here. For anything n
 |OpenCatalyst |opt_base_learning_rate |`value > 0` |the base learning rate |config setting `lr_initial`
 |OpenCatalyst |opt_learning_rate_warmup_steps |`value >= 0` |the number of steps for learning rate to warm up to base value |`warmup_steps`
 |OpenCatalyst |opt_learning_rate_warmup_factor |`0 <= value <= 1` |the factor applied to the learning rate at the start of warmup |`warmup_factor`
-|OpenCatalyst |opt_learning_rate_decay_boundary_steps |list of positive integers |`lr_milestones`
+|OpenCatalyst |opt_learning_rate_decay_boundary_steps |list of positive integers |the steps at which the learning rate is decayed |`lr_milestones`
 |OpenCatalyst |opt_learning_rate_decay_factor |`0 <= value <= 1` |the factor applied to decay the learning rate at each decay boundary step |`lr_gamma`
 |===