keras/src/optimizers/schedules (1 file changed, +4 -3 lines)

@@ -584,9 +584,10 @@ class CosineDecay(LearningRateSchedule):
 schedule applies a linear increase per optimizer step to our learning rate
 from `initial_learning_rate` to `warmup_target` for a duration of
 `warmup_steps`. Afterwards, it applies a cosine decay function taking our
-learning rate from `warmup_target` to `warmup_target * alpha` for a duration of
-`decay_steps`. If `warmup_target` is None we skip warmup and our decay
-will take our learning rate from `initial_learning_rate` to `initial_learning_rate * alpha`.
+learning rate from `warmup_target` to `warmup_target * alpha` for a
+duration of `decay_steps`. If `warmup_target` is None we skip warmup and
+our decay will take our learning rate from `initial_learning_rate` to
+`initial_learning_rate * alpha`.
 It requires a `step` value to compute the learning rate. You can
 just pass a backend variable that you increment at each training step.

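For context, a minimal usage sketch of the behavior the docstring describes, assuming the public `keras.optimizers.schedules.CosineDecay` signature with `warmup_target` and `warmup_steps`; the concrete values below are illustrative and not part of this change.

```python
import keras

# Linear warmup from `initial_learning_rate` to `warmup_target` over
# `warmup_steps`, then cosine decay from `warmup_target` down to
# `warmup_target * alpha` over `decay_steps`.
schedule = keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=0.0,  # warmup starts here
    warmup_target=1e-3,         # reached after `warmup_steps`
    warmup_steps=1_000,
    decay_steps=10_000,
    alpha=0.1,                  # final learning rate: 1e-3 * 0.1 = 1e-4
)

# The schedule is callable with the current optimizer step...
print(schedule(500))            # mid-warmup, roughly 5e-4

# ...or passed directly to an optimizer, which increments the step for you.
optimizer = keras.optimizers.SGD(learning_rate=schedule)
```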