Commit 3ea7ed4

Fix line length to comply with 80 char limit
Parent: 93da29e

File tree: 1 file changed

keras/src/optimizers/schedules/learning_rate_schedule.py

Lines changed: 4 additions & 3 deletions
@@ -584,9 +584,10 @@ class CosineDecay(LearningRateSchedule):
     schedule applies a linear increase per optimizer step to our learning rate
     from `initial_learning_rate` to `warmup_target` for a duration of
     `warmup_steps`. Afterwards, it applies a cosine decay function taking our
-    learning rate from `warmup_target` to `warmup_target * alpha` for a duration of
-    `decay_steps`. If `warmup_target` is None we skip warmup and our decay
-    will take our learning rate from `initial_learning_rate` to `initial_learning_rate * alpha`.
+    learning rate from `warmup_target` to `warmup_target * alpha` for a
+    duration of `decay_steps`. If `warmup_target` is None we skip warmup and
+    our decay will take our learning rate from `initial_learning_rate` to
+    `initial_learning_rate * alpha`.
     It requires a `step` value to compute the learning rate. You can
     just pass a backend variable that you increment at each training step.

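For context, the rewrapped docstring describes how `CosineDecay` combines an optional linear warmup with a cosine decay. A minimal usage sketch, assuming the Keras 3 `keras.optimizers.schedules.CosineDecay` constructor with `warmup_target` and `warmup_steps` arguments (the values below are illustrative, not taken from this commit):

import keras

# Linear warmup from 1e-5 to 1e-3 over the first 1,000 steps, then a cosine
# decay from 1e-3 down to 1e-3 * alpha over the next 10,000 steps.
lr_schedule = keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=1e-5,
    decay_steps=10_000,
    alpha=0.1,              # final learning rate = warmup_target * alpha
    warmup_target=1e-3,
    warmup_steps=1_000,
)

# The schedule is called with the current optimizer step, so it can be
# passed directly to an optimizer or queried at a given step.
optimizer = keras.optimizers.Adam(learning_rate=lr_schedule)
print(float(lr_schedule(0)))      # ~1e-5, start of warmup
print(float(lr_schedule(1_000)))  # ~1e-3, warmup complete, decay begins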