Conversation

@rasbt rasbt commented Sep 14, 2025

Scales the LoRA forward pass by the rank (i.e., uses alpha/rank rather than alpha alone as the scaling factor) so that the LoRA hyperparameters remain independent of the learning rate.

Fixes #821
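
A minimal sketch of the corrected scaling, not the PR's actual diff: the key line is `self.scaling = alpha / rank`, whereas the bug referenced in #821 used `alpha` alone. All class and variable names here are illustrative, and the adapter is written with NumPy for self-containment (the repository itself uses PyTorch).

```python
import numpy as np

class LoRAAdapter:
    """Illustrative LoRA adapter: effective weight is W + (alpha / rank) * B @ A."""

    def __init__(self, in_dim, out_dim, rank, alpha, seed=0):
        rng = np.random.default_rng(seed)
        # A is initialized randomly; B starts at zero so the adapter
        # contributes nothing before training (standard LoRA init).
        self.A = rng.standard_normal((rank, in_dim)) / np.sqrt(rank)
        self.B = np.zeros((out_dim, rank))
        # The fix: scale by alpha / rank, not alpha alone, so that the
        # update magnitude stays comparable across different rank choices.
        self.scaling = alpha / rank

    def forward(self, x):
        # x: (batch, in_dim) -> (batch, out_dim)
        return self.scaling * (x @ self.A.T @ self.B.T)
```

Because `B` is zero-initialized, the adapter output is exactly zero before training regardless of the scaling factor; the alpha/rank factor matters once `B` is updated, keeping the effective step size roughly constant when `rank` is changed without retuning the learning rate.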


@rasbt rasbt merged commit 8f3e5b0 into main Sep 14, 2025
12 checks passed
@rasbt rasbt deleted the lora branch September 14, 2025 16:57
Successfully merging this pull request may close these issues.

Bug[LoRA]: The rank is missing in the scaling factor of LoRA. The factor should be alpha/rank instead of alpha