Rename variables #124
Conversation
Borda left a comment
I think that replacing some abbreviations with full names is better, since for example tng is not a standard shortcut and may be confused with tuning... :)
Anyway, tests are not passing and some fixes are required.
| """ | ||
| Lightning calls this inside the training loop | ||
| :param data_batch: | ||
| :param batch: |
add the variable type and a description, like:
:param tuple(ndarray,ndarray) batch: pair of data and desired labels...
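Something like the sketch below, with the parameter semantics assumed for illustration (ToyModule and the :return: line are illustrative additions, not the project's actual docstring):

```python
import numpy as np


class ToyModule:
    """Toy stand-in, only to illustrate the docstring convention (not the real base class)."""

    def training_step(self, batch, batch_idx):
        """
        Lightning calls this inside the training loop.

        :param tuple(ndarray,ndarray) batch: pair of input data and desired labels
        :param int batch_idx: index of the current batch within the epoch
        :return: scalar loss for this batch
        """
        x, y = batch
        # dummy squared-error loss, just so the example runs end to end
        return float(np.mean((x - y) ** 2))
```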
May as well use the typing module then, since this project is 3.6+ anyway and it's natively supported; adding that also improves things like autocomplete.
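For reference, a minimal sketch of what that could look like; Tuple[Tensor, Tensor] is an assumed type here, not a settled convention of the project:

```python
from typing import Tuple

import torch
from torch import Tensor


class ToyModule:
    """Toy stand-in, only to show the annotated signature."""

    def training_step(self, batch: Tuple[Tensor, Tensor], batch_idx: int) -> Tensor:
        x, y = batch
        # dummy loss so the annotated example computes something concrete
        return torch.mean((x.float() - y.float()) ** 2)
```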
Also, I think that's a separate PR, since this is about renaming, not adding type info.
```diff
 return nll

-def training_step(self, data_batch, batch_i):
+def training_step(self, batch, batch_i):
```
maybe also batch_i -> batch_idx or just idx
i -> idx is a good improvement
````diff
 ``` {.python}
-def training_step(self, data_batch, batch_nb)
+def training_step(self, batch, batch_nb)
````
in this case, I would prefer to stay with data_batch or batch_data
Let's keep these consistent. I think we agreed on batch for x_step.
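A sketch of that convention applied across the step hooks, using batch and batch_idx throughout (the class below is a toy stand-in, not the actual LightningModule interface):

```python
import torch
import torch.nn.functional as F


class ToyModule(torch.nn.Module):
    """Toy stand-in illustrating consistent argument names across the *_step hooks."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(10, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return F.cross_entropy(self.layer(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        return F.cross_entropy(self.layer(x), y)
```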
Force-pushed from 7bb2fb9 to c4ce347
@alok mind fixing these conflicts so we can merge? We'll enable this new syntax for version 0.5.0 so there's a clean version change for the potentially annoying refactoring.
- data_batch → batch
- batch_i → batch_idx
- dataloader_i → dataloader_idx
- tng → training
- training_dataloader → train_dataloader
- add_log_row_interval → row_log_interval
- gradient_clip → gradient_clip_val
- prog → progress
- tqdm_dic → tqdm_dict
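For reference, the mapping above can be applied mechanically to user code along these lines (a rough sketch; the regex rewrite is illustrative only, not an official migration tool):

```python
import re

# old name -> new name, taken verbatim from the list above
RENAMES = {
    "data_batch": "batch",
    "batch_i": "batch_idx",
    "dataloader_i": "dataloader_idx",
    "tng": "training",
    "training_dataloader": "train_dataloader",
    "add_log_row_interval": "row_log_interval",
    "gradient_clip": "gradient_clip_val",
    "prog": "progress",
    "tqdm_dic": "tqdm_dict",
}


def upgrade_source(text: str) -> str:
    """Naively rewrite old identifiers to their new names (word-boundary matches only)."""
    for old, new in RENAMES.items():
        text = re.sub(rf"\b{re.escape(old)}\b", new, text)
    return text


print(upgrade_source("def training_step(self, data_batch, batch_i):"))
# -> def training_step(self, batch, batch_idx):
```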