
Conversation

@Borda (Collaborator) commented Dec 2, 2019

What does this PR do?

Partially fixes #534.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

@Borda (Collaborator, Author) commented Dec 2, 2019

@elliotwaite could you have a look at these proposed changes?

@williamFalcon (Contributor) commented:

@Borda all of these changes need to be backward compatible though

@Borda changed the title from "rename nb -> num" to "rename nb -> num [wip]" on Dec 2, 2019
@Borda (Collaborator, Author) commented Dec 2, 2019

For that, I am not sure how to handle multiple methods with the same name that differ only in a parameter name... keeping the old parameter name next to the new one will crash, while passing the parameters by position still works... :/

https://stackoverflow.com/questions/5079609/methods-with-the-same-name-in-one-class-in-python
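
A minimal sketch of the problem (illustrative names, not the actual Trainer API): a second def with the same name silently replaces the first, so both signatures cannot coexist:

  class Trainer:
      def fit(self, nb_epochs=10):   # old signature
          print("old:", nb_epochs)

      def fit(self, num_epochs=10):  # same name: this definition replaces the one above
          print("new:", num_epochs)

  Trainer().fit(5)            # fine: positional values never mention the name
  Trainer().fit(nb_epochs=5)  # TypeError: unexpected keyword argument 'nb_epochs'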

@Borda (Collaborator, Author) commented Dec 2, 2019

@williamFalcon it seems that method overloading would make it even worse (see the sketch below)...
https://www.geeksforgeeks.org/python-method-overloading
I do not think these parameter name changes are a big issue for users, but if you say so and want to keep the old names, we should close this PR/change and conclude that #534 won't be implemented...
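
For reference, the standard-library take on overloading (functools.singledispatchmethod, Python 3.8+) dispatches on the type of the first argument, never on keyword names, so it cannot express a parameter rename; a minimal sketch with a hypothetical class:

  from functools import singledispatchmethod

  class Trainer:
      @singledispatchmethod
      def fit(self, epochs):
          raise NotImplementedError

      @fit.register
      def _(self, epochs: int):  # the dispatch key is the argument's *type*
          print("running", epochs, "epochs")

  Trainer().fit(5)              # OK: dispatches on int
  # Trainer().fit(nb_epochs=5)  # TypeError: dispatch requires a positional
  #                             # argument; keyword names play no part in it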

@williamFalcon (Contributor) commented Dec 2, 2019

@Borda
I think we can:

  1. add backward compatibility to the trainer arg changes.
  2. the arg name changes to training_step, etc. shouldn't affect users (see example)

[image: example of the renamed arguments]

A review comment on the diff:

  trainer = Trainer(
      gpus=2,
-     nb_gpu_nodes=2,
+     num_gpu_nodes=2,

Contributor: maybe shorten to num_nodes?

A review comment on the diff:

      default_save_path=hparams.save_path,
      gpus=hparams.gpus,
-     max_nb_epochs=hparams.epochs,
+     max_num_epochs=hparams.epochs,

Contributor: add backward compatibility
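
A sketch of that backward-compatibility pattern (illustrative defaults, not the exact code that landed): accept the deprecated keyword, warn, and forward it to the new name:

  import warnings

  class Trainer:
      def __init__(self, max_num_epochs=1000, max_nb_epochs=None):
          if max_nb_epochs is not None:
              warnings.warn("`max_nb_epochs` is deprecated, use `max_num_epochs`",
                            DeprecationWarning)
              max_num_epochs = max_nb_epochs  # honor the old name when it is given
          self.max_num_epochs = max_num_epochs

  Trainer(max_nb_epochs=500)  # still works, now with a DeprecationWarning

The same pattern extends to nb_gpu_nodes -> num_gpu_nodes and the other renamed Trainer arguments.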

@Borda (Collaborator, Author) commented Dec 2, 2019

I think we can:

  1. add backward compatibility to the trainer arg changes.
  2. the arg name changes to training_step, etc. shouldn't affect users (see example)

but this assumes that the class name is different, so class Trainer2(...)?

@williamFalcon (Contributor) commented Dec 2, 2019

not sure I follow. class A is the LightningModule, B is the subclass

@Borda (Collaborator, Author) commented Dec 2, 2019

what??? "class A is the LightningModule, B is the subclass"

I just do not understand what you are suggesting... adding a subclass of LightningModule?
Do you mean keeping both parameters, like you have here?
https://github.com/williamFalcon/pytorch-lightning/blob/89ececb32ba0cfd810737cb90b2285a27332f5d4/pytorch_lightning/trainer/trainer.py#L59

@williamFalcon (Contributor) commented:

changing

def training_step(self, batch, batch_nb):

to

def training_step(self, batch, batch_idx):

does not affect the user
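
That works because the trainer loop calls the hook positionally, so the parameter name in the user's override is irrelevant; roughly (simplified stand-ins, not the real Lightning internals):

  class LightningModule:
      def training_step(self, batch, batch_idx):  # renamed in the base class
          raise NotImplementedError

  class MyModel(LightningModule):
      def training_step(self, batch, batch_nb):   # user override keeps the old name
          return {"batch": batch, "idx": batch_nb}

  # the framework invokes the hook with positional arguments:
  out = MyModel().training_step("some_batch", 0)  # works with either parameter name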

@Borda changed the title from "rename nb -> num [wip]" to "rename variables nb -> num" on Dec 3, 2019
@williamFalcon (Contributor) commented:

@Borda rebase?

@Borda (Collaborator, Author) commented Dec 3, 2019

@williamFalcon done :]



Development

Successfully merging this pull request may close these issues.

Make all abbreviations: (nb --> num) and (i --> idx)
