Fix a bug and document RandomZoomOut #5278
Conversation
💊 CI failures summary and remediations

As of commit 66bc73a (more details on the Dr. CI page):
- 1 failure not recognized by patterns
- 1 job timed out

This comment was automatically generated by Dr. CI. Please report bugs/suggestions to the (internal) Dr. CI Users group.
Thanks @datumbox
LGTM, thanks
@xiaohu2015 Sorry for taking ages to answer your question at #3403 (comment). I had to catch up with other things and forgot to check. This extra bit is indeed needed; I've added it to get around a limitation of PyTorch's pad, which only accepts integers for the fill value.
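For context, here is a minimal sketch of that kind of workaround (hypothetical helper name, not the actual torchvision code): `torch.nn.functional.pad` takes only a single scalar `value`, so after padding, the border has to be overwritten manually with the desired per-channel colour.

```python
import torch
import torch.nn.functional as F


def pad_with_colour(image: torch.Tensor, padding, fill) -> torch.Tensor:
    """Hypothetical helper: pad an image of shape (..., C, H, W) with a
    per-channel colour. F.pad's `value` is a single scalar, so we pad with
    zeros first and then overwrite the padded border with the fill colour."""
    left, top, right, bottom = padding
    h, w = image.shape[-2:]
    out = F.pad(image, [left, right, top, bottom], value=0.0)
    v = torch.tensor(fill, dtype=out.dtype, device=out.device).view(-1, 1, 1)
    out[..., :top, :] = v        # top border
    out[..., top + h:, :] = v    # bottom border
    out[..., :, :left] = v       # left border
    out[..., :, left + w:] = v   # right border
    return out
```

For example, `pad_with_colour(img, (2, 3, 2, 3), fill=[0.5, 0.5, 0.5])` pads a `(3, H, W)` image with mid-grey, something the scalar-only `value` argument of `F.pad` cannot express on its own.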
Summary: Bug fix

Reviewed By: kazhang

Differential Revision: D33927496

fbshipit-source-id: 86d101f32a1bd3ba23e16f4ab102d1635265cf45
While investigating a user question about the RandomZoomOut transform at #3403 (comment), I noticed that the `torch.rand(1)` check is flipped the other way around. This had no effect on the training of SSD because we use `p=0.5`, but the bug needs to be fixed.

Moreover, I've added a comment to explain why an extra bit is necessary on the transform. I think the comment is needed because I too had forgotten why the extra code was there and almost removed it.