try unflake CI #7627
base: main
Changes from all commits
@@ -354,24 +354,34 @@ def __init__(
             v2_transforms.ElasticTransform,
             legacy_transforms.ElasticTransform,
             [
-                ArgsKwargs(),
                 ArgsKwargs(alpha=20.0),
                 ArgsKwargs(alpha=(15.3, 27.2)),
                 ArgsKwargs(sigma=3.0),
                 ArgsKwargs(sigma=(2.5, 3.9)),
                 ArgsKwargs(interpolation=v2_transforms.InterpolationMode.NEAREST),
-                ArgsKwargs(interpolation=v2_transforms.InterpolationMode.BICUBIC),
                 ArgsKwargs(interpolation=PIL.Image.NEAREST),
-                ArgsKwargs(interpolation=PIL.Image.BICUBIC),
                 ArgsKwargs(fill=1),
+                *extra_args_kwargs,
             ],
             # ElasticTransform needs larger images to avoid the needed internal padding being larger than the actual image
             make_images_kwargs=dict(DEFAULT_MAKE_IMAGES_KWARGS, sizes=[(163, 163), (72, 333), (313, 95)], dtypes=[dt]),
             # We updated gaussian blur kernel generation with a faster and numerically more stable version
             # This brings float32 accumulation visible in elastic transform -> we need to relax consistency tolerance
             closeness_kwargs=ckw,
         )
-        for dt, ckw in [(torch.uint8, {"rtol": 1e-1, "atol": 1}), (torch.float32, {"rtol": 1e-2, "atol": 1e-3})]
+        for dt, ckw, extra_args_kwargs in [
+            (torch.uint8, {"rtol": 1e-1, "atol": 1}, []),
+            (
+                torch.float32,
+                {"rtol": 1e-2, "atol": 1e-3},
+                [
+                    # These proved to be flaky on uint8 inputs so we only run them on float32
+                    ArgsKwargs(),
+                    ArgsKwargs(interpolation=v2_transforms.InterpolationMode.BICUBIC),
+                    ArgsKwargs(interpolation=PIL.Image.BICUBIC),
Comment on lines +379 to +381:
It seems that with default […]

The errors are likely related to some uint8 issue since we have a difference of 255:
This is not an issue with the testing function though:

>>> zeros = torch.zeros(3, dtype=torch.uint8)
>>> ones = torch.ones_like(zeros)
>>> ones - zeros
tensor([1, 1, 1], dtype=torch.uint8)
>>> torch.testing.assert_close(ones, zeros, atol=1, rtol=0)
>>> zeros - ones
tensor([255, 255, 255], dtype=torch.uint8)
>>> torch.testing.assert_close(zeros, ones, atol=1, rtol=0)
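As an editorial aside (not part of the PR), a minimal sketch of the wraparound being discussed: subtracting two uint8 tensors wraps modulo 256, so a manual v1 - v2 inspection can report 255 where the true gap is a single intensity level, while upcasting first, or letting assert_close do the comparison as shown above, gives the real difference:

import torch

v1_out = torch.tensor([0, 128, 255], dtype=torch.uint8)
v2_out = torch.tensor([1, 127, 254], dtype=torch.uint8)

print(v1_out - v2_out)  # tensor([255, 1, 1], dtype=torch.uint8): 0 - 1 wraps around
print((v1_out.to(torch.int16) - v2_out.to(torch.int16)).abs())  # tensor([1, 1, 1]): the true gap
torch.testing.assert_close(v1_out, v2_out, rtol=0, atol=1)  # passes, the comparison is not fooled by the wraparound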
It's hard to decipher what's going on just looking at the logs. Is that with bicubic or bilinear? Floats or ints?

WDYM flaky? For bicubic we didn't change anything from v1 to v2. If there are 0-255 differences (overflows?) from v1 to v2 with bilinear, that's potentially concerning.
Yes, it is hard, and no, I'm not happy with it. Please remember that this never was meant to stay longer than the initial release. Here is the explanation: The […]

I probably don't understand. No snark answer: the error pops up from time to time, but not consistently. Could you rephrase?

They are not new and have been popping up since at least late last year. This is why we have a comment like vision/test/test_transforms_v2_consistency.py, lines 370 to 371 in 4a51822.

Since we are talking about a single pixel, I don't think that is much of a concern.

Basically I find it concerning that v1 differs significantly from v2 on certain interpolation modes. Those modes are controlled by torch core, not by a v1 vs v2 implementation difference (except for bilinear mode, which we changed recently, but according to your comment those problems pre-date that change).
+                ],
+            ),
+        ]
     ],
     ConsistencyConfig(
         v2_transforms.GaussianBlur,
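For orientation, a rough sketch (with a hypothetical helper name, not the actual test harness) of what each generated config amounts to: build the v1 and v2 transforms with identical arguments, run them on the same input under the same seed, and compare the outputs with the per-dtype tolerances selected in the comprehension above:

import torch
import torchvision.transforms as legacy_transforms
import torchvision.transforms.v2 as v2_transforms

def check_elastic_consistency(image, closeness_kwargs, **transform_kwargs):
    # Hypothetical helper: seed identically so both versions sample the same displacement.
    torch.manual_seed(0)
    expected = legacy_transforms.ElasticTransform(**transform_kwargs)(image)
    torch.manual_seed(0)
    actual = v2_transforms.ElasticTransform(**transform_kwargs)(image)
    torch.testing.assert_close(actual, expected, **closeness_kwargs)

image = torch.randint(0, 256, (3, 163, 163), dtype=torch.uint8)
check_elastic_consistency(image, {"rtol": 1e-1, "atol": 1}, alpha=20.0)

The change above then simply stops exercising the argument sets that were flaky for uint8 inputs (the defaults and bicubic) while keeping them for float32.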
why set rtol to 1e-1 and not 0?
Just copied it over from what we had. It came from #6762, which @vfdev-5 authored. Maybe he remembers? Agree that 0 would be better here.
I think the reason is explained in the above comment:
and also in the PR:
Meaning, we are expecting differences > 1?
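To put numbers on this: torch.testing.assert_close accepts a mismatch when |actual - expected| <= atol + rtol * |expected|, so with atol=1 and rtol=1e-1 a uint8 pixel whose reference value is 255 may be off by up to 1 + 0.1 * 255 = 26.5, whereas rtol=0 would cap the allowed difference at 1. A small editorial illustration (values picked for the example):

import torch

expected = torch.tensor([255], dtype=torch.uint8)
actual = torch.tensor([230], dtype=torch.uint8)  # off by 25

torch.testing.assert_close(actual, expected, rtol=1e-1, atol=1)  # passes: 25 <= 1 + 0.1 * 255
try:
    torch.testing.assert_close(actual, expected, rtol=0, atol=1)  # fails: 25 > 1
except AssertionError as exc:
    print(exc)

So with rtol=1e-1 on uint8 images, differences well above 1 are indeed tolerated for bright pixels.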