
Conversation

deejlucas
Contributor

Issue #, if available: addresses #1211

Description of changes: The various inflated beta distribution classes used F.logical_or to mask ones and zeros (for which the loss function is undefined), but there is no logical_or method for mxnet.symbol.

We now use F.broadcast_logical_or which exists as a method of both mxnet.ndarray and mxnet.symbol, either of which can be imported as F.
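The masking idea can be sketched as follows. This is a NumPy stand-in, not the actual gluon-ts code: `np.logical_or` plays the role of `F.broadcast_logical_or`, and `masked_beta_log_prob` is a hypothetical helper name used only for illustration.

```python
import numpy as np

def masked_beta_log_prob(x, log_prob_fn):
    # The beta log-density is undefined at exactly 0 and 1, so those
    # points are masked out before evaluating it. In gluon-ts the mask
    # is built with F.broadcast_logical_or, which exists for both
    # mxnet.ndarray and mxnet.symbol; np.logical_or stands in here.
    mask = np.logical_or(x == 0.0, x == 1.0)
    safe_x = np.where(mask, 0.5, x)       # any in-support dummy value
    log_prob = log_prob_fn(safe_x)
    return np.where(mask, 0.0, log_prob)  # masked points contribute 0

x = np.array([0.0, 0.25, 1.0])
out = masked_beta_log_prob(x, np.log)
```

In the inflated beta distributions the masked points are instead handled by the point masses at 0 and/or 1; the sketch only shows why the mask must be computed with an operator that both `mxnet.ndarray` and `mxnet.symbol` provide.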

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@lostella
Contributor

Nice! Did you verify that this works? You could also add a test that just sets up an estimator with the inflated distribution output and trains it for one batch on some dummy data in [0, 1], with hybridize set to both True and False.

@deejlucas
Contributor Author

@lostella I checked that it works. I'll add a test too and push another commit.

@lostella lostella added this to the v0.7 milestone Dec 17, 2020
@deejlucas
Contributor Author

@lostella I had my dev setup wrong when I made my earlier commits, so not all checks were being run. Now that I've corrected this, I'm getting the following:

src/gluonts/model/deepstate/issm.py:109: error: NotImplemented? not callable
src/gluonts/model/deepstate/issm.py:112: error: NotImplemented? not callable
src/gluonts/model/deepstate/issm.py:115: error: NotImplemented? not callable
src/gluonts/model/deepstate/issm.py:118: error: NotImplemented? not callable
src/gluonts/model/deepstate/issm.py:121: error: NotImplemented? not callable
Found 5 errors in 1 file (checked 204 source files)

It would be easy enough to replace these with NotImplementedError, but it looks like they've been this way for a while, so I'm wondering if I have something configured incorrectly here. I've changed them for now, just so that I can get the commit in, but please let me know if they should be reverted.
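For context (a minimal standalone sketch, not the actual gluon-ts code; the class name here only mirrors issm.py for illustration): `NotImplemented` is a singleton value meant to be *returned* from binary-operator methods, while `NotImplementedError` is the exception class to *raise* from unimplemented methods. That is why mypy flags `raise NotImplemented` as "not callable":

```python
class ISSM:
    """Illustrative base class with an abstract-style method."""

    def emission_coeff(self):
        # Wrong: `raise NotImplemented` raises the constant, not an
        # exception class; it fails at runtime with a TypeError, and
        # mypy reports "NotImplemented? not callable".
        # Right: raise the exception class instead.
        raise NotImplementedError

# Calling the unimplemented method raises NotImplementedError as expected.
try:
    ISSM().emission_coeff()
    caught = False
except NotImplementedError:
    caught = True
```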


@lostella lostella left a comment


@deejlucas thank you! I have some comments on the test, see inline.

The changes NotImplemented -> NotImplementedError can stay; as a matter of fact, I was in the process of making the same change as part of #1223.

The various inflated beta classes used F.logical_or to mask ones
and zeros, but this function does not exist for mxnet.symbol.

We now use broadcast_logical_or which exists as a method of both
mxnet.ndarray and mxnet.symbol
@deejlucas
Copy link
Contributor Author

@lostella Thank you for the feedback. I've addressed it. Ready for re-review


@lostella lostella left a comment


Looks good! Thanks for spotting this and fixing it!

@lostella lostella merged commit 785fca6 into awslabs:master Dec 18, 2020
kashif pushed a commit to kashif/gluon-ts that referenced this pull request Dec 21, 2020
lostella pushed a commit to lostella/gluonts that referenced this pull request Feb 9, 2021
@lostella lostella mentioned this pull request Feb 9, 2021
lostella added a commit that referenced this pull request Feb 10, 2021
* Use broadcast_logical_or in inflated_beta (#1226)
* Fix type error for using quantile_weights and add a proper Pytest (#1231)
* Fixed MASE in N-BEATS: removed redundant factor (#1288)
* Fixing bug where dropout was not used, also remove unused halt option (#1315)
* Fixes for Python 3.8 (#1318)

Co-authored-by: Dan Lucas <[email protected]>
Co-authored-by: youngsuk0723 <[email protected]>
Co-authored-by: Danielle Robinson <[email protected]>
Co-authored-by: Riccardo Grazzi <[email protected]>
Co-authored-by: David Salinas <[email protected]>