
enforce onnx conversion in CI #3628


Merged · 2 commits · Mar 12, 2020

Conversation

chriselion (Contributor)

Proposed change(s)

Cherry pick of #3600 to the release branch

Useful links (Github issues, JIRA tickets, ML-Agents forum threads etc.)

Types of change(s)

  • Bug fix
  • New feature
  • Code refactor
  • Breaking change
  • Documentation update
  • Other (please describe)

Checklist

  • Added tests that prove my fix is effective or that my feature works
  • Updated the changelog (if applicable)
  • Updated the documentation (if applicable)
  • Updated the migration guide (if applicable)

Other comments

@vincentpierre (Contributor) left a comment:

The behavior is rather complex and I would like to see a docstring describing the expected behavior. I think we will always try to convert to ONNX when possible, and if we force it, we will raise an error when conversion is not possible.
There are three flags: ONNX_EXPORT_ENABLED, settings.convert_to_onnx, and TEST_ENFORCE_ONNX_CONVERSION, and I think each should have a comment distinguishing it from the others.
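The three-flag interaction the reviewer asks to document might look roughly like this. This is a hypothetical sketch, not the actual ML-Agents code: the flag names come from the comment above, but the function name `should_convert_to_onnx` and the decision logic are assumptions.

```python
import os

# ONNX_EXPORT_ENABLED: module-level flag, True only if the tf2onnx import
# succeeded (i.e. the installed TensorFlow version is supported).
ONNX_EXPORT_ENABLED = True

def should_convert_to_onnx(convert_to_onnx: bool) -> bool:
    """Return True if an ONNX export should be attempted.

    convert_to_onnx mirrors settings.convert_to_onnx (the user's request);
    TEST_ENFORCE_ONNX_CONVERSION is a CI-only environment variable that
    turns a skipped conversion into a hard error.
    """
    enforced = os.getenv("TEST_ENFORCE_ONNX_CONVERSION", "0") != "0"
    if not ONNX_EXPORT_ENABLED:
        if enforced:
            raise RuntimeError(
                "ONNX conversion enforced, but dependencies are unavailable."
            )
        return False  # dependencies missing and not enforced: silently skip
    return convert_to_onnx or enforced
```

Under this sketch, CI would set `TEST_ENFORCE_ONNX_CONVERSION=1` so that a broken tf2onnx install fails the build instead of silently skipping the export.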

@@ -18,6 +21,11 @@
from tensorflow.python.framework import graph_util
from mlagents.trainers import tensorflow_to_barracuda as tf2bc

if LooseVersion(tf.__version__) < LooseVersion("1.12.0"):
Contributor:

Should we also filter out tf 2.x?

chriselion (Author):

I don't want to explicitly filter it out. A future version of tf2onnx should support tf 2.x, in which case the import should succeed.

else:
    if _enforce_onnx_conversion():
        raise RuntimeError(
            "ONNX conversion enforced, but couldn't import dependencies."
        )
Contributor:

This is not true, it could also be because the version does not match.

chriselion (Author):

Good point, I'll update the message here. It's not something a user will ever see unless they added the environment variable, though.

@vincentpierre (Contributor):

I realize now that this is for the release branch, not master.
Please merge as is, and maybe add comments in another PR.

@chriselion chriselion merged commit 5f01d16 into release-0.15.0 Mar 12, 2020
@delete-merged-branch delete-merged-branch bot deleted the release-0.15.0-onnx-CI branch March 12, 2020 20:38
@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 15, 2021