Added support for converting large models #1090


Merged: 1 commit, Sep 3, 2020

Conversation

TomWildenhain-Microsoft
Collaborator

No description provided.

@TomWildenhain-Microsoft TomWildenhain-Microsoft force-pushed the tom/ConvertLargeModels branch 2 times, most recently from 40c1d93 to 4bf08f4 Compare September 2, 2020 19:24
@@ -96,7 +96,19 @@ def inputs_without_resource(sess, input_names):
     return input_names


-def from_function(func, input_names, output_names):
+def from_function(func, input_names, output_names, large_model):
Contributor
Maybe use large_model=None, since one might call this from some random Python code.
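The suggestion above can be sketched as follows. This is an illustrative stub, not tf2onnx's actual implementation: giving the new parameter a default value means existing call sites that pass only three arguments keep working unchanged.

```python
def from_function(func, input_names, output_names, large_model=None):
    # Stub body for demonstration only; the real tf2onnx function freezes
    # the tf.function and converts it to a GraphDef.
    return {"inputs": input_names, "outputs": output_names,
            "large_model": large_model}

# Old three-argument call sites are unaffected by the added parameter:
legacy = from_function(lambda x: x, ["x:0"], ["y:0"])
print(legacy["large_model"])  # None
```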

@guschmue guschmue Sep 2, 2020


Other than that it looks super, LGTM.

@OpUs-Nebula

Are there any plans for this support to be added to TF 1.X as well?

@TomWildenhain-Microsoft
Collaborator Author

I believe converting saved models for TF 1.X should already support large models though I haven't tested it. Do you have a specific model you would like to convert? If so, please open a GitHub issue.

@OpUs-Nebula

I've tried, but I get the error: "ValueError: You passed in an iterable attribute but I cannot figure out its applicable type". It's odd, because it seems to convert fine until I hit the protobuf size error when I don't pass --saved_model. I do have a specific model in mind. I'm fairly new to ML, so I suppose it could be user error as well.

@TomWildenhain-Microsoft
Collaborator Author

Can you make a new GitHub issue and provide the full error output? Issue link:
https://github.com/onnx/tensorflow-onnx/issues/new?template=bug-performance-issue.md

@OpUs-Nebula

Just added it

@dgoldenberg-audiomack

dgoldenberg-audiomack commented May 25, 2021

If this is fixed, why am I seeing it in TF 2.4.1? see tensorflow/recommenders#274 (comment)

@TomWildenhain-Microsoft
Collaborator Author

Hi @dgoldenberg-audiomack, are you trying to convert your model from TensorFlow to ONNX? This repo is the tf2onnx repo. Our fix is a workaround/hack that lets us convert large TF models; it isn't a fix to TensorFlow itself.
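For context, the underlying constraint is protobuf's 2 GB serialized-message cap: a model whose weights are all inlined into one graph proto can exceed it. The workaround discussed here stores weight tensors outside the graph and keeps only references to them, similar in spirit to ONNX's external-data mechanism. A minimal conceptual sketch in plain Python (not tf2onnx's actual code; `save_large`, `graph.json`, and `weights.bin` are hypothetical names for illustration):

```python
import json
import os
import tempfile

def save_large(graph, weights, directory):
    """Write weights to a side file and record (offset, length) references
    in the graph, so the graph file itself stays small."""
    offsets = {}
    with open(os.path.join(directory, "weights.bin"), "wb") as f:
        for name, data in weights.items():
            offsets[name] = {"offset": f.tell(), "length": len(data)}
            f.write(data)
    graph = dict(graph, external_data=offsets)
    with open(os.path.join(directory, "graph.json"), "w") as f:
        json.dump(graph, f)
    return offsets

d = tempfile.mkdtemp()
offs = save_large({"nodes": ["MatMul"]},
                  {"w": b"\x00" * 16, "b": b"\x01" * 4}, d)
print(offs["b"])  # {'offset': 16, 'length': 4}
```

The graph file now carries only small reference records, so its serialized size is independent of how large the weight tensors grow.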
