
Loop op with maximum iterations empty input M causes failure when running symbolic_shape_infer script #12684

Closed
@LoicDagnas

Description

Describe the bug
When converting a TensorFlow model to ONNX using https://github.com/onnx/tensorflow-onnx, the input M of a Loop node can be left empty, which should be supported according to the documentation. However, running the script onnxruntime.tools.symbolic_shape_infer fails with the following error:

Traceback (most recent call last):
  File "C:\Users\loic.dagnas\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\loic.dagnas\AppData\Local\Programs\Python\Python39\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\dev\ml\debug_arnaud\venv\lib\site-packages\onnxruntime\tools\symbolic_shape_infer.py", line 2424, in <module>
    out_mp = SymbolicShapeInference.infer_shapes(
  File "C:\dev\ml\debug_arnaud\venv\lib\site-packages\onnxruntime\tools\symbolic_shape_infer.py", line 2357, in infer_shapes
    all_shapes_inferred = symbolic_shape_inference._infer_impl()
  File "C:\dev\ml\debug_arnaud\venv\lib\site-packages\onnxruntime\tools\symbolic_shape_infer.py", line 2127, in _infer_impl
    self.dispatcher_[node.op_type](node)
  File "C:\dev\ml\debug_arnaud\venv\lib\site-packages\onnxruntime\tools\symbolic_shape_infer.py", line 1066, in _infer_Loop
    si.CopyFrom(self.known_vi_[node.input[i]])
KeyError: ''
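
The KeyError comes from indexing the known-value-info map with the empty string: ONNX encodes an omitted optional input as "" in node.input, and the shape-inference loop looks that name up unguarded. A minimal sketch of the failing pattern and a guard (the dict and names below are simplified stand-ins, not the actual symbolic_shape_infer internals):

```python
# Simplified stand-in for the bookkeeping in _infer_Loop: known_vi maps
# input names to value infos, and loop_inputs mirrors node.input for a
# Loop whose optional trip count M was omitted (encoded as "").
known_vi = {"cond_init": "cond_value_info", "x_init": "x_value_info"}
loop_inputs = ["", "cond_init", "x_init"]

copied = []
for name in loop_inputs:
    if not name:  # optional input omitted -> nothing to look up or copy
        copied.append(None)
        continue
    copied.append(known_vi[name])  # without the guard, "" raises KeyError('')

print(copied)  # [None, 'cond_value_info', 'x_value_info']
```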

This was not the case before, but since PR onnx/tensorflow-onnx#1971 on the tf2onnx repo, the input M of Loop can be left empty.

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 11
  • ONNX Runtime installed from (source or binary): binary
  • ONNX Runtime version: 11.2
  • Python version: 3.9
  • CUDA/cuDNN version: 11.6

To Reproduce

Running the following command:

python -m onnxruntime.tools.symbolic_shape_infer
    --input path/to/model.onnx
    --output output-model.onnx
    --auto_merge

works using a model converted with tf2onnx==1.11.1 but fails with the same model converted with tf2onnx==1.12.0.

Here is the model in ONNX format:
https://drive.google.com/file/d/19jOktShuha40ib2CRGSugfhi_kIjZh_p/view?usp=sharing

Expected behavior
The command above should run successfully.
