Replies: 1 comment
Have decided to just use the Java onnx-runtime, closing discussion.
I can load an .onnx model that I have on file in an Android app, but I am confused about how to run inference. The `predictRaw` method has a `FloatArray` parameter. Does this mean I have to flatten all my inputs into a single `FloatArray`? If so, how should I structure the data in the array itself?

EDIT: I'm just now seeing that `OnnxInferenceModel` has a private variable named `inputShape` which needs to be initialized. Even then, I'm still not sure how to structure the input `FloatArray`.

In general with onnxruntime I'm used to building a map of string input names to values, but I cannot find a method that takes this type.
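For reference, ONNX tensors are laid out in row-major (C) order, so the usual convention when an API takes a single flat float array is to flatten the input with the last axis varying fastest, matching the declared input shape. A minimal sketch in Java (the shape `[1, 2, 3]` and the `flatten` helper are illustrative assumptions, not part of the KotlinDL API):

```java
import java.util.Arrays;

public class FlattenExample {
    // Row-major flattening of a rank-3 input: for shape [d0, d1, d2],
    // element (i, j, k) lands at index i*d1*d2 + j*d2 + k.
    static float[] flatten(float[][][] input) {
        int d0 = input.length, d1 = input[0].length, d2 = input[0][0].length;
        float[] flat = new float[d0 * d1 * d2];
        int idx = 0;
        for (float[][] plane : input)
            for (float[] row : plane)
                for (float v : row)
                    flat[idx++] = v;
        return flat;
    }

    public static void main(String[] args) {
        // Hypothetical input with shape [1, 2, 3].
        float[][][] x = {{{1f, 2f, 3f}, {4f, 5f, 6f}}};
        System.out.println(Arrays.toString(flatten(x)));
        // The last axis varies fastest: 1, 2, 3, then 4, 5, 6.
    }
}
```

The same layout is what `OnnxTensor.createTensor` assumes when given a flat `FloatBuffer` plus a shape in the Java onnx-runtime, so a flat array built this way can be reused there as well.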