Hi everybody,
I have an ONNX model that I’ve tested using the following two approaches:
Approach 1:
…
module = graph_runtime.create(loaded_json, loaded_lib, ctx)
module.load_params(loaded_params)
module.set_input("input", input_data)  # set_input takes the input name and the data
module.run()
output = module.get_output(0)          # get_output takes the output index
…
Approach 2:
…
module, params = relay.frontend.from_onnx(onnx_model, shape_dict)
ex = tvm.relay.create_executor("graph", module, tvm.cpu(0), target)
result = ex.evaluate()(input, **params).asnumpy()
…
The problem is that approach 1 works fine, but with approach 2 I end up with the following error:
"Check failed: pval != nullptr == false: Cannot allocate memory symbolic tensor shape [?, ?, ?, ?]"
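For context on what I’ve tried to understand so far: this error usually means the Relay module still contains symbolic (dynamic) dimensions, and the graph executor only supports static shapes, so it cannot pre-allocate a tensor whose shape is `[?, ?, ?, ?]`. Two common workarounds are to pin every input dimension to a concrete value via `shape_dict`, or to run the model with the VM executor, which does support dynamic shapes. A minimal sketch of both options (the input name `"input"` and shape `(1, 3, 224, 224)` are placeholders; substitute the real names and shapes from your model):

```python
# Placeholder: replace with your model's actual input name and a fully static shape.
shape_dict = {"input": (1, 3, 224, 224)}

def run_with_static_shapes(onnx_model, input_data, target="llvm"):
    """Pin all input dims so the graph executor can pre-allocate memory."""
    import tvm
    from tvm import relay

    mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
    ex = relay.create_executor("graph", mod, tvm.cpu(0), target)
    return ex.evaluate()(input_data, **params).asnumpy()

def run_with_vm(onnx_model, input_data, target="llvm"):
    """Alternative: the VM executor handles dynamic shapes the graph executor cannot."""
    import tvm
    from tvm import relay

    mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
    ex = relay.create_executor("vm", mod, tvm.cpu(0), target)
    return ex.evaluate()(input_data, **params).asnumpy()
```

If the dynamic dimensions come from model weights being treated as variables, passing `freeze_params=True` to `from_onnx` may also help fold them into constants, though I’m not certain it’s available on every branch from that era.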
The ONNX model, as well as the two Python scripts for running it with both approaches, are uploaded here. You can use them to reproduce the error.
I’m using this version of TVM: GitHub - gussmith23/tvm at 2021-05-18-fix-byodt-parsing
I would appreciate it if anyone could help me understand why approach 2 doesn’t work.
Thanks in advance