Hi all,
I am trying to run inference on an ONNX model. I have read the tutorial “Compile ONNX Models”, but in that tutorial only one input is needed:
tvm_output = intrp.evaluate()(tvm.nd.array(x.astype(dtype)), **params).asnumpy()
If I need two inputs, how should I feed them into the network?
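To make the question concrete, here is a minimal sketch of what I am currently guessing, based on the tutorial's setup (the model file name, input names, and shapes below are just placeholders for my model). Is passing both arrays positionally like this the right way, or is there another mechanism for multiple inputs?

```python
import numpy as np
import onnx
import tvm
import tvm.relay as relay

# Load the ONNX model (placeholder file name).
onnx_model = onnx.load("two_input_model.onnx")

# Two dummy inputs; the names and shapes are placeholders and must
# match the input names/shapes declared in the ONNX graph.
x1 = np.random.rand(1, 3, 224, 224).astype("float32")
x2 = np.random.rand(1, 10).astype("float32")
shape_dict = {"input_1": x1.shape, "input_2": x2.shape}

mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

target = "llvm"
with relay.build_config(opt_level=1):
    intrp = relay.build_module.create_executor("graph", mod, tvm.cpu(0), target)

# My guess: pass both graph inputs positionally (in the order they
# appear in the graph) and the weights via **params, as in the tutorial.
tvm_output = intrp.evaluate()(
    tvm.nd.array(x1), tvm.nd.array(x2), **params
).asnumpy()
```

Any pointers would be appreciated.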