The from_onnx tutorial hosted here: Compile ONNX Models — tvm 0.9.dev182+ge718f5a8a documentation
relies on a model that onnxruntime fails to load. The following code:
```python
from tvm.contrib.download import download_testdata
import onnxruntime as ort

model_url = "".join(
    [
        "https://gist.github.com/zhreshold/",
        "bcda4716699ac97ea44f791c24310193/raw/",
        "93672b029103648953c4e5ad3ac3aadf346a4cdc/",
        "super_resolution_0.2.onnx",
    ]
)
model_path = download_testdata(
    model_url, "super_resolution.onnx", module="onnx"
)
ort_sess = ort.InferenceSession(model_path)
```
fails with:

```
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/peter/.tvm_test_data/onnx/super_resolution.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model.cc:107 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&) Missing opset in the model. All ModelProtos MUST have at least one entry that specifies which version of the ONNX OperatorSet is being imported.
```
Therefore the model hosted by @zhreshold should be updated or changed.
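In the meantime, a possible workaround is to patch an opset entry into the model with the onnx Python API before handing it to onnxruntime. This is only a sketch: the default-domain opset version 9 and the `super_resolution_patched.onnx` output path are assumptions on my part, and the model may still be rejected for other reasons since it predates opset versioning, but it should address the missing-opset check specifically:

```python
import onnx
from onnx import helper

# model_path is the file downloaded by download_testdata above.
onnx_model = onnx.load(model_path)

# Add a default-domain opset entry if none is present.
# Version 9 is a guess; the model may actually correspond to an older opset.
if not onnx_model.opset_import:
    onnx_model.opset_import.append(helper.make_opsetid("", 9))

patched_path = "super_resolution_patched.onnx"  # hypothetical output path
onnx.save(onnx_model, patched_path)
```

Whether onnxruntime then accepts the patched file is untested here; it could still fail on IR-version or operator checks.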
You can confirm that the opset entry is missing with the following code:
```python
import onnx

from tvm.contrib.download import download_testdata

model_url = "".join(
    [
        "https://gist.github.com/zhreshold/",
        "bcda4716699ac97ea44f791c24310193/raw/",
        "93672b029103648953c4e5ad3ac3aadf346a4cdc/",
        "super_resolution_0.2.onnx",
    ]
)
model_path = download_testdata(
    model_url, "super_resolution.onnx", module="onnx"
)
onnx_model = onnx.load(model_path)
# opset_import is empty for this model, which is what onnxruntime complains about.
print(onnx_model.opset_import)
```
This might not be an issue if the tutorial is only meant to demonstrate the API, but it can be counterintuitive for someone who wants to compare TVM against onnxruntime.
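For reference, part of why the mismatch is easy to miss is that TVM's own importer, `relay.frontend.from_onnx`, accepts an explicit `opset` argument instead of requiring the entry in the file. The sketch below assumes the tutorial's input name `"1"` and a 1x1x224x224 shape, and `opset=9` is again a guess rather than something taken from the model:

```python
import onnx
from tvm import relay

# model_path is the file downloaded by download_testdata above.
onnx_model = onnx.load(model_path)

# Input name and shape taken from the tutorial (a 224x224 luminance channel).
shape_dict = {"1": (1, 1, 224, 224)}

# Passing opset explicitly sidesteps the missing opset_import entry on the
# TVM side; opset=9 is an assumption and may need adjusting.
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict, opset=9)
print(mod)
```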