Hi All.
I am having a problem importing a simple LeNet model (MNIST digit classification) with the ONNX frontend. I am using the MNIST model from: https://github.com/onnx/models/blob/master/vision/classification/mnist/model/mnist-1.tar.gz
Here is the snippet of code that I am using:

```python
import onnx
from onnx import numpy_helper
from tvm import relay

def lenet_onnx_model(onnxfile, inputfile, outputfile, dotfile):
    # Load the ONNX model and the reference input/output tensors.
    onnxmodel = onnx.load_model(onnxfile)
    input_data = numpy_helper.to_array(onnx.load_tensor(inputfile))
    output_data = numpy_helper.to_array(onnx.load_tensor(outputfile))

    # Build the shape dict from the graph's first input.
    input_name = onnxmodel.graph.input[0].name
    shape_dict = {input_name: input_data.shape}

    # Convert the ONNX model to a Relay IR module. THIS LINE raises the error.
    mod, params = relay.frontend.from_onnx(onnxmodel, shape_dict)
```
When I run `mod, params = relay.frontend.from_onnx(onnxmodel, shape_dict)`, I get the error below. Any suggestions on how to solve this problem?
```
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2020.1.2\plugins\python\helpers\pydev\_pydevd_bundle\pydevd_exec2.py", line 3, in Exec
    exec(exp, global_vars, local_vars)
  File "<string>", line 1, in <module>
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\onnx.py", line 2748, in from_onnx
    mod, params = g.from_onnx(graph, opset, freeze_params)
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\onnx.py", line 2555, in from_onnx
    op = self._convert_operator(op_name, inputs, attr, opset)
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\onnx.py", line 2663, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params)
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\onnx.py", line 268, in _impl_v1
    input_shape = infer_shape(data)
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\common.py", line 501, in infer_shape
    out_type = infer_type(inputs, mod=mod)
  File "C:\repos\tvm23\tvm\python\tvm\relay\frontend\common.py", line 482, in infer_type
    new_mod = _transform.InferType()(new_mod)
  File "C:\repos\tvm23\tvm\python\tvm\ir\transform.py", line 127, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "C:\repos\tvm23\tvm\python\tvm\_ffi\_ctypes\packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  File "C:\repos\tvm23\tvm\src\relay\analysis\type_solver.cc", line 622
TVMError:
An internal invariant was violated during the execution of TVM. Please read TVM's error reporting guidelines. More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
Check failed: false == false: [13:50:39] C:\repos\tvm23\tvm\src\relay\op\type_relations.cc:107:
An internal invariant was violated during the execution of TVM. Please read TVM's error reporting guidelines. More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
Check failed: t0->dtype == t1->dtype (int64 vs. int32) :
```
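From the final check failure, it looks like two tensors reaching a type relation during `InferType` disagree on integer dtype (int64 vs int32). As a plain-NumPy illustration of that kind of mismatch, unrelated to TVM internals (the variable names here are made up), an explicit cast makes the dtypes agree:

```python
import numpy as np

# Two integer tensors with mismatched dtypes, analogous to the
# int64-vs-int32 disagreement reported by TVM's type solver.
a = np.array([1, 2, 3], dtype=np.int64)
b = np.array([10, 20, 30], dtype=np.int32)
assert a.dtype != b.dtype  # this is the kind of mismatch being rejected

# Casting one operand to the other's dtype resolves the disagreement.
b64 = b.astype(np.int64)
assert a.dtype == b64.dtype
print((a + b64).tolist())  # [11, 22, 33]
```

I don't know where in the imported graph the mismatched tensors originate, which is what I'm hoping someone can help pinpoint.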