Wrong inference result for ONNX model after compiling with TVM

TVM produces a wrong result for this ONNX model. The inference results from TVM and ONNX Runtime are shown below, and they differ.

[screenshot: TVM vs. ONNX Runtime inference results]

Script to reproduce this issue:

import onnx
from tvm import relay
from tvm.contrib import graph_runtime
import tvm
import numpy as np

# Load the ONNX model and compile it with Relay for the LLVM (CPU) target
onnx_model_path = "split_model.onnx"
model = onnx.load(onnx_model_path)
irmod, params = relay.frontend.from_onnx(model, {'input': [6]}, freeze_params=True)
graph, lib, params = relay.build(irmod, target='llvm', params=params)

input_data = np.array([1, 2, 3, 4, 5, 6], dtype='float32')


# Create the graph runtime module, set the input, and run inference
module = graph_runtime.create(graph, lib, tvm.cpu(0))
module.set_input('input', input_data)
module.set_input(**params)
module.run()

# Expected output names and shapes of the split model
outputs_dict = {'output_1': [2], 'output_2': [2], 'output_3': [2]}

# Read back results from the TVM module into a list
res_tvm_list = []
for name, output_shape in outputs_dict.items():
    res_tvm = module.get_output(0, tvm.nd.empty(output_shape)).asnumpy()
    res_tvm_list.append(res_tvm)
print("tvm predict result: ", res_tvm_list)

# ---------------------------------------------------------------------------
# Run the same input through onnxruntime as a reference
import onnxruntime as rt
sess = rt.InferenceSession(onnx_model_path)
res_frame = sess.run(list(outputs_dict.keys()), {'input': input_data})
print("onnx predict result:", res_frame)

Model link:

You can download the minimal ONNX model from this link: simplest_model
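
In case the link becomes unavailable, here is a sketch of how an equivalent minimal model might be built with the onnx helper API. This is only an assumption based on the shapes used in the script above (a single Split node cutting a 6-element float32 input into three 2-element outputs); the real split_model.onnx may differ.

import onnx
from onnx import helper, TensorProto

# One Split node: 'input' [6] -> 'output_1'/'output_2'/'output_3', each [2]
split_node = helper.make_node('Split', inputs=['input'],
                              outputs=['output_1', 'output_2', 'output_3'], axis=0)
graph = helper.make_graph(
    [split_node], 'split_graph',
    [helper.make_tensor_value_info('input', TensorProto.FLOAT, [6])],
    [helper.make_tensor_value_info('output_1', TensorProto.FLOAT, [2]),
     helper.make_tensor_value_info('output_2', TensorProto.FLOAT, [2]),
     helper.make_tensor_value_info('output_3', TensorProto.FLOAT, [2])])
# Pin opset 13 so the even split without an explicit 'split' input is valid
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.save(model, 'split_model.onnx')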

Hi all, is this an incorrect usage on my part in this script, or a TVM bug?

Could someone help me figure out why this discrepancy happens?

Thanks!