Wrong values when running inference with debug_runtime

I tried to run a model with TVM. When I used graph_runtime, the result was close to the PyTorch output.

Then I replaced

from tvm.contrib import graph_runtime

module = graph_runtime.GraphModule(lib["default"](ctx))

with

from tvm.contrib.debugger import debug_runtime as graph_runtime

module = graph_runtime.create(lib.get_json(), lib.get_lib(), ctx, "path to dump root")

and the result is totally wrong.
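For completeness, here is a minimal, self-contained sketch of the flow I am using for both runtimes. The toy conv2d model, the input shape, and the dump path are placeholders standing in for my real PyTorch-converted network:

```python
import numpy as np
import tvm
from tvm import relay
from tvm.contrib import graph_runtime
from tvm.contrib.debugger import debug_runtime

# Toy conv2d model standing in for my real network (placeholder shapes).
data = relay.var("data", shape=(1, 3, 224, 224), dtype="float32")
weight = relay.var("weight", shape=(16, 3, 3, 3), dtype="float32")
conv = relay.nn.conv2d(data, weight, kernel_size=(3, 3), padding=(1, 1), channels=16)
mod = tvm.IRModule.from_expr(relay.Function([data, weight], conv))
params = {"weight": np.random.uniform(size=(16, 3, 3, 3)).astype("float32")}

target = "llvm"
ctx = tvm.cpu(0)
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

x = np.random.uniform(size=(1, 3, 224, 224)).astype("float32")

# graph_runtime path: params are loaded automatically via lib["default"].
m = graph_runtime.GraphModule(lib["default"](ctx))
m.set_input("data", tvm.nd.array(x))
m.run()
ref_out = m.get_output(0).asnumpy()

# debug_runtime path: created from the factory's json/lib, params set explicitly.
dbg = debug_runtime.create(lib.get_json(), lib.get_lib(), ctx, dump_root="/tmp/tvmdbg")
dbg.set_input("data", tvm.nd.array(x))
dbg.set_input(**lib.get_params())
dbg.run()
dbg_out = dbg.get_output(0).asnumpy()

# Compare the two outputs.
print(np.abs(ref_out - dbg_out).max())
```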

I checked the output tensors node by node and found that the input node is correct, but after a "fused_layout_transform_17" node the tensor values become weird. In my understanding, a layout transform only rearranges the data layout and does not change the values of the tensor. Is that right?
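To double-check my own reasoning, I reproduced a layout transform in numpy (assuming fused_layout_transform_17 goes NCHW -> NCHW4c, which is only a guess on my part; the exact target layout does not matter for the argument). It only reorders elements, so every value is preserved:

```python
import numpy as np

# NCHW -> NCHW4c: split the channel axis into blocks of 4 and move the
# inner block to the last axis. This is a pure reordering of elements.
x = np.random.uniform(size=(1, 16, 8, 8)).astype("float32")        # NCHW
n, c, h, w = x.shape
x_nchw4c = x.reshape(n, c // 4, 4, h, w).transpose(0, 1, 3, 4, 2)  # NCHW4c

print(x_nchw4c.shape)  # (1, 4, 8, 8, 4)
# Same multiset of values, just in a different order.
print(np.array_equal(np.sort(x.ravel()), np.sort(x_nchw4c.ravel())))  # True
```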