I am following the "Deploy Pretrained Vision Model from MxNet on VTA" tutorial, but I changed the network to UNet, because I want to accelerate UNet on a PYNQ board. Importing the network and quantizing it worked fine, but I hit an error when running graph_pack. My settings for graph_pack are as follows:
```python
relay_prog = graph_pack(
    mod["main"],
    env.BATCH,
    env.BLOCK_OUT,
    env.WGT_WIDTH,
    start_name="cast",
    stop_name="nn.conv2d",
    start_name_idx=8,
    stop_name_idx=299,
    device_annot=(env.TARGET == "intelfocl"),
)
```
The error I get is as follows:
```
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  File "/home/amax/zmb/segnet_tvm/deploy_classification.py", line 225, in <module>
    relay_prog = graph_pack(
  File "/home/amax/tvm/vta/python/vta/top/graphpack.py", line 611, in graph_pack
    expr = run_opt_pass(expr, transform.InferType())
  File "/home/amax/tvm/vta/python/vta/top/graphpack.py", line 30, in run_opt_pass
    mod = opt_pass(mod)
  File "/home/amax/tvm/python/tvm/ir/transform.py", line 161, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "/home/amax/tvm/python/tvm/_ffi/_cython/packed_func.pxi", line 323, in tvm._ffi._cy3.core.PackedFuncBase.__call__
    FuncCall(self.chandle, args, &ret_val, &ret_tcode)
  File "/home/amax/tvm/python/tvm/_ffi/_cython/packed_func.pxi", line 257, in tvm._ffi._cy3.core.FuncCall
    FuncCall3(chandle, args, nargs, ret_val, ret_tcode)
  File "/home/amax/tvm/python/tvm/_ffi/_cython/packed_func.pxi", line 246, in tvm._ffi._cy3.core.FuncCall3
    CALL(TVMFuncCall(chandle, &values[0], &tcodes[0],
  File "/home/amax/tvm/python/tvm/_ffi/_cython/base.pxi", line 163, in tvm._ffi._cy3.core.CALL
    raise get_last_ffi_error()
Check failed: src_shape.size() == src_axis.size() (6 vs. 4) :
```
I don't know how this happened, since start_name is already set to a node after the first nn.conv2d. Has anyone run into this problem, or can anyone help solve it? I have been stuck here for a long time. Thank you.
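For reference, this is roughly how I sanity-check the operator counts behind start_name_idx / stop_name_idx: I dump the quantized graph to text and count the occurrences of each operator. A minimal, self-contained sketch follows; the graph_text below is just a stand-in for the real dump (in my script it comes from mod["main"].astext() after quantization), so the counts are only illustrative:

```python
# Minimal sketch: count how often an operator name appears in a Relay
# text dump, to sanity-check start_name_idx / stop_name_idx by hand.
# `graph_text` is a hypothetical stand-in for mod["main"].astext().
graph_text = """
  %0 = cast(%data, dtype="int8");
  %1 = nn.conv2d(%0, %w0);
  %2 = cast(%1, dtype="int32");
  %3 = nn.conv2d(%2, %w1);
"""

def count_ops(text, op_name):
    """Count call sites of `op_name` in the dumped graph text."""
    return text.count(op_name + "(")

print(count_ops(graph_text, "cast"))       # -> 2 in this stand-in dump
print(count_ops(graph_text, "nn.conv2d"))  # -> 2 in this stand-in dump
```

On the real dump, this tells me whether stop_name_idx = 299 even points at an existing nn.conv2d node, and which cast node index 8 lands on.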