How to debug: In particular `Tensor[(?, 1, ?, ?), float32]` does not match `Tensor[(?, ?, ?), float32]`

Hi, I converted a model from PyTorch to ONNX and it runs well using onnxruntime. But when I load and build the ONNX model in TVM, I get the following error. Is there a way I can debug this?

```
bigtree@ububtu20: python3 run_onnx_on_tvm.py
shape_dict {'0': [1, 3, 256, 256]}
tensor type `Tensor[(1), bool]` has 1 dimensions, while `bool` has 0 dimensions
The Relay type checker is unable to show the following types match.
In particular `Tensor[(1), bool]` does not match `bool`
tensor type `Tensor[(?, 1, ?, ?), float32]` has 4 dimensions, while `Tensor[(?, ?, ?), float32]` has 3 dimensions
The Relay type checker is unable to show the following types match.
In particular `Tensor[(?, 1, ?, ?), float32]` does not match `Tensor[(?, ?, ?), float32]`
Traceback (most recent call last):
  File "run_onnx_tvm_camera.py", line 122, in <module>
    graph, lib, params = relay.build(mod,
  File "/home/workspacae/installation/TVM/tvm/python/tvm/relay/build_module.py", line 275, in build
    graph_json, mod, params = bld_mod.build(mod, target, target_host, params)
  File "/home/workspacae/installation/TVM/tvm/python/tvm/relay/build_module.py", line 138, in build
    self._build(mod, target, target_host)
  File "/home/workspacae/installation/TVM/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm.error.DiagnosticError: Traceback (most recent call last):
  [bt] (8) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::relay::backend::RelayBuildModule::Optimize(tvm::IRModule, tvm::Map<tvm::Integer, tvm::Target, void, void> const&, std::unordered_map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, tvm::runtime::NDArray, std::hash<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::equal_to<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, tvm::runtime::NDArray> > > const&)+0xe86) [0x7f639be2e826]
  [bt] (7) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::transform::Pass::operator()(tvm::IRModule) const+0x67) [0x7f639b35f067]
  [bt] (6) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x32f) [0x7f639b46c24f]
  [bt] (5) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x27e) [0x7f639b46c19e]
  [bt] (4) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x1d4) [0x7f639b46f3d4]
  [bt] (3) /home/workspacae/installation/TVM/tvm/build/libtvm.so(+0x120eeb2) [0x7f639be0beb2]
  [bt] (2) /home/workspacae/installation/TVM/tvm/build/libtvm.so(+0x120de37) [0x7f639be0ae37]
  [bt] (1) /home/workspacae/installation/TVM/tvm/build/libtvm.so(tvm::DiagnosticContext::Render()+0x231) [0x7f639b420391]
  [bt] (0) /home/workspacae/installation/TVM/tvm/build/libtvm.so(+0x822f88) [0x7f639b41ff88]
  File "/home/workspacae/installation/TVM/tvm/src/ir/diagnostic.cc", line 105
DiagnosticError: one or more error diagnostics were emitted, please check diagnostic render for output.
```

I ran into the same issue. Thanks for asking about this.

The `freeze_params` option and the `relay.transform.DynamicToStatic()` pass help convert the dynamic shapes to static ones:

```python
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict, freeze_params=True)
mod = relay.transform.DynamicToStatic()(mod)
```

This solves my issue.
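For reference, the fix above can be wrapped into a small end-to-end sketch (untested here; the path `model.onnx` and the input name `"0"` are assumptions matching the `shape_dict` from the log). `freeze_params=True` folds the model weights into the graph as constants, and `DynamicToStatic` then rewrites shape-dependent (dynamic) ops into static ones, which avoids the `?` dimensions that trip the Relay type checker:

```python
def build_static_module(onnx_path="model.onnx",
                        shape_dict=None):
    """Load an ONNX model into Relay with static shapes.

    Imports are done lazily so the function can be defined
    without tvm/onnx installed.
    """
    import onnx
    from tvm import relay

    if shape_dict is None:
        # Input name and shape taken from the original log output.
        shape_dict = {"0": [1, 3, 256, 256]}

    onnx_model = onnx.load(onnx_path)
    # freeze_params=True embeds the weights as constants so that
    # shape computations can be constant-folded.
    mod, params = relay.frontend.from_onnx(
        onnx_model, shape_dict, freeze_params=True)
    # Rewrite dynamic ops (dyn.reshape, dyn.broadcast_to, ...) into
    # their static equivalents where shapes are now known.
    mod = relay.transform.DynamicToStatic()(mod)
    return mod, params
```

After this, `relay.build(mod, target, params=params)` should no longer hit the rank-mismatch diagnostic, since the module's tensor types carry concrete dimensions.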