Hi, I’m importing a PyTorch SRGAN model (trained with https://github.com/eriklindernoren/PyTorch-GAN/tree/master/implementations/srgan) into TVM via relay.frontend.from_pytorch, and I hit the error below (a rough sketch of my conversion code follows the traceback):
```
Traceback (most recent call last):
File "/home/incubator-tvm/tutorials/test_pytorch/srgan/srgan_clear_pytorch.py", line 114, in <module>
mod, params = relay.frontend.from_pytorch(scripted_model, shape_dict)
File "/home/incubator-tvm/python/tvm/relay/frontend/pytorch.py", line 2650, in from_pytorch
outputs, ret_name, convert_map, prelude)
File "/home/incubator-tvm/python/tvm/relay/frontend/pytorch.py", line 2559, in convert_operators
relay_out = relay_op(inputs, _get_input_types(op_node))
File "/home/incubator-tvm/python/tvm/relay/frontend/pytorch.py", line 834, in _impl
channels = _infer_shape(data)
File "/home/incubator-tvm/python/tvm/relay/frontend/common.py", line 486, in infer_shape
out_type = infer_type(inputs, mod=mod)
File "/home/incubator-tvm/python/tvm/relay/frontend/common.py", line 465, in infer_type
new_mod = IRModule.from_expr(node)
File "/home/incubator-tvm/python/tvm/ir/module.py", line 222, in from_expr
return _ffi_api.Module_FromExpr(expr, funcs, defs)
File "/home/incubator-tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 225, in __call__
raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
[bt] (8) /home/incubator-tvm/build/libtvm.so(TVMFuncCall+0x65) [0x7fc13bd5f545]
[bt] (7) /home/incubator-tvm/build/libtvm.so(+0xc334e3) [0x7fc13b4014e3]
[bt] (6) /home/incubator-tvm/build/libtvm.so(tvm::IRModule::FromExpr(tvm::RelayExpr const&, tvm::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&)+0x28a) [0x7fc13b3feb2a]
[bt] (5) /home/incubator-tvm/build/libtvm.so(tvm::IRModuleNode::Add(tvm::GlobalVar const&, tvm::BaseFunc const&, bool)+0x3ef) [0x7fc13b3fc66f]
[bt] (4) /home/incubator-tvm/build/libtvm.so(tvm::RunTypeCheck(tvm::IRModule const&, tvm::GlobalVar const&, tvm::relay::Function)+0x246) [0x7fc13b3fa6d6]
[bt] (3) /home/incubator-tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Function const&, tvm::IRModule const&, tvm::GlobalVar const&)+0x1bc) [0x7fc13bb963bc]
[bt] (2) /home/incubator-tvm/build/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::RelayExpr)+0x71) [0x7fc13bb95af1]
[bt] (1) /home/incubator-tvm/build/libtvm.so(tvm::ErrorReporter::RenderErrors(tvm::IRModule const&, bool)+0x228b) [0x7fc13b3e623b]
[bt] (0) /home/incubator-tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x82) [0x7fc13b3b2992]
File "/home/incubator-tvm/src/ir/error.cc", line 132
TVMError:
Error(s) have occurred. The program has been annotated with them:
In `main`:
v0.0.4
fn (%input.1: Tensor[(1, 3, 64, 64), float32], %v17: Tensor[(64, 3, 9, 9), float32], %v16: Tensor[(64), float32], %v36: Tensor[(1), float32], %v61: Tensor[(64, 64, 3, 3), float32], %v60: Tensor[(64), float32]) {
%0 = nn.conv2d(%input.1, %v17, padding=[4, 4, 4, 4], channels=64, kernel_size=[9, 9]);
%1 = nn.bias_add(%0, %v16);
**%2 = nn.prelu(%1, %v36) in particular dimension 0 conflicts 64 does not match 1; unable to unify: `Tensor[(64), float32]` and `Tensor[(1), float32]`; ;**
%3 = nn.conv2d(%2, %v61, padding=[1, 1, 1, 1], channels=64, kernel_size=[3, 3]);
nn.bias_add(%3, %v60)
}
```
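For reference, the conversion step looks roughly like this. This is a minimal sketch: the checkpoint path and variable names are approximate, but the input shape (1, 3, 64, 64) and the input name "input.1" match the annotated program above:

```python
import torch
from tvm import relay

# Generator definition from the SRGAN repo's models.py
# (assumes the repo directory is on PYTHONPATH).
from models import GeneratorResNet

# Load the trained generator weights (checkpoint name is approximate).
model = GeneratorResNet()
model.load_state_dict(torch.load("saved_models/generator.pth", map_location="cpu"))
model.eval()

# Trace the model to TorchScript with a dummy low-resolution input.
input_shape = (1, 3, 64, 64)
input_data = torch.randn(input_shape)
scripted_model = torch.jit.trace(model, input_data).eval()

# Convert to Relay; "input.1" is the input name shown in the annotated program.
shape_dict = [("input.1", input_shape)]
mod, params = relay.frontend.from_pytorch(scripted_model, shape_dict)
```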
What does this error mean? “in particular dimension 0 conflicts 64 does not match 1”
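If it helps, the %v36: Tensor[(1), float32] argument passed to nn.prelu appears to come from the generator’s first PReLU layer. The repo’s models.py builds that block roughly as follows (simplified; the conv parameters match the nn.conv2d call in the annotated program):

```python
import torch.nn as nn

# First block of the SRGAN generator (simplified). nn.PReLU() uses
# num_parameters=1 by default, i.e. a single learnable alpha shared
# across all 64 output channels of the preceding convolution.
conv_block = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=9, stride=1, padding=4),
    nn.PReLU(),
)
```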