Errors while compiling a PyTorch ScriptModule to TVM/Relay

Hi All, I am fairly new to TVM. I am seeing the following error when converting a PyTorch ScriptModule to Relay. Could anyone please help?


ERROR STACK TRACE:
------------------------------
Traceback (most recent call last):
...
  File "~/tvm-1/python/tvm/relay/frontend/pytorch.py", line 2193, in from_pytorch
    outputs, ret_name, convert_map, prelude)
 
  File "~/tvm-1/python/tvm/relay/frontend/pytorch.py", line 2086, in convert_operators
    outputs[node_name] = _expr.Tuple(inputs)
 
  File "~/tvm-1/python/tvm/relay/expr.py", line 186, in __init__
    self.__init_handle_by_constructor__(_ffi_api.Tuple, fields)
 
  File "~/tvm-1/python/tvm/_ffi/_ctypes/object.py", line 95, in __init_handle_by_constructor__
    handle = __init_by_constructor__(fconstructor, args)
 
  File "~/tvm-1/python/tvm/_ffi/_ctypes/packed_func.py", line 231, in __init_handle_by_constructor__
    raise get_last_ffi_error()
 
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) 9   libtvm.dylib                        0x0000000128609b75 void tvm::runtime::TypedPackedFunc<tvm::relay::Tuple (tvm::Array<tvm::RelayExpr, void>)>::AssignTypedLambda<tvm::relay::$_4>(tvm::relay::$_4)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const + 37
  [bt] (7) 8   libtvm.dylib                        0x0000000128609ba5 void tvm::runtime::detail::unpack_call<tvm::relay::Tuple, 1, tvm::relay::$_4>(tvm::relay::$_4 const&, tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) + 37
  [bt] (6) 7   libtvm.dylib                        0x0000000128609c20 void tvm::runtime::detail::unpack_call_dispatcher<tvm::relay::Tuple, 1, 0, tvm::relay::$_4>::run<>(tvm::relay::$_4 const&, tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) + 112
  [bt] (5) 6   libtvm.dylib                        0x0000000128609c6f void tvm::runtime::detail::unpack_call_dispatcher<tvm::relay::Tuple, 0, 1, tvm::relay::$_4>::run<tvm::runtime::TVMMovableArgValue_>(tvm::relay::$_4 const&, tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*, tvm::runtime::TVMMovableArgValue_&&) + 63
  [bt] (4) 5   libtvm.dylib                        0x0000000127b4749d tvm::runtime::TVMMovableArgValue_::operator tvm::Array<tvm::RelayExpr, void><tvm::Array<tvm::RelayExpr, void>, void>() const + 173
  [bt] (3) 4   libtvm.dylib                        0x0000000127b47f6c tvm::runtime::PackedFuncValueConverter<tvm::Array<tvm::RelayExpr, void> >::From(tvm::runtime::TVMArgValue const&) + 28
  [bt] (2) 3   libtvm.dylib                        0x0000000127b48213 tvm::Array<tvm::RelayExpr, void> tvm::runtime::TVMPODValue_::AsObjectRef<tvm::Array<tvm::RelayExpr, void> >() const + 547
  [bt] (1) 2   libtvm.dylib                        0x0000000126efb1e5 dmlc::LogMessageFatal::~LogMessageFatal() + 21
  [bt] (0) 1   libtvm.dylib                        0x0000000126efe5d3 dmlc::LogMessageFatal::~LogMessageFatal() + 67
  File "~/tvm-1/include/tvm/runtime/packed_func.h", line 1429

Original graph:

Graph is graph(%self.1 ,
      %list.1 : Tensor[]):
  %23 : bool = prim::Constant[value=1]() 
  %21 : float = prim::Constant[value=0.10000000000000001]() 
  %22 : float = prim::Constant[value=1.0000000000000001e-05]()
  %26 : float = prim::Constant[value=0.]() 
  %27 : float = prim::Constant[value=6.]()
  %_0_23_1.1 : __torch__.torch.nn.modules.batchnorm.BatchNorm2d = prim::GetAttr[name="0_23_1"](%self.1)
  %5 : Tensor, %6 : Tensor = prim::ListUnpack(%list.1)
  %8 : bool = aten::Bool(%6)
  %10 : Tensor = prim::GetAttr[name="running_mean"](%_0_23_1.1)
  %12 : Tensor = prim::GetAttr[name="running_var"](%_0_23_1.1)
  %15 : Tensor = prim::GetAttr[name="weight"](%_0_23_1.1)
  %17 : Tensor = prim::GetAttr[name="bias"](%_0_23_1.1)
  %input.1 : Tensor = aten::batch_norm(%5, %15, %17, %10, %12, %8, %21, %22, %23) 
  %result.1 : Tensor = aten::hardtanh_(%input.1, %26, %27) 
  %30 : Tensor[] = prim::ListConstruct(%result.1)
  %31 : (Tensor[]) = prim::TupleConstruct(%30)
  return (%31)

Can you post your Python code? It seems you are creating a tuple of a list, which our frontend complains about.

@masahi Thanks for your response. I will try to provide a simple Python script that reproduces this error. Just out of curiosity, isn’t a tuple(tensorlist) an expected type? I think this error comes in because the tensor-list outputs are inferred as an Array instead of a List of relay.Expr.
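In the meantime, here is a rough sketch of the part of the model that produces the graph above (the channel count, module layout, and handling of the second list element are guesses from the graph dump, not my actual code):

import torch
from typing import List


class Block(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.bn = torch.nn.BatchNorm2d(32)        # channel count is a guess
        self.act = torch.nn.ReLU6(inplace=True)   # lowers to aten::hardtanh_(0, 6)

    def forward(self, inputs: List[torch.Tensor]):
        x = inputs[0]
        out = self.act(self.bn(x))
        # The graph ends with ListConstruct followed by TupleConstruct,
        # i.e. the module returns a tuple that contains a list of tensors.
        return ([out],)


scripted = torch.jit.script(Block())
# relay.frontend.from_pytorch(scripted, input_infos) then fails inside
# convert_operators with the traceback shown above.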

The error is happening because ListConstruct in this case returns a Python list, and TupleConstruct then tries to wrap a relay.Tuple around it. Mixing Python values and Relay constructs is not possible.
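To see concretely what that means: the fields of a relay.Tuple must themselves be Relay expressions, so handing it a plain Python list fails the same Array<RelayExpr> conversion that shows up in your traceback. A minimal sketch (not your code):

import tvm
from tvm import relay

x = relay.var("x", shape=(1, 16, 8, 8))

relay.Tuple([x])        # fine: every field is a Relay expression

try:
    relay.Tuple([[x]])  # the inner element is a plain Python list
except tvm._ffi.base.TVMError:
    print("a Python list cannot be used as a Tuple field")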

Most likely your code needs to be modified to remove the unnecessary list and tuple creation at the end (since you are essentially returning just one tensor).
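For instance, if the last block currently returns something like ([out],), returning the tensor directly avoids creating the list and tuple altogether (a sketch, assuming the model really produces a single output tensor):

import torch


class Block(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.bn = torch.nn.BatchNorm2d(32)
        self.act = torch.nn.ReLU6(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Return the single tensor directly instead of ([out],)
        return self.act(self.bn(x))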