[TVM Quantization] Fails for specific OP

Hello Community. I implemented the TF Resampler OP. Since its logic is very similar to the already implemented "Upsampling" op, I basically mirrored all the Upsampling logic onto the Resampler op, tweaking the necessary places. Plain F32 execution of a dummy model works. However, applying quantization at the TVM level to a dummy model containing only my custom Resampler OP yields the following error:

  [bt] (8) tvm_path/build/libtvm.so(+0x28ae0fb) [0x7f39bf1d80fb]
  [bt] (7) tvm_path/build/libtvm.so(tvm::relay::MixedModeVisitor::VisitLeaf(tvm::RelayExpr const&)+0x72) [0x7f39bf1d7f90]
  [bt] (6) tvm_path/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x12e) [0x7f39be62ba2c]
  [bt] (5) tvm_path/build/libtvm.so(tvm::NodeFunctor<void (tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*) const+0x140) [0x7f39be62ca58]
  [bt] (4) tvm_path/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*)#14}::_FUN(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*)+0x29) [0x7f39be62c58e]
  [bt] (3) tvm_path/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*)#14}::operator()(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*) const+0x42) [0x7f39be62c55e]
  [bt] (2) tvm_path/build/libtvm.so(tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::ConstructorNode const*)+0x51) [0x7f39bef0bbd1]
  [bt] (1) tvm_path/build/libtvm.so(tvm::IRModuleNode::LookupTypeDef(tvm::GlobalTypeVar const&) const+0x116) [0x7f39be51eddc]
  [bt] (0) tvm_path/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x52) [0x7f39be1fb152]
  File "tvm_path/src/ir/module.cc", line 272
TVMError: 
---------------------------------------------------------------
An internal invariant was violated during the execution of TVM.
Please read TVM's error reporting guidelines.
More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
---------------------------------------------------------------
  Check failed: it != type_definitions.end() == false: There is no definition of List

The same error occurs when trying to quantize a dummy model containing only the Upsampling OP from Tensorflow (which was not developed by me). Could there therefore be a bug in the TVM stack regarding the Upsampling op? Any suggestions on how to investigate this?

Best regards, Knight3

@masahi do you have any insights regarding this problem?

I don’t know about the TF frontend, but that error can usually be fixed by running infer_type with the prelude module loaded. The hint is that the List type exists only in the prelude module.

The PyTorch frontend has exactly such a function for that purpose.

Thanks for your quick reply! What you say about the prelude makes sense, but I'm not sure it applies to my use case, since the model I'm importing is still in F32 and I only apply PTQ to the already-built IR module. And if I'm wrong and I actually do need to infer types with the prelude, that is not straightforward for me, since as far as I can see the Tensorflow frontend file in TVM does not use such a function. Do you know someone who has worked more intensively with the Tensorflow frontend and might know more about this problem?

@kevinthesun might be able to help