"Optional" op for TVM pattern matching on C++ side

Dear all,

I came across the “optional” op being used in pattern matching on the Python side.

I am attempting pattern matching on the C++ side of TVM and need the equivalent of the optional op there.

@mbrookhart @mbs-octoml @masahi

Hi, it looks like that just unfolds to iterated AltPatterns (bound to | in Python):

  pattern = is_op("nn.conv2d")(wildcard(), wildcard())
  pattern = pattern | is_op("nn.relu")(pattern)
  pattern = pattern | is_op("tanh")(pattern)

You can mimic that on the C++ side, or send a PR to add a helper in dataflow_pattern.{h,cc}?
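For reference, a rough, untested sketch of doing that unfolding by hand on the C++ side (assuming the usual sugar from include/tvm/relay/dataflow_pattern.h, where operator|| builds an AltPattern and IsOp/IsWildcard are the free helpers):

// Sketch: hand-rolled "optional" chain via AltPattern (operator||),
// mirroring the Python snippet above.
DFPattern pattern = IsOp("nn.conv2d")({IsWildcard(), IsWildcard()});
pattern = pattern || IsOp("nn.relu")({pattern});
pattern = pattern || IsOp("tanh")({pattern});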

I am obscenely behind, because I manage 14 people at OctoML. :smiley:

But hey, a rare couple of hours of free time popped up this morning.

Hey @mbrookhart ! Thanks a lot for this patch.

Hi all, I tried this patch for the optional pattern in C++.

I am trying it out by specifying this simple pattern to match: bias_add(conv2d, biasval), where bias_add is optional.

I am doing the following to apply the optional pattern:

x_ = IsWildcard();
const1_ = IsWildcard();
const2_ = IsWildcard();
const3_ = IsWildcard();
const4_ = IsWildcard();
auto conv2d = IsOp("nn.conv2d");
auto biasadd = IsOp("nn.bias_add");
auto relu = IsOp("nn.relu");
DFPattern pattern1 = conv2d({x_, const1_});
DFPattern pattern2 = pattern1.Optional(IsOp("nn.bias_add")({pattern1, const2_}));
pattern_ = pattern2;

I get this error:

/local/mnt/workspace/aakaverm/experiment_tvm/tvm/src/relay/transforms/aimetcls.cc:55:44: error: no viable conversion from 'tvm::relay::DFPattern' to 'const std::function<DFPattern (const DFPattern &)>'
  DFPattern pattern2 = pattern1.Optional(IsOp("nn.bias_add")({pattern1, const2_}));

Not sure how to apply the optional op. I referred to this file: https://github.com/apache/tvm/blob/main/tests/cpp/dataflow_pattern_test.cc#L152

I see the Optional syntax on line 152 but am not able to understand it. Could you please briefly explain what we need to give as input to Optional?

I understand that Optional accepts a callable, but does this callable need to be defined for each op based on the op's functionality? From the test case it looks like we need to define the functionality of the op that we want to be optional?

cc: @mbrookhart, @mbs-octoml, @masahi

Thank you.

The usage should be similar to the corresponding Python API. See for example https://github.com/apache/tvm/blob/8a0472fd3e3138ff17391b449cc02de3ab2c4207/python/tvm/relay/op/contrib/arm_compute_lib.py#L144-L145

For example, I’d try pattern1.Optional([=](const DFPattern& x) { return x + const2_; });
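Putting that together with the names from the earlier snippet, a minimal sketch (untested, local declarations assumed) might be:

DFPattern x_ = IsWildcard();
DFPattern const1_ = IsWildcard();
DFPattern const2_ = IsWildcard();
DFPattern conv2d = IsOp("nn.conv2d");
DFPattern pattern1 = conv2d({x_, const1_});
// Optional takes a callable that maps the pattern matched so far to the optional
// extension; here operator+ is the CallPattern sugar for the "add" op.
DFPattern pattern2 = pattern1.Optional([=](const DFPattern& x) { return x + const2_; });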


Hi @masahi, I was wondering if I can use it this way. Basically, I wish to capture the op's functionality inside the lambda.

I see the example you gave, where you defined the functionality in the return statement as a basic expression, but if I simply want to use the functionality of an op (like relu, etc.), then I would write the return as I tried below for bias_add.

The build goes through successfully, but I am not sure if this return is causing the error when I apply the pass to the model.

auto conv2d = IsOp("nn.conv2d");
auto biasadd = IsOp("nn.bias_add");
auto relu = IsOp("nn.relu");
DFPattern pattern1 = conv2d({x_, const1_});
DFPattern pattern2 = pattern1.Optional([this, biasadd](const DFPattern& x) {return biasadd({x, const2_});});

    mod = relay.transform.PatternMatchRewrite()(mod)
  File "/local/mnt/workspace/aakaverm/experiment_tvm/tvm/python/tvm/ir/transform.py", line 161, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "/local/mnt/workspace/aakaverm/experiment_tvm/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  24: TVMFuncCall
  23: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<void tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::transform::Pass, tvm::IRModule)>::AssignTypedLambdatvm::transform::$_6(tvm::transform::$_6, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator >)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  22: tvm::transform::Pass::operator()(tvm::IRModule) const
  21: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  20: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  19: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<void tvm::runtime::TypedPackedFunc<tvm::relay::Function (tvm::relay::Function, tvm::IRModule, tvm::transform::PassContext)>::AssignTypedLambdatvm::relay::transform::AimetCrossLayerEqualize()::$_0(tvm::relay::transform::AimetCrossLayerEqualize()::$_0)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  18: tvm::relay::AimetCrossLayerEqualize(tvm::RelayExpr const&, tvm::IRModule const&)
  17: tvm::relay::RewritePatterns(tvm::runtime::Array<tvm::relay::DFPatternCallback, void>, tvm::RelayExpr, tvm::IRModule)
  16: tvm::relay::PatternRewriter::Rewrite(tvm::runtime::Array<tvm::relay::DFPatternCallback, void> const&, tvm::RelayExpr const&)
  15: tvm::relay::MixedModeMutator::VisitExpr(tvm::RelayExpr const&)
  14: tvm::relay::MixedModeMutator::VisitLeaf(tvm::RelayExpr const&)
  13: tvm::relay::PatternRewriter::DispatchVisitExpr(tvm::RelayExpr const&)
  12: _ZN3tvm5relay1
  11: tvm::relay::ExprMutator::VisitExpr(tvm::RelayExpr const&)
  10: tvm::relay::ExprFunctor<tvm::RelayExpr (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)
  9: tvm::NodeFunctor<tvm::RelayExpr (tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::RelayExpr (tvm::RelayExpr const&)>)>::operator()(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<tvm::RelayExpr (tvm::RelayExpr const&)>) const
  8: ZZN3tvm5relay11ExprFunc
  7: tvm::relay::ExprMutator::VisitExpr(tvm::relay::FunctionNode const*)
  6: tvm::relay::MixedModeMutator::VisitExpr(tvm::RelayExpr const&)
  5: tvm::relay::MixedModeMutator::VisitLeaf(tvm::RelayExpr const&)
  4: tvm::relay::PatternRewriter::DispatchVisitExpr(tvm::RelayExpr const&)
  3: tvm::relay::DFPatternRewrite::MakeCallback() const::'lambda'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const
  2: tvm::relay::AimetCrossLayerScaling::Callback(tvm::RelayExpr const&, tvm::RelayExpr const&, tvm::runtime::Map<tvm::relay::DFPattern, tvm::runtime::Array<tvm::RelayExpr, void>, void, void> const&) const
  1: tvm::runtime::Map<tvm::relay::DFPattern, tvm::runtime::Array<tvm::RelayExpr, void>, void, void>::at(tvm::relay::DFPattern const&) const
  0: tvm::runtime::DenseMapNode::At(tvm::runtime::ObjectRef const&) const
  File "/local/mnt/workspace/aakaverm/experiment_tvm/tvm/include/tvm/runtime/container/map.h", line 644
TVMError: An error occurred during the execution of TVM. For more information, please see: Handle TVM Errors — tvm 0.21.dev0 documentation
  Check failed: (!iter.IsNone()) is false: IndexError: key is not in Map

This looks like maybe your callback is doing something wrong in the case where the optional pattern isn't hit? Wild guess: you need to check that the optional sub-pattern exists in the node map before doing the query.
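In case it helps, a hedged sketch of such a guard inside a DFPatternRewrite subclass (the Callback signature matches your stack trace; biasadd_pat_ is a hypothetical member holding the optional bias_add sub-pattern, so adapt the names to your pass):

// Sketch only: node_map contains entries only for sub-patterns that matched,
// so guard the lookup of the optional part before calling at()/operator[].
Expr Callback(const Expr& pre, const Expr& post,
              const Map<DFPattern, Array<Expr>>& node_map) const override {
  if (node_map.count(biasadd_pat_)) {
    // conv2d + bias_add matched: the bound bias_add call is available here.
    Expr bias_call = node_map[biasadd_pat_][0];
    // ... rewrite using bias_call ...
  } else {
    // bare conv2d matched: do not query node_map for the optional pattern.
    // ... rewrite the conv2d-only case ...
  }
  return post;
}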