Is this situation supported by relay.build: 2 ops with their own schedules registered within 1 Relay function

Hi, I ran into an issue when dealing with a function like this:

fn (%x: Tensor[(64, 64), int8], %w: Tensor[(64), int8], %b: Tensor[(64), int8], %y: Tensor[(64, 64), int8]) {
  %0 = qnn.layer_norm(%x, %w, %b, epsilon=1.52588e-05f, ln_shift=7, affine_shift=8);
  nn.dense(%0, %y, units=64, out_dtype="int8")
}

error msg:

Check failed: (!anchor_op_.defined() || anchor_op_pattern_ < kCommReduce) is false: Cannot apply TOPI schedule to a primitive function with two complicated ops anchor=Op(qnn.layer_norm) current=Op(nn.dense)
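For context, this check fires because both ops carry a TOpPattern at or above kCommReduce, so the compiler refuses to pick a single TOPI schedule for the fused function. A minimal self-contained sketch of the condition, using the standard Relay OpPattern numbering (kElemWise=0, kBroadcast=1, kInjective=2, kCommReduce=3, kOutEWiseFusable=4); treating both ops as kOutEWiseFusable is my assumption:

```python
# Standard Relay TOpPattern numbering (op_attr_types.h).
K_ELEM_WISE = 0
K_BROADCAST = 1
K_INJECTIVE = 2
K_COMM_REDUCE = 3
K_OUT_EWISE_FUSABLE = 4

def can_apply_topi_schedule(anchor_pattern, current_pattern):
    """Mirrors the failing check: once an anchor op with pattern
    >= kCommReduce exists, a second such "complicated" op in the
    same primitive function cannot also be TOPI-scheduled."""
    return anchor_pattern < K_COMM_REDUCE or current_pattern < K_COMM_REDUCE

# Assumed patterns: qnn.layer_norm and nn.dense both kOutEWiseFusable,
# i.e. two complicated ops in one fused function -> check fails.
print(can_apply_topi_schedule(K_OUT_EWISE_FUSABLE, K_OUT_EWISE_FUSABLE))  # False
print(can_apply_topi_schedule(K_ELEM_WISE, K_OUT_EWISE_FUSABLE))          # True
```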

qnn.layer_norm and nn.dense both have their own schedule strategy registered for a third-party target. I am using plain relay.build (no AutoTVM), and I just want each op to use its own registered schedule rather than having the two fused together. Can this be achieved?

Could anyone help, and what should I do as the next step? Thanks very much.


I updated the TOpPattern of qnn.layer_norm from kOutEWiseFusable to kElemWise, and it passes the check then, but it seems that only the schedule of nn.dense is launched, while the schedule of qnn.layer_norm is skipped.

The code after the check shows that the anchor_op's information gets overwritten by that of the current op:

if (op_pattern >= anchor_op_pattern_) {
  // The current op takes over as anchor whenever its pattern is
  // at least as "complicated" as the previous anchor's.
  anchor_op_ = op;
  anchor_attrs_ = call_node->attrs;
  anchor_op_pattern_ = op_pattern;
  anchor_implementation_ = impl;
}