When I am using the bitpack op in a model, it sometimes gets fused with the cast op above it, e.g. (other_ops+cast+bitpack), and sometimes it is not added to the fused group, giving (other_ops+cast), (bitpack).
I don't understand how TVM operator fusion works in this scenario. Any help?
It seems an OUT_ELEMWISE_FUSABLE op can only fuse with ELEMWISE ops.
cast is kElemWise. add is kBroadcast, but if the input and output shapes of add are the same, add is treated as ElemWise; that is why bitpack is not fused with the two ops before it.
relu is kElemWise.
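You can check which pattern each op is registered with by querying TOpPattern from Python. A minimal sketch, assuming a recent TVM build where these ops are registered:

```python
from tvm import relay
from tvm.relay.op import OpPattern

# Print the fusion pattern kind registered for each op.
for name in ["cast", "add", "nn.relu", "nn.bitpack", "nn.conv2d"]:
    op = relay.op.get(name)
    print(name, op.get_attr("TOpPattern"))

# Pattern levels used by the fusion pass:
# ELEMWISE=0 < BROADCAST=1 < INJECTIVE=2 < COMM_REDUCE=3 < OUT_ELEMWISE_FUSABLE=4
print(OpPattern.ELEMWISE, OpPattern.BROADCAST, OpPattern.INJECTIVE,
      OpPattern.COMM_REDUCE, OpPattern.OUT_ELEMWISE_FUSABLE)
```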
After adding a stop_fusion node before the cast op, the graph is transformed to:
cast+bitpack
custom-op
cast+add+multiply+add+relu
cast+bitpack
custom-op
cast+multiply+add+relu
cast+bitpack
The issue before inserting stop_fusion is that the bitpack op doesn't get fused with the cast. But after adding it, the custom-op ends up as a single-op group without being fused with anything.
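For reference, this is roughly how I insert the annotation; a minimal sketch with hypothetical shapes, dtypes and bitpack arguments, not my real model:

```python
import tvm
from tvm import relay

x = relay.var("x", shape=(1, 64, 56, 56), dtype="float32")
y = relay.nn.relu(x)
y = relay.annotation.stop_fusion(y)  # fusion is cut at this point
y = relay.cast(y, "uint8")
y = relay.nn.bitpack(y, bits=1, pack_axis=1, bit_axis=2, pack_type="uint32")

mod = tvm.IRModule.from_expr(relay.Function([x], y))
mod = relay.transform.InferType()(mod)
mod = relay.transform.FuseOps(fuse_opt_level=2)(mod)
print(mod)  # cast + bitpack now form their own fused function
```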
I am looking to transform the original graph to:
add+relu+cast+bitpack
custom-op+cast+add+multiply+add+relu+cast+bitpack
custom-op+cast+multiply+add+relu+cast+bitpack
But the issue is that two ops whose pattern is above the kBroadcast level can't be fused together:
if (lhs > kBroadcast && rhs > kBroadcast) {
  LOG(FATAL) << "Cannot merge two complex group together";
}
Is there any solution to achieve the above transformation? Thanks.
I want to fuse two ops of kind OUT_ELEMWISE_FUSABLE and INJECTIVE, but the existing fuse_ops.cc does not allow it.
If we look at bitserial convolution, bitpack is INJECTIVE and the convolution is OUT_ELEMWISE_FUSABLE, and the two work as a single op inside bitserial_conv2d. But if we fuse the two ops (i.e. nn.bitpack and custom_op_conv2d) explicitly, it throws the error "Cannot merge two complex group together".
I also didn't get how the schedule of the fused ops differs from the schedules of the individual ops.
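My rough understanding (please correct me if this is wrong) is that a fused group is lowered to a single kernel: the group takes the schedule of its "master" op, and the remaining elemwise/injective stages get inlined into that loop nest, which is presumably what bitserial_conv2d does with its internal packing stage. A simplified TE stand-in of what I mean, not the real bitserial_conv2d compute:

```python
import tvm
from tvm import te

n = te.var("n")
A = te.placeholder((n,), name="A")
P = te.compute((n,), lambda i: A[i] * 2.0, name="pack_like")  # stand-in for an injective producer (e.g. bitpack)
C = te.compute((n,), lambda i: P[i] + 1.0, name="master")     # stand-in for the OUT_ELEMWISE_FUSABLE op

s = te.create_schedule(C.op)
s[P].compute_inline()  # the injective stage disappears into the master's loop nest
print(tvm.lower(s, [A, C], simple_mode=True))
```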