Do we still need relay.nn.contrib_conv2d_NCHWc op?

Does anyone know why we have the relay.nn.contrib_conv2d_NCHWc op, while there is no layout-specific op for other layouts such as NHWC? Since a layout_transform op is inserted automatically and precomputed where possible, I don’t see a need for this op.

The only difference from the regular conv2d op is that it uses a specific type relation to bypass the shape check on the weight.
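To make the comparison concrete, here is a minimal sketch. The shapes, the 8c block factor, and the exact layout strings are just illustrative assumptions on my part, and the regular-conv2d form at the end is the hypothetical alternative being proposed, not something guaranteed to type-check today:

```python
from tvm import relay

data = relay.var("data", shape=(1, 32, 56, 56), dtype="float32")
weight = relay.var("weight", shape=(64, 32, 3, 3), dtype="float32")

# Pre-block both inputs with the existing layout_transform op. If the weight is
# a constant, FoldConstant can precompute this transform at compile time.
data_blocked = relay.layout_transform(data, "NCHW", "NCHW8c")
weight_blocked = relay.layout_transform(weight, "OIHW", "OIHW8i8o")

# What the x86 AlterOpLayout path emits today: the layout-specific op.
out_contrib = relay.nn.contrib_conv2d_nchwc(
    data_blocked, weight_blocked,
    channels=64, kernel_size=(3, 3), padding=(1, 1),
    data_layout="NCHW8c", kernel_layout="OIHW8i8o", out_layout="NCHW8c")

# The alternative this question suggests: the regular op with the same layout
# attributes, relying on layout_transform (above) for the weight pre-packing.
out_regular = relay.nn.conv2d(
    data_blocked, weight_blocked,
    channels=64, kernel_size=(3, 3), padding=(1, 1),
    data_layout="NCHW8c", kernel_layout="OIHW8i8o", out_layout="NCHW8c")
```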

The same question applies to the dense_prepack op introduced relatively recently in https://github.com/apache/tvm/pull/7404. If we add a weight_layout attribute to the dense op and update its type relation accordingly, I don’t see a need for introducing a new op. A rough sketch of what I mean is below.
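This is purely a sketch: the weight_layout attribute on nn.dense and the "NK16n" layout string are hypothetical and do not exist today, and the shapes are made up:

```python
from tvm import relay

x = relay.var("x", shape=(8, 128), dtype="float32")
w = relay.var("w", shape=(256, 128), dtype="float32")

# Express the weight pre-packing with the existing layout_transform op
# (assuming this particular blocked layout is expressible there).
w_packed = relay.layout_transform(w, "NK", "NK16n")

# Hypothetical: a weight_layout attribute on the regular dense op would let its
# type relation accept the packed weight, instead of adding a separate op.
# y = relay.nn.dense(x, w_packed, weight_layout="NK16n")  # not a real attribute today
y = relay.nn.dense(x, w)  # what we can write today, with the unpacked weight
```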

@yzhliu @vinx13 @comaniac

One more data point:

If the weight-layout transform cannot be expressed by the relay.layout_transform op, I can see why we would need a layout-specific op. But I’d rather update relay.layout_transform to support more transform types than introduce more ops.