Relay Can't Merge 2 Constant "add" Operators?

I found the Relay IR below while doing some work with ResNet-50. As you can see, the two add operators (%22 and %23) could be merged into a single add. The log below was built with opt_level=3.

  %21 = nn.conv2d(%20, meta[relay.Constant][2] /* ty=Tensor[(32, 8, 1, 1, 8, 8), int8] */, padding=[0, 0, 0, 0], channels=256, kernel_size=[1, 1], data_layout="NCHW8c", kernel_layout="OIHW8i8o", out_layout="NCHW8c", out_dtype="int32") /* ty=Tensor[(1, 32, 56, 56, 8), int32] */;
  %22 = add(%21, meta[relay.Constant][3] /* ty=Tensor[(1, 32, 1, 1, 8), int32] */) /* ty=Tensor[(1, 32, 56, 56, 8), int32] */;
  %23 = add(%22, 32 /* ty=int32 */) /* ty=Tensor[(1, 32, 56, 56, 8), int32] */;
  %24 = right_shift(%23, 6 /* ty=int32 */) /* ty=Tensor[(1, 32, 56, 56, 8), int32] */;
  %25 = clip(%24, a_min=-127f, a_max=127f) /* ty=Tensor[(1, 32, 56, 56, 8), int32] */;
  %26 = cast(%25, dtype="int8") /* ty=Tensor[(1, 32, 56, 56, 8), int8] */;

Does anyone know whether Relay supports this optimization now?

From the above test case, it seems the transform.FoldConstant pass doesn't handle this optimization. Does anybody know why it isn't implemented? Is it an optimization that shouldn't be done, or has nobody had time to do it yet? Thanks.
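For reference, here is a minimal reproduction sketch I put together (the shapes are borrowed from the log above, with a plain variable standing in for the conv2d output, and the constant values are made up). FoldConstant leaves both adds in place, presumably because neither add has all-constant inputs:

```python
import numpy as np
import tvm
from tvm import relay

# Stand-in for the conv2d output in the log above (a non-constant input).
x = relay.var("x", shape=(1, 32, 56, 56, 8), dtype="int32")
# Placeholder constants matching the shapes/dtypes of meta[relay.Constant][3] and the scalar 32.
c1 = relay.const(np.ones((1, 32, 1, 1, 8), dtype="int32"))
c2 = relay.const(32, dtype="int32")

y = relay.add(relay.add(x, c1), c2)
mod = tvm.IRModule.from_expr(relay.Function([x], y))
mod = relay.transform.InferType()(mod)

# Both adds survive: FoldConstant only evaluates subtrees whose inputs are all
# constants, and here the first operand of each add depends on the variable x.
print(relay.transform.FoldConstant()(mod))
```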


I think that looks like a bug, perhaps file an issue?
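In the meantime, a pattern-based rewrite might work around it by reassociating add(add(x, c1), c2) into add(x, add(c1, c2)) so that FoldConstant can collapse the constant part. This is only a rough sketch using the Relay dataflow pattern API; the class name and the toy expression are mine, not anything that ships with TVM:

```python
import numpy as np
import tvm
from tvm import relay
from tvm.relay.dataflow_pattern import (
    DFPatternCallback, is_constant, is_op, rewrite, wildcard)

class MergeConsecutiveAdds(DFPatternCallback):
    """Rewrite add(add(x, c1), c2) -> add(x, c1 + c2) when c1 and c2 are constants."""

    def __init__(self):
        super().__init__()
        self.x = wildcard()
        self.c1 = is_constant()
        self.c2 = is_constant()
        self.pattern = is_op("add")(is_op("add")(self.x, self.c1), self.c2)

    def callback(self, pre, post, node_map):
        x = node_map[self.x][0]
        c1 = node_map[self.c1][0]
        c2 = node_map[self.c2][0]
        # add(c1, c2) has only constant inputs, so a later FoldConstant pass
        # can collapse it into a single constant tensor.
        return relay.add(x, relay.add(c1, c2))

# Toy expression mirroring the shapes in the log above.
x = relay.var("x", shape=(1, 32, 56, 56, 8), dtype="int32")
c1 = relay.const(np.ones((1, 32, 1, 1, 8), dtype="int32"))
expr = relay.add(relay.add(x, c1), relay.const(32, dtype="int32"))

merged = rewrite(MergeConsecutiveAdds(), expr)
mod = tvm.IRModule.from_expr(relay.Function([x], merged))
mod = relay.transform.InferType()(mod)
print(relay.transform.FoldConstant()(mod))  # only one add should remain
```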