[BYOC] PyTorch upsample_bilinear2D not represented correctly in Relay

I have two interpolate ops defined in PyTorch that take a (10, 10, 128) tensor and produce a (20, 20, 128) tensor in two steps: (10, 10, 128) → (10, 20, 128) → (20, 20, 128).

The corresponding Relay ops are incorrectly interpreted as: (10, 10, 128) → (10, 10, 128) → (19, 19, 128).

Looking at the PyTorch frontend, I see a function invoked to grab the output dimensions for the Relay image resize op:

def get_upsample_out_size(self, inputs, method):
    # This assumes a static shape
    out_size = []
    if inputs[1] is not None:
        for size in inputs[1]:
            if not isinstance(size, int):
                out_size.append(int(_infer_value(size, {}).numpy()))
            else:
                out_size.append(size)
    return out_size  # return added here to complete the truncated quote

I see (10, 128) and (19, 128) picked up as the sizes here.
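For what it's worth, the extraction loop above just copies whatever sizes the traced graph recorded. A hypothetical pure-Python mimic (not the actual TVM code) to illustrate that a wrong value like 19 would have to already be present in the graph inputs the frontend receives:

```python
# Hypothetical mimic of the size-extraction loop above: it copies the
# recorded sizes verbatim, so the frontend cannot invent a 19 on its own.
def extract_out_size(size_arg):
    out_size = []
    for size in size_arg:
        if not isinstance(size, int):
            raise NotImplementedError("dynamic sizes need shape inference")
        out_size.append(size)
    return out_size

print(extract_out_size([19, 19]))  # [19, 19] -- copied verbatim
```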

I have verified the 2D resizes behave properly in the PyTorch model. Why is the Relay op not copying the sizes here properly?

Or should these 2D resizes actually be written as 1D resizes in Relay?
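My understanding is that align_corners only changes how source coordinates are sampled, not the output size, so requesting size=(20, 20) should always yield 20 rows. A minimal pure-Python sketch of the two standard coordinate-mapping conventions (written from the commonly documented bilinear-resize formulas, not taken from the TVM source):

```python
def src_coord(dst_idx, in_size, out_size, align_corners):
    """Map an output index back to a (fractional) input coordinate."""
    if align_corners:
        # Corners of the input and output grids coincide.
        if out_size == 1:
            return 0.0
        return dst_idx * (in_size - 1) / (out_size - 1)
    # Half-pixel-centers convention.
    return (dst_idx + 0.5) * in_size / out_size - 0.5

# Resizing 10 -> 20 with align_corners=True: the first and last output
# pixels map exactly onto the first and last input pixels.
print(src_coord(0, 10, 20, True))    # 0.0
print(src_coord(19, 10, 20, True))   # 9.0
```

Either way, the output length is whatever is requested; align_corners only moves the sample points.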

Tagging @masahi for PyTorch frontend expertise.

Are you passing the output size or the scale factor to torch.nn.functional.interpolate? It seems the output size passed to TVM is recognized as 19. Please dig more into how this size is supposed to be calculated in PyTorch.

Hi @masahi

Here’s the relevant code in PT.

    H_x, W, H = 10, 20, 20

    x = torch.ao.nn.quantized.functional.interpolate(input=x, size=(H_x, W), mode='bilinear', align_corners=True)


    print("X shape after first resize: ", x.shape)

    x = torch.ao.nn.quantized.functional.interpolate(input=x, size=(H, W), mode='bilinear', align_corners=True)

    print("X shape after second resize: ", x.shape)

And here’s the output:

X shape after first resize: torch.Size([1, 256, 10, 20])

X shape after second resize: torch.Size([1, 256, 20, 20])

There is no size passed as 19. Could align_corners=True be causing Relay to compute 19 instead of 20? I am also wondering whether these should be 1D resize ops instead, since I am only doubling one dimension at a time. On the other hand, you are correct that TVM itself is being passed 19; I am not sure whether that points to an issue in PT rather than TVM.
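One arithmetic coincidence that may be worth checking (purely a hypothesis on my part, not confirmed from the TVM or PyTorch source): if the exporter drops the explicit size and recomputes it from a recorded scale factor using an align_corners-style formula, (in - 1) * scale + 1, then doubling the 10-pixel dimension yields exactly 19 rather than 20:

```python
def recomputed_size(in_size, scale, align_corners_style):
    # Hypothetical recomputation of output size from a recorded scale
    # factor; NOT taken from the PyTorch or TVM source.
    if align_corners_style:
        # Treats the axis as (in_size - 1) intervals stretched by `scale`.
        return (in_size - 1) * scale + 1
    return in_size * scale

print(recomputed_size(10, 2, True))   # 19 <- matches the bad Relay size
print(recomputed_size(10, 2, False))  # 20 <- the expected size
```

If something along these lines is happening during tracing, it would explain why TVM receives 19 even though size=(20, 20) is what the script passes.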