[torch] zeros - strange output

The TVM output for torch.zeros(a.shape) is strange: it returns the input tensor as-is instead of a zeros tensor. Example:

import torch
import tvm
from tvm import relay
        
class Net(torch.nn.Module):
    def forward(self, a):
        y = torch.zeros(a.shape)
        return y
        

net = Net()
a = torch.tensor([6, 6])
net(a)  # eager PyTorch correctly returns tensor([0., 0.])

traced_net = torch.jit.trace(net, (a,))

shape_list = [("input0", [2]),]
mod, params = relay.frontend.from_pytorch(traced_net, shape_list)

ctx = tvm.cpu(0)
target = 'llvm'
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# Run the converted module with the graph executor; the result should be all zeros.
func = mod["main"]
intrp = relay.create_executor("graph", ctx=ctx, target=target)
ff = intrp.evaluate(func)
ff([6, 6])

Output

# Correct one:
tensor([0., 0.])

# TVM output:
<tvm.nd.NDArray shape=(2,), cpu(0)>
array([6., 6.], dtype=float32)

BTW, torch.ones has the same issue.
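
For completeness, here is a minimal torch.ones variant of the repro above (the names OnesNet, traced_ones, mod_ones are just illustrative, and it reuses a, relay, and the shape list from the earlier snippet):

# Hypothetical torch.ones variant of the same repro; expected output is
# tensor([1., 1.]), but the converted module again echoes the input values.
class OnesNet(torch.nn.Module):
    def forward(self, a):
        return torch.ones(a.shape)

traced_ones = torch.jit.trace(OnesNet(), (a,))
mod_ones, _ = relay.frontend.from_pytorch(traced_ones, [("input0", [2])])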

@siju-samuel, @t-vi, @masahi What do you think?

Interesting, this is weird. This is the mod:

def @main() {
  full(0, shape=[2], dtype="float32")
}

It doesn’t make any sense to return something other than zeros.
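
One way to narrow it down (just a sketch, not part of the original trace; expr and check_mod are made-up names): build the same full(0, shape=[2]) expression by hand in Relay and run it on its own. If that returns zeros, the op itself is fine and the problem is more likely in the PyTorch frontend or the executor path.

# Hypothetical standalone check of relay.full, independent of the PyTorch frontend.
expr = relay.full(relay.const(0.0), shape=(2,), dtype="float32")
check_mod = tvm.IRModule.from_expr(relay.Function([], expr))
out = relay.create_executor("graph", mod=check_mod, ctx=ctx, target=target).evaluate()()
print(out)  # expected: [0., 0.]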