Index_put operator in Relay

Which operator in Relay can be used to set multiple values at given indices of a tensor?

Let's say I have a 3x3 tensor “A” and I need to set

A[0,0] = 2.0
A[1,1] = 4.0
A[2,1] = 7.0
A[2,2] = 9.0

PyTorch code:

import torch

A = torch.zeros(3, 3)
hs = torch.tensor([0, 1, 2, 2])   # row index of each value
ws = torch.tensor([0, 1, 1, 2])   # column index of each value
vs = torch.tensor([2.0, 4.0, 7.0, 9.0])
A.index_put(indices=[hs, ws], values=vs)   # returns a new tensor; index_put_ modifies A in place

tensor([[2., 0., 0.],
        [0., 4., 0.],
        [0., 7., 9.]])
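
For reference, index_put with accumulate=False (the default) does the same thing as plain advanced-indexing assignment, so the example above can also be written as:

A[hs, ws] = vs                 # advanced-indexing assignment
A.index_put_((hs, ws), vs)     # equivalent in-place form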

Probably scatter_nd, see [RELAY,TOPI] Add scatter_nd op by tkonolige · Pull Request #6854 · apache/tvm · GitHub

Note that Relay is a functional language, so modifying tensors in place is awkward. In this case, the tensor you are modifying is all zeros by default, so you can use scatter_nd.
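
For the 3x3 example above, a minimal sketch of that scatter_nd form (assuming the scatter_nd(data, indices, out_shape) signature from that PR, where data holds the update values and indices is an (ndim, N) integer tensor) could look like:

import numpy as np
from tvm import relay

indices = relay.const(np.array([[0, 1, 2, 2],     # row index of each update
                                [0, 1, 1, 2]]))   # column index of each update
values = relay.const(np.array([2.0, 4.0, 7.0, 9.0], dtype="float32"))
out = relay.scatter_nd(values, indices, (3, 3))   # positions not written stay zero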

The actual TorchScript from one of the models looks like the following.

TorchScript Python:

output = torch.zeros([_17, int(num_channels), 7, 7], dtype=6, layout=None, pin_memory=False)
output0 = torch.index_put_(output, _20, _19, False)

TorchScript IR:

%output.1 : Float(1000, 256, 7, 7, strides=[12544, 49, 7, 1], requires_grad=0) = aten::zeros(%1990, %1942, %1945, %1943, %1944)
%output.2 : Float(1000, 256, 7, 7, strides=[12544, 49, 7, 1], requires_grad=0) = aten::index_put_(%output.1, %2000, %1999, %1944)

scatter_nd might work if we assume that index_put_ usually writes the values into a freshly created zero tensor.

import numpy as np
import tvm
from tvm import relay

ctx = tvm.cpu(0)
target = 'llvm'
dtype = 'float32'
shape = (3, 3)
tp = relay.TensorType(shape, dtype)
data = np.random.rand(*shape).astype(dtype)

hs = [0, 1, 2, 2]
ws = [0, 1, 1, 2]
vs = [2.0, 4.0, 7.0, 9.0]
indices = tvm.nd.array(np.array([hs, ws]), ctx)         # shape (2, 4): one row of indices per output dimension
values = tvm.nd.array(np.array(vs).astype(dtype), ctx)  # the 4 values to scatter

indices_re = relay.Constant(indices)
values_re = relay.Constant(values)
y = relay.scatter_nd(values_re, indices_re, shape)      # scatter values into a zero tensor of `shape`

x = relay.var("x", tp)   # unused placeholder input

func = relay.Function([x], y)
intrp = relay.create_executor("graph", ctx=ctx, target=target)
intrp.evaluate(func)(data)

<tvm.nd.NDArray shape=(3, 3), cpu(0)>
array([[2., 0., 0.],
       [0., 4., 0.],
       [0., 7., 9.]], dtype=float32)

I faced an issue with scatter_nd and concatenate; see the thread Scatter_nd and concatenate fails.

PR to add torch.index_put operator to TVM PyTorch frontend
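
Not the actual frontend code from that PR, but a rough sketch of how the conversion could look, under the same zero-initialized-output assumption and the scatter_nd(data, indices, out_shape) signature used above:

from tvm import relay

def index_put_to_scatter_nd(index_tensors, values, out_shape):
    # index_tensors: one 1-D Relay expression of indices per output dimension
    #                (the hs/ws tensors in the example above).
    # values:        1-D Relay expression with one value per index tuple.
    # Stack the per-dimension indices into the (ndim, N) layout scatter_nd expects,
    # then scatter the values into a zero-initialized tensor of out_shape.
    indices = relay.stack(index_tensors, axis=0)
    return relay.scatter_nd(values, indices, out_shape)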