Possible bug in embedding_bag

Why does the implementation of embedding_bag in https://github.com/apache/tvm/blob/6eb3a1fc3688bede69efd7f14b313b35497ccf02/python/tvm/relay/frontend/onnx.py#L4025 not use the values of offsets? It uses a reshape op, indices = _op.reshape(indices, indices_shape), where indices_shape is [offset_shape, -1]. If the shape of indices is [13] and the shape of offsets is [3], how can [13] be reshaped to [3, -1]? Does anyone know whether there is an error in embedding_bag, or is there some pre-processing I didn't find?
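To make the offsets semantics concrete, here is a small sketch (my own example, not taken from the TVM code) of how PyTorch interprets a 1D offsets tensor; the shapes match the [13] / [3] case above:

```python
import torch
import torch.nn.functional as F

weight = torch.randn(10, 4)             # 10-row embedding table, embedding dim 4
indices = torch.randint(0, 10, (13,))   # 13 flat indices
offsets = torch.tensor([0, 4, 9])       # bags of sizes 4, 5, 4 -- unevenly spaced

# Each offset value marks where a bag starts inside the flat indices tensor.
out = F.embedding_bag(indices, weight, offsets, mode="sum")
print(out.shape)  # torch.Size([3, 4])

# The bags have lengths 4, 5 and 4, so the 13 indices cannot be reshaped
# into [3, -1]: 13 is not divisible by 3.
```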

CC @shingjan, who might be able to help.

I made some examples and compared them with the real values from PyTorch. When the offsets are evenly distributed, the result is right; otherwise it is wrong.
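For reference, here is the kind of comparison I mean (my own sketch, not the TVM code): when the offsets are evenly spaced, a reshape-and-reduce matches PyTorch, but with uneven offsets the reshape is not even possible.

```python
import torch
import torch.nn.functional as F

weight = torch.randn(20, 4)

# Evenly spaced offsets: 12 indices, bags of size 4 -> reshape to [3, -1] works.
idx_even = torch.randint(0, 20, (12,))
off_even = torch.tensor([0, 4, 8])
ref_even = F.embedding_bag(idx_even, weight, off_even, mode="sum")
via_reshape = weight[idx_even.reshape(3, -1)].sum(dim=1)
print(torch.allclose(ref_even, via_reshape))   # True

# Unevenly spaced offsets: 13 indices, bags of size 4, 5, 4.
idx_uneven = torch.randint(0, 20, (13,))
off_uneven = torch.tensor([0, 4, 9])
ref_uneven = F.embedding_bag(idx_uneven, weight, off_uneven, mode="sum")
# idx_uneven.reshape(3, -1) raises a RuntimeError because 13 is not divisible by 3.
```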

@zgplvyou Thanks for the context. I believe the embedding_bag impl is intended for a 2D offset here, meaning that if PyTorch gives a 1D offset, which is common for aten::embedding_bag, we need to reshape the offsets to 2D based on the offset shape. In your case, which models are you testing that use unevenly distributed offsets in their embedding_bag layer?
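If I read this right, the reshape maps the 1D-offsets form onto the 2D-input form of embedding_bag (each row is one bag), which is only equivalent when every bag has the same length. A quick sketch of that equivalence, using my own toy shapes:

```python
import torch
import torch.nn.functional as F

weight = torch.randn(20, 4)
flat = torch.randint(0, 20, (12,))
offsets = torch.tensor([0, 4, 8])        # equal bags of size 4

# 1D input + offsets
out_1d = F.embedding_bag(flat, weight, offsets, mode="sum")

# Equivalent 2D input with no offsets: each row is treated as one bag.
out_2d = F.embedding_bag(flat.reshape(3, -1), weight, mode="sum")
print(torch.allclose(out_1d, out_2d))    # True, only because every bag has size 4
```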

Thank you for the answer. Do you mean the values of offsets are usually evenly distributed if I use real data? I ran the DLRM model from https://github.com/facebookresearch/dlrm with fake data. I will download the dataset and check the offsets. Anyway, directly reshaping the indices is not suitable; if the offsets are not evenly distributed, the implementation of embedding_bag will be more complex.
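For what it's worth, 1D offsets are typically built as a cumulative sum of per-sample bag lengths, so whenever the bag lengths vary (as they usually do with real sparse features), the offsets come out unevenly spaced. A tiny sketch, not taken from the DLRM code:

```python
import torch

# Per-sample bag lengths for 3 samples; with real data these vary.
lengths = torch.tensor([4, 5, 4])

# Offsets = exclusive cumulative sum of the lengths.
offsets = torch.cat([torch.zeros(1, dtype=torch.long), lengths.cumsum(0)[:-1]])
print(offsets)   # tensor([0, 4, 9]) -- unevenly spaced whenever the lengths differ
```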

@shingjan The values of a 1D offsets tensor may be unevenly distributed. In that case the current embedding_bag implementation is not enough, because it does not use the values of offsets, only the length of offsets. I found unevenly distributed 1D offsets in my project and I still haven't solved this problem. Can I report a bug on GitHub?
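For context, here is one way (my own reference sketch, not TVM code) to compute the result while honoring the actual offset values, by slicing each bag out of the flat indices:

```python
import torch
import torch.nn.functional as F

def embedding_bag_ref(indices, weight, offsets, mode="sum"):
    """Reference embedding_bag that uses the offset *values*, not just their count."""
    ends = torch.cat([offsets[1:], torch.tensor([indices.numel()])])
    bags = []
    for start, end in zip(offsets.tolist(), ends.tolist()):
        emb = weight[indices[start:end]]
        bags.append(emb.sum(0) if mode == "sum" else emb.mean(0))
    return torch.stack(bags)

weight = torch.randn(20, 4)
indices = torch.randint(0, 20, (13,))
offsets = torch.tensor([0, 4, 9])
ref = embedding_bag_ref(indices, weight, offsets)
print(torch.allclose(ref, F.embedding_bag(indices, weight, offsets, mode="sum")))  # True
```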

@zgplvyou Hi, feel free to report this bug/issue to the TVM repo. I wonder if you can give better context on this unevenly distributed 1D offsets problem. A reproducible example would be really helpful for pinning down the issue and seeing whether we need a fix for it. Thanks!
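In case it helps, here is a rough sketch of the kind of standalone repro that could be attached to the issue, assuming the usual relay.frontend.from_pytorch flow (the table size, embedding dim, mode and input names are just placeholders I picked, not anything from the thread):

```python
import numpy as np
import torch
import tvm
from tvm import relay
from tvm.contrib import graph_executor

class Bag(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = torch.nn.EmbeddingBag(20, 4, mode="sum")

    def forward(self, indices, offsets):
        return self.emb(indices, offsets)

model = Bag().eval()
indices = torch.randint(0, 20, (13,), dtype=torch.int64)
offsets = torch.tensor([0, 4, 9], dtype=torch.int64)   # unevenly spaced on purpose
expected = model(indices, offsets).detach().numpy()

# Trace and convert through the PyTorch frontend.
scripted = torch.jit.trace(model, (indices, offsets))
shape_list = [("indices", ((13,), "int64")), ("offsets", ((3,), "int64"))]
mod, params = relay.frontend.from_pytorch(scripted, shape_list)

lib = relay.build(mod, target="llvm", params=params)
rt = graph_executor.GraphModule(lib["default"](tvm.cpu()))
rt.set_input("indices", tvm.nd.array(indices.numpy()))
rt.set_input("offsets", tvm.nd.array(offsets.numpy()))
rt.run()
got = rt.get_output(0).numpy()

# If the bug is real, either the conversion fails at the reshape
# or the outputs diverge from PyTorch for uneven offsets.
print(np.abs(got - expected).max())
```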