Want help adding Batch_Matmul op for TensorFlow frontend

https://github.com/dmlc/tvm/issues/3414

I am trying to add a batch_matmul op for the TensorFlow frontend. I edited the files below:

- python/tvm/relay/frontend/tensorflow.py
- include/tvm/relay/attrs/nn.h
- python/tvm/relay/op/nn/_nn.py
- python/tvm/relay/op/nn/nn.py
- python/tvm/relay/op/op_attrs.py
- src/relay/op/nn/nn.cc
- topi/include/topi/nn/batch_matmul.h
- topi/python/topi/generic/nn.py
- topi/python/topi/nn/batch_matmul.py
- topi/src/topi.cc

Then I built the project and it compiled without error. When I called

sym, params = relay.frontend.from_tensorflow(my_graph_def, shape=my_shape)

it worked fine. But the next line is:

with relay.build_config(opt_level=3):
    graph, lib, params = relay.build(sym, target='llvm', target_host='llvm', params=params)

And here I hit an error:

tvm/python/tvm/_ffi/_ctypes/function.py raises get_last_ffi_error, which in turn reports TypeError: batch_matmul() argument after ** must be a mapping, not NoneType, pointing at

tvm/python/tvm/relay/op/nn/_nn.py", line 75, in compute_batch_matmul return [topi.nn.batch_matmul(inputs[0],inputs[1],**attrs)]
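
As far as I can tell, the immediate Python failure is just the ** unpacking being applied to None, i.e. the attrs passed into compute_batch_matmul are None rather than a dict. A minimal reproduction of just that Python-level error (nothing TVM-specific, the function name is only for illustration):

```python
def batch_matmul(x, y, **attrs):
    # stand-in for the real compute function; only the call syntax matters here
    return (x, y, attrs)

attrs = None
batch_matmul(1, 2, **attrs)
# TypeError: batch_matmul() argument after ** must be a mapping, not NoneType
```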

I need some help …

@Shawhey, I'm not sure why you modified topi and the relay op. Is it to add broadcasting?

It seems there is no need to touch topi or the relay op, because batch_matmul was added to topi and relay a few months ago. To support TF BatchMatMul, I think all we need to do is add the support in python/tvm/relay/frontend/tensorflow.py, roughly along the lines of the sketch below.
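
A rough sketch of what that converter might look like, assuming the usual frontend pattern of an _impl(inputs, attr, params) closure and that the TF attrs adj_x / adj_y arrive as booleans in attr (untested, the helper name _batch_matmul is just for illustration):

```python
# Hypothetical addition to python/tvm/relay/frontend/tensorflow.py (a sketch, not tested).
from tvm.relay import op as _op

def _batch_matmul():
    def _impl(inputs, attr, params):
        input_x, input_y = inputs[0], inputs[1]
        adj_x = attr.get('adj_x', False)
        adj_y = attr.get('adj_y', False)
        # relay's nn.batch_matmul(x, y) computes x @ transpose(y), expecting
        # x: [batch, M, K] and y: [batch, N, K], so transpose the operands
        # into that layout depending on TF's adj_x / adj_y flags.
        if adj_x:
            input_x = _op.transpose(input_x, axes=[0, 2, 1])
        if not adj_y:
            input_y = _op.transpose(input_y, axes=[0, 2, 1])
        return _op.nn.batch_matmul(input_x, input_y)
    return _impl
```

and then register it in the frontend's conversion table, e.g. 'BatchMatMul': _batch_matmul() in _convert_map. Note this sketch only covers rank-3 inputs; higher-rank batch dimensions would need an extra reshape.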