xinche
1
my code:
from tvm.relax.expr_functor import PyExprMutator, mutator

@mutator
class MyPass(PyExprMutator):
    def __init__(self, mod=None) -> None:
        super().__init__(mod)
        self.id = 0

    def visit_binding_block(self, block):
        print(f"enter visit binding block {self.id}")
        self.id += 1
        return super().visit_binding_block(block)
and got the result:
ysh329
2
Hi, can you supply complete code to reproduce?
cc @Hzfengsy
xinche
3
hi, the complete code is as follows:
from tvm import relax
from tvm.relax.expr_functor import PyExprMutator, mutator

@mutator
class MyPass(PyExprMutator):
    def __init__(self, mod=None) -> None:
        super().__init__(mod)
        self.id = 0

    def visit_binding_block(self, block):
        print(f"enter visit binding block {self.id}")
        self.id += 1
        return super().visit_binding_block(block)

def tmp_mod():
    data_shape = [100, 200]
    data = relax.Var("scale", relax.TensorStructInfo(data_shape, "float32"))
    bb = relax.BlockBuilder()
    with bb.function("main", [data]):
        with bb.dataflow() as _:
            lv0 = relax.op.abs(data)
            gv0 = bb.emit_output(lv0)
        bb.emit_func_output(gv0)
    return bb.get()

if __name__ == "__main__":
    mod = tmp_mod()
    update_mod = MyPass().visit_expr(mod["main"])
Thanks @xinche for reporting. However, the current design does not support calling the default mutator via super().
cc @tqchen to see if it's possible to support it, as it may influence the UX when writing passes in Python.
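For readers hitting the same wall: the limitation comes down to how the traversal core dispatches visits. Below is a minimal pure-Python sketch (no TVM dependency, all names illustrative, not the real TVM internals) of the general pattern: when the base traversal re-dispatches every node through a single registered callback rather than normal Python method resolution, calling super() from an override re-enters the override itself, so the only way to get the default behavior is to invoke the default logic directly.

```python
# Sketch of a traversal core that dispatches through one callback slot,
# analogous to an FFI-backed mutator. Illustrative only.

class CoreTraverser:
    """Stands in for the C++ side: every visit goes through a single
    registered callback, not Python's method resolution order."""
    def __init__(self):
        self._callback = self._default_visit

    def _default_visit(self, node):
        # Default behavior: recurse into children via the dispatch slot.
        return [self.visit(child) for child in node.get("children", [])]

    def visit(self, node):
        return self._callback(node)  # single dispatch slot

class CountingPass(CoreTraverser):
    def __init__(self):
        super().__init__()
        self._callback = self.visit_node  # register the override
        self.count = 0

    def visit_node(self, node):
        self.count += 1
        # Calling self.visit(node) here would re-enter visit_node and
        # recurse forever. Workaround: call the default logic directly.
        return self._default_visit(node)

tree = {"children": [{"children": []}, {"children": []}]}
p = CountingPass()
p.visit(tree)
print(p.count)  # → 3 (root plus two children)
```

In TVM terms, the analogous workaround is to perform the default traversal work yourself inside the override instead of delegating to super(), until super() dispatch is supported.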
xinche
5
Thank you for your answer. I have temporarily resolved my issue with a workaround. Looking forward to future updates for writing passes in Python.