Error when the Relay VM executes a recursive function

The Relay program below is from tests/python/relay/test_pass_lambda_lift.py.

import numpy as np
import tvm
from tvm import relay
from tvm.relay import transform

x = relay.var('x', shape=(2,))
i = relay.var('i', shape=(), dtype='int32')
s = relay.var('s', shape=(2,))
cond = i < relay.const(10, dtype='int32')

loop = relay.var('while_loop')
sb = relay.scope_builder.ScopeBuilder()
with sb.if_scope(cond):
    ii = i + relay.const(1, dtype='int32')
    ss = s + x
    sb.ret(loop(ii, ss))
with sb.else_scope():
    sb.ret(s)
func = relay.Function([i, s], sb.get())
ret = relay.Let(loop, func, loop(relay.const(0, dtype='int32'), 
                                 relay.zeros(shape=(2,), dtype='float32')))
mod = relay.Module()
mod["main"] = relay.Function([x], ret)
# mod = transform.LambdaLift()(mod)

I want to use the Relay VM to execute the Relay program above.

ex = relay.create_executor("vm", mod=mod, ctx=tvm.cpu(), target="llvm")
result = ex.evaluate()(np.ones((2,), dtype='float32'))

But I encountered the following error:

TVMError: Traceback (most recent call last):
  [bt] (8) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&)+0x445) [0x7f4df0e15ca5]
  [bt] (7) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::vm::VMFunctionCompiler::VisitExpr_(tvm::relay::LetNode const*)+0x92) [0x7f4df0f75f92]
  [bt] (6) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&)+0x445) [0x7f4df0e15ca5]
  [bt] (5) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::vm::VMFunctionCompiler::VisitExpr_(tvm::relay::LetNode const*)+0x92) [0x7f4df0f75f92]
  [bt] (4) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&)+0x445) [0x7f4df0e15ca5]
  [bt] (3) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::vm::VMFunctionCompiler::VisitExpr_(tvm::relay::LetNode const*)+0x3f) [0x7f4df0f75f3f]
  [bt] (2) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::ExprFunctor<void (tvm::relay::Expr const&)>::VisitExpr(tvm::relay::Expr const&)+0x445) [0x7f4df0e15ca5]
  [bt] (1) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::vm::VMFunctionCompiler::VisitExpr_(tvm::relay::CallNode const*)+0xaac) [0x7f4df0f7811c]
  [bt] (0) /home/ubuntu/tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x32) [0x7f4df08cc352]
  File "/home/ubuntu/tvm/src/relay/backend/vm/compiler.cc", line 635
TVMError: internal error: unreachable code, should be transformed away by previous passes
free_var %x: int32
free_var %x1: Tensor[(2), float32]
free_var %x2: Tensor[(2), float32]
%0 = @lifted_name1574306537384186922(%x2) /* ty=fn (int32, Tensor[(2), float32]) -> Tensor[(2), float32] */;
%0(%x, %x1) /* ty=Tensor[(2), float32] */

Which passes should be applied to avoid this error? When I uncomment mod = transform.LambdaLift()(mod), I get another error:

TVMError: Traceback (most recent call last):
  [bt] (5) /home/ubuntu/tvm/build/libtvm.so(TVMFuncCall+0x61) [0x7fedbb5819a1]
  [bt] (4) /home/ubuntu/tvm/build/libtvm.so(+0x11d86f8) [0x7fedbb44a6f8]
  [bt] (3) /home/ubuntu/tvm/build/libtvm.so(+0x11d85c0) [0x7fedbb44a5c0]
  [bt] (2) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::vm::VMCompiler::Compile(tvm::relay::Module, tvm::Map<tvm::Integer, tvm::Target, void, void> const&, tvm::Target const&)+0xe8b) [0x7fedbb448a8b]
  [bt] (1) /home/ubuntu/tvm/build/libtvm.so(tvm::relay::Function tvm::runtime::Downcast<tvm::relay::Function, tvm::relay::Expr>(tvm::relay::Expr)+0x198) [0x7fedbb2d8958]
  [bt] (0) /home/ubuntu/tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x32) [0x7fedbada8352]
  File "/home/ubuntu/tvm/include/tvm/runtime/object.h", line 806
TVMError: Check failed: ref->template IsInstance<typename SubRef::ContainerType>(): Downcast from relay.Let to relay.Function failed.

Can someone help me? Thanks.

I took a quick look. It seems the cause is a combination of issues. First, we need to move the lambda lift pass before the ToANF pass, because the program is no longer in ANF after lambda lifting in this example. Second, I think the memory pass might lack support for closures.
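
For reference, here is a minimal sketch of the ordering described above, assuming one experiments with running the passes by hand before handing the module to the VM. transform.Sequential, transform.LambdaLift, and transform.ToANormalForm are the standard Relay passes; since the VM compiler also runs its own internal pipeline, treat this as an illustration of the idea rather than a confirmed fix.

from tvm.relay import transform

# Lift the local recursive closure into a global function first, then
# re-normalize to ANF, since lambda lifting breaks ANF in this example.
seq = transform.Sequential([
    transform.LambdaLift(),
    transform.ToANormalForm(),
])
mod = seq(mod)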

I need to spend more time looking into this.

also cc @jroesch