[Relay][Pass] Feature check fails during fold_constant pass

I encountered an issue when calling the fold_constant pass. This check failed:

CHECK(f.is_subset_of(FeatureSet::All() - fGraph));

when creating the interpreter. https://github.com/apache/incubator-tvm/blob/master/src/relay/backend/interpreter.cc#L776
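For context, FeatureSet::All() - fGraph means "every feature except fGraph", so the check asserts that the expression handed to the interpreter contains no shared sub-expressions, i.e. it must be a tree rather than a DAG. The same precondition can be probed from Python; this is a sketch assuming the detect_feature and Feature APIs under tvm.relay:

from tvm.relay.analysis import detect_feature
from tvm.relay.feature import Feature

def interpreter_precondition_holds(expr):
    # Mirrors the CHECK above: every feature is allowed except fGraph,
    # which marks a sub-expression occurring more than once.
    return Feature.fGraph not in detect_feature(expr)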

I printed the Relay expr that causes the problem:

CallNode(FunctionNode([Var(p0, ty=TupleTypeNode([TensorType([1, 1, 6, 1], float32), TensorType([1, 1, 6, 1], float32)]))], TensorType([1, 1, 6, 2], float32), CallNode(Op(concatenate), [Var(p0, ty=TupleTypeNode([TensorType([1, 1, 6, 1], float32), TensorType([1, 1, 6, 1], float32)]))], relay.attrs.ConcatenateAttrs(0xa8c1f68), [TupleTypeNode([TensorType([1, 1, 6, 1], float32), TensorType([1, 1, 6, 1], float32)])]), [], {"Primitive": 1}), [Tuple([Constant([[[[0.5]
   [0.5]
   [0.5]
   [0.5]
   [0.5]
   [0.5]]]]), Constant([[[[0.5]
   [0.5]
   [0.5]
   [0.5]
   [0.5]
   [0.5]]]])])], (nullptr), [])

The reason it fails is that, during feature detection, there are two Constant nodes with the same ObjectHash (they are the same object). The feature detector therefore checks whether the second occurrence is atomic, and that check returns false. https://github.com/apache/incubator-tvm/blob/master/src/relay/analysis/feature.cc#L48
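To make the failure mode concrete, here is a pure-Python toy model of the detector's rule (a simplification for illustration, not the actual feature.cc logic): a node reached a second time must be atomic, and because Constant is not treated as atomic, a shared constant marks the whole program as a graph. The second call also shows that treating constants as atomic would make the check pass:

class Var: pass
class Constant: pass
class Tuple:
    def __init__(self, fields):
        self.fields = fields

def has_graph_feature(expr, seen=None, atomic=(Var,)):
    # fGraph is set when a non-atomic node is reached a second time.
    seen = set() if seen is None else seen
    if id(expr) in seen:
        return not isinstance(expr, atomic)
    seen.add(id(expr))
    return any(has_graph_feature(f, seen, atomic)
               for f in getattr(expr, "fields", []))

c = Constant()
print(has_graph_feature(Tuple([c, c])))                          # True: flags fGraph
print(has_graph_feature(Tuple([c, c]), atomic=(Var, Constant)))  # False: constants ignored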

I can bypass the issue by commenting out that check, but how can we fix this systematically?

Update: a minimal sample:

import numpy as np
import tvm
from tvm import relay
from tvm.ir import IRModule

# The same Constant object is referenced twice by concatenate, so the
# resulting expression is a DAG with a shared sub-expression.
data = tvm.nd.array(np.array([1.0, 2.0, 3.0]))
const = relay.expr.Constant(data)
out = relay.op.concatenate([const, const], axis=0)
mod = IRModule()
mod["main"] = relay.Function([], out)

# At opt_level=3 the pipeline runs fold_constant, which creates the
# interpreter and trips the feature check quoted above.
with relay.build_config(opt_level=3):
    vm_exec = relay.vm.compile(mod, target="llvm", params={})

@tqchen @jroesch @haichen @zhiics

I think this check should be checking equality, not just the hash code. It also seems the check's intent would be preserved by ignoring constants, as they don't affect the program at all. The goal of this check is to stop people from running graph programs on the interpreter without running ANF, because that would cause exponential blowup. cc @MarisaKirisame, who should have a good answer.
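For reference, here is a hedged sketch of that relationship, reusing mod from the repro above and assuming the detect_feature, Feature, and ToANormalForm APIs:

from tvm import relay
from tvm.relay.analysis import detect_feature
from tvm.relay.feature import Feature

# With the pre-fix detector behavior described above, the shared Constant
# makes the repro's main carry the fGraph feature.
print(Feature.fGraph in detect_feature(mod["main"]))

# ToANormalForm let-binds every sub-expression; the constant becomes a
# let-bound variable (which is atomic), so the sharing disappears and the
# interpreter's precondition holds.
anf_mod = relay.transform.ToANormalForm()(mod)
print(Feature.fGraph in detect_feature(anf_mod["main"]))  # expected False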

Sorry for the late follow-up. Is this still a problem?

No, it has been fixed. Thanks!

Hi, I ran into the same problem. Could you kindly share how you fixed it?