The type inference pass was unable to infer a type for this expression

I am trying to convert a tf.keras model to TVM with:

import numpy as np
import tvm
from tvm import te
import tvm.relay as relay
import tensorflow as tf

model = tf.keras.models.load_model("my_model")
shape_dict = {'features:0': (None, 1, 104, 64)}
mod, params = relay.frontend.from_keras(model, shape_dict)
with tvm.transform.PassContext(opt_level=1):
    intrp = relay.build_module.create_executor("graph", mod, tvm.cpu(0), "llvm")
feats = np.random.rand(1, 1, 104, 64).astype(np.float32)
tvm_input = tvm.nd.array(feats)
tvm_output = intrp.evaluate()(tvm_input, **params).asnumpy()

When running evaluate(), I’m getting the following errors:

The type inference pass was unable to infer a type for this expression.
This usually occurs when an operator call is under constrained in some way, check other reported errors for hints of what may of happened.
The type inference pass was unable to infer a type for this expression.
This usually occurs when an operator call is under constrained in some way, check other reported errors for hints of what may of happened.
The type inference pass was unable to infer a type for this expression.
This usually occurs when an operator call is under constrained in some way, check other reported errors for hints of what may of happened.
The type inference pass was unable to infer a type for this expression.
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/tblstri/tvm/python/tvm/relay/backend/interpreter.py", line 172, in evaluate
    return self._make_executor()
  File "/home/tblstri/tvm/python/tvm/relay/build_module.py", line 382, in _make_executor 
    self.mod = InferType()(self.mod)
  File "/home/tblstri/tvm/python/tvm/ir/transform.py", line 127, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "/home/tblstri/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm.error.DiagnosticError: Traceback (most recent call last):
  [bt] (6) /home/tblstri/tvm/build/libtvm.so(TVMFuncCall+0x65) [0x7f7dd407cf05]
  [bt] (5) /home/tblstri/tvm/build/libtvm.so(+0x72ec62) [0x7f7dd35b7c62]
  [bt] (4) /home/tblstri/tvm/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x1b7) [0x7f7dd35b7517]
  [bt] (3) /home/tblstri/tvm/build/libtvm.so(+0x102142f) [0x7f7dd3eaa42f]
  [bt] (2) /home/tblstri/tvm/build/libtvm.so(+0x10206ed) [0x7f7dd3ea96ed]
  [bt] (1) /home/tblstri/tvm/build/libtvm.so(tvm::DiagnosticContext::Render()+0x199) [0x7f7dd356c559]
  [bt] (0) /home/tblstri/tvm/build/libtvm.so(+0x6e2442) [0x7f7dd356b442]
  File "/home/tblstri/tvm/src/ir/diagnostic.cc", line 105
DiagnosticError: one or more error diagnostics were emitted, please check diagnostic render for output.

Any idea why I can’t run a forward pass with the converted model?


Hi, can you print(intrp.mod) before evaluating the model? I had a similar problem and solved it by changing the string key in shape_dict (in your case 'features:0') to the name of the first layer of your Keras model. Hope it helps.
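For reference, a minimal sketch of that fix. The layer name "features" here is an assumption; verify the real name with model.input_names or model.summary(), since it depends on how the model was built:

```python
# Hypothetical layer name "features" -- check model.input_names to
# confirm what your model actually calls its input layer.
input_name = "features"

# Drop TensorFlow's ":0" tensor suffix; from_keras expects the
# plain layer name as the shape_dict key.
shape_dict = {input_name: (1, 1, 104, 64)}
```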

Thanks. Changing the key in shape_dict to features got rid of that error, but now I’m seeing another one:

The Relay type checker is unable to show the following types match.
In particular dimension 0 conflicts: 1 does not match 64.
The Relay type checker is unable to show the following types match.
In particular `Tensor[(1), float32]` does not match `Tensor[(64), float32]`
The Relay type checker is unable to show the following types match.
In particular dimension 0 conflicts: 1 does not match 64.
The Relay type checker is unable to show the following types match.
In particular `Tensor[(1), float32]` does not match `Tensor[(64), float32]`
The Relay type checker is unable to show the following types match.
In particular dimension 0 conflicts: 1 does not match 64.
The Relay type checker is unable to show the following types match.
In particular `Tensor[(1), float32]` does not match `Tensor[(64), float32]`
The Relay type checker is unable to show the following types match.
In particular dimension 0 conflicts: 1 does not match 64.
The Relay type checker is unable to show the following types match.
In particular `Tensor[(1), float32]` does not match `Tensor[(64), float32]`
The Relay type checker is unable to show the following types match.
In particular dimension 1 conflicts: 0 does not match 1.
The Relay type checker is unable to show the following types match.
In particular `Tensor[(64, 0, 1, 11), float32]` does not match `Tensor[(64, 1, 1, 11), float32]`

It’s not clear to me why the shapes aren’t matching in the type checker.

Bumped into the same error. In my case the issue was that my network used the 'NHWC' layout, whereas the default of relay.frontend.from_keras() is 'NCHW'. Two solutions:

(1) Specify layout='NHWC' in relay.frontend.from_keras(). Disadvantage: TVM prints warnings that 'NHWC' is worse performance-wise for conv2d.

(2) Specify the input shape in 'NCHW' order, in your case: (None, 64, 1, 104). N.B. I'm unsure whether 'None' is accepted; I used a hard-coded number (1, my batch size).
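A sketch of what option (2) implies for the input data: besides changing shape_dict, the array fed to the executor must itself be transposed from NHWC to NCHW. This is plain NumPy, independent of TVM:

```python
import numpy as np

# Input as prepared for the original TF model: (N, H, W, C)
feats_nhwc = np.random.rand(1, 1, 104, 64).astype(np.float32)

# Reorder the axes to (N, C, H, W) for the NCHW-converted Relay module.
feats_nchw = np.transpose(feats_nhwc, (0, 3, 1, 2))

print(feats_nchw.shape)  # (1, 64, 1, 104)
```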