I am trying to convert a tf.keras model to TVM with:
import numpy as np
import tvm
import tvm.relay as relay
import tensorflow as tf

model = tf.keras.models.load_model("my_model")
shape_dict = {'features:0': (None, 1, 104, 64)}
mod, params = relay.frontend.from_keras(model, shape_dict)

with tvm.transform.PassContext(opt_level=1):
    intrp = relay.build_module.create_executor("graph", mod, tvm.cpu(0), "llvm")

feats = np.random.rand(1, 1, 104, 64).astype(np.float32)
tvm_input = tvm.nd.array(feats)
tvm_output = intrp.evaluate()(tvm_input, **params).asnumpy()
When running evaluate(), I'm getting the following errors:
The type inference pass was unable to infer a type for this expression.
This usually occurs when an operator call is under constrained in some way, check other reported errors for hints of what may of happened.
(the two lines above are repeated several more times)
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/tblstri/tvm/python/tvm/relay/backend/interpreter.py", line 172, in evaluate
return self._make_executor()
File "/home/tblstri/tvm/python/tvm/relay/build_module.py", line 382, in _make_executor
self.mod = InferType()(self.mod)
File "/home/tblstri/tvm/python/tvm/ir/transform.py", line 127, in __call__
return _ffi_transform_api.RunPass(self, mod)
File "/home/tblstri/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
raise get_last_ffi_error()
tvm.error.DiagnosticError: Traceback (most recent call last):
[bt] (6) /home/tblstri/tvm/build/libtvm.so(TVMFuncCall+0x65) [0x7f7dd407cf05]
[bt] (5) /home/tblstri/tvm/build/libtvm.so(+0x72ec62) [0x7f7dd35b7c62]
[bt] (4) /home/tblstri/tvm/build/libtvm.so(tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const+0x1b7) [0x7f7dd35b7517]
[bt] (3) /home/tblstri/tvm/build/libtvm.so(+0x102142f) [0x7f7dd3eaa42f]
[bt] (2) /home/tblstri/tvm/build/libtvm.so(+0x10206ed) [0x7f7dd3ea96ed]
[bt] (1) /home/tblstri/tvm/build/libtvm.so(tvm::DiagnosticContext::Render()+0x199) [0x7f7dd356c559]
[bt] (0) /home/tblstri/tvm/build/libtvm.so(+0x6e2442) [0x7f7dd356b442]
File "/home/tblstri/tvm/src/ir/diagnostic.cc", line 105
DiagnosticError: one or more error diagnostics were emitted, please check diagnostic render for output.
Any idea why I can’t run a forward pass with the converted model?
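For reference, the random input I'm feeding does agree with the shape declared in shape_dict, apart from the None batch dimension. This is the quick check I used (the matches helper below is just an illustrative snippet of mine, not a TVM or TensorFlow API):

```python
import numpy as np

# Shape declared in shape_dict for the 'features:0' input.
declared = (None, 1, 104, 64)
feats = np.random.rand(1, 1, 104, 64).astype(np.float32)

def matches(shape, declared):
    # Treat None as a wildcard (unknown batch dimension) and
    # require every other dimension to agree exactly.
    return len(shape) == len(declared) and all(
        d is None or d == s for s, d in zip(shape, declared)
    )

print(matches(feats.shape, declared))  # True
```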