Hello,
I am trying to convert an ONNX model to TVM Relay format using the Python API:
```python
import onnx
import tvm
import tvm.relay as relay
from tvm.contrib import graph_executor

onnx_model = onnx.load("models/model_float.onnx")
target = "llvm"
shape_dict = {
    "image": [1, 1, 256, 192],
    "rotation_normalized_to_world": [1, 3, 3],
    "principal_point_normalized": [1, 2],
    "focal_length_normalized": [1, 2],
}

mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
with tvm.transform.PassContext(opt_level=3):
    executor = relay.build_module.create_executor(
        "vm", mod, tvm.cpu(0), target, params
    ).evaluate()
```
When I run this, I hit a TypeError inside TVM. Output (cropped):

```
...
5: tvm::runtime::ObjectPtr<tvm::runtime::Object> tvm::runtime::Array<tvm::tir::Stmt, void>::MapHelper<tvm::tir::StmtMutator::Internal::Mutate(tvm::tir::StmtMutator*, tvm::runtime::Array<tvm::tir::Stmt, void> const&)::{lambda(tvm::tir::Stmt const&)#1}, tvm::tir::Stmt>(tvm::runtime::ObjectPtr<tvm::runtime::Object>, tvm::tir::StmtMutator::Internal::Mutate(tvm::tir::StmtMutator*, tvm::runtime::Array<tvm::tir::Stmt, void> const&)::{lambda(tvm::tir::Stmt const&)#1})
4: tvm::tir::StmtMutator::VisitStmt(tvm::tir::Stmt const&)
3: tvm::tir::StmtFunctor<tvm::tir::Stmt (tvm::tir::Stmt const&)>::VisitStmt(tvm::tir::Stmt const&)
2: _ZZN3tvm3tir11StmtFunctorIFNS0_4StmtERKS2_EE10InitVTableEvENUlRKNS_7runti
1: tvm::te::TensorToBufferMapper::VisitStmt_(tvm::tir::ProducerStoreNode const*)
0: tvm::tir::BufferStore::BufferStore(tvm::tir::Buffer, tvm::PrimExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::Span)
File "/home/jv1941/workspace/tvm/src/tir/ir/stmt.cc", line 480
TypeError: dtype mismatch on BufferStore: buffer's dtype is `int32`, the lanes of indexing are: `1`, but RHS's dtype is `int64`
```
Any ideas on how to solve this?
TVM version: v0.13.0
BR Johan