What inputs does Relay expect?

Hi all,

I read the documentation and the official TVM guides on compiling ONNX models, but I'm still having trouble getting my ONNX model to compile with TVM. I created an ONNX model trained on breast cancer data. Here is the code:

import onnx
import numpy as np
import tvm
from tvm import te
import tvm.relay as relay
import mlflow

# Load a previously converted model. This model was converted from skl to ONNX with Hummingbird.
onnx_model = mlflow.onnx.load_model("onnx_model")
ml_model = mlflow.models.Model.load("onnx_model")
input_example = ml_model.load_input_example("onnx_model")

# Code below is from TVM ONNX example in their official docs 
# https://tvm.apache.org/docs/how_to/compile_models/from_onnx.html#compile-the-model-with-relay

target = "llvm"

input_name = "1"
shape_dict = {input_name: input_example}

# Line below throws errors about the shape

mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

with tvm.transform.PassContext(opt_level=1):
    executor = relay.build_module.create_executor(
        "graph", mod, tvm.cpu(0), target, params
    ).evaluate()
and I get an error about the input shape from the from_onnx call.

I'm a bit confused about the shape dictionary: what should mine contain? Thanks so much for your time, Chloe

Maybe you could try Netron to view your model and inspect its input names and shapes.


Thank you! If Netron shows the input as float64[sym, 30], what does that mean?

It means this model has an input with a dynamic shape, usually because the model was exported from PyTorch. For example, the mobilenetv2 in the official ONNX model zoo was exported from PyTorch 1.8, so its input is float32[batch_size, 3, 224, 224].

But the compiler needs a static shape, so you need to specify the input shape, for example by taking it from one sample input. You also need to pass the dtype of the model, since the default dtype is "float32" but yours is "float64".
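Concretely, the shape and dtype dictionaries can be built from a sample input array. A minimal sketch, assuming the sample input is a NumPy array and the model's input is named "input_0" (substitute the real name Netron shows for your model):

```python
import numpy as np

# Placeholder for the sample loaded via ml_model.load_input_example(...) in the thread.
input_example = np.zeros((1, 30), dtype=np.float64)

input_name = "input_0"  # replace with your model's actual input name
shape_dict = {input_name: input_example.shape}        # static shape, e.g. (1, 30)
dtype_dict = {input_name: str(input_example.dtype)}   # "float64", not the default "float32"

# These would then be passed to the converter, e.g.:
# mod, params = relay.frontend.from_onnx(onnx_model, shape=shape_dict, dtype=dtype_dict)
```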

You can see the details of relay.frontend.from_onnx in the TVM API reference, or just click through to the function from the tutorial page.


I was able to fix the error by making my shape_dict {"input_0": input_example.shape}! Thank you.