Failed to convert a TensorFlow SSD model to Relay
Error log:

```
at /home/antoinette/tvm/src/relay/op/nn/pad.cc:129
File "/home/antoinette/tvm/src/relay/analysis/type_solver.cc", line 624
TVMError:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
Check failed: (false) is false: [23:11:05] /home/antoinette/tvm/src/relay/op/nn/pad.cc:129:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
Check failed: (data->shape.size() == param->pad_width.size()) is false: There should be as many pad width pairs as shape dimensions but the shape has 0 dimensions and there are 4 pad width pairs.
```
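For context on what the check means: a pad operator needs exactly one (before, after) pair per input dimension, and here the frontend inferred a 0-dimensional (scalar) shape for the input of a Pad node that carries 4 pad pairs. The same invariant can be illustrated with NumPy's `np.pad` (an analogy to `relay.nn.pad`, not TVM itself):

```python
import numpy as np

# One (before, after) pad pair is required per array dimension,
# the same invariant the TVM check enforces.
x4 = np.zeros((1, 3, 8, 8))              # 4-D tensor, like NCHW data
pads = [(0, 0), (0, 0), (1, 1), (1, 1)]  # 4 pairs for 4 dims -> OK
y = np.pad(x4, pads)
print(y.shape)  # (1, 3, 10, 10)

scalar = np.array(0.0)                   # 0-D, like the shape TVM inferred
try:
    np.pad(scalar, pads)                 # 4 pairs vs 0 dims -> error
except ValueError as e:
    print("pad failed:", e)
```

So the crash is not about the pad widths themselves; it is a symptom of the input to that Pad node losing its shape during import.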
The model can be found here:
https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_3_0/ssd-resnet34-fp32-inference.tar.gz
Path inside the archive: /ssd-resnet34-fp32-inference/pretrained_model/ssd_resnet34_fp32_bs1_pretrained_model.pb
The script to reproduce:
```python
import tvm
from tvm import te
from tvm import relay
import numpy as np
import os.path
import tensorflow as tf

try:
    tf_compat_v1 = tf.compat.v1
except ImportError:
    tf_compat_v1 = tf

import tvm.relay.testing.tf as tf_testing

layout = "NCHW"

# Load the frozen graph
with tf_compat_v1.gfile.GFile("/home/antoinette/TLCBench/models/ssd-resnet34-fp32-inference/pretrained_model/ssd_resnet34_fp32_bs1_pretrained_model.pb", "rb") as f:
    graph_def = tf_compat_v1.GraphDef()
    graph_def.ParseFromString(f.read())
    graph = tf.import_graph_def(graph_def, name="")
    graph_def = tf_testing.ProcessGraphDefParam(graph_def)

# Add output shape annotations, then import into Relay
with tf_compat_v1.Session() as sess:
    graph_def = tf_testing.AddShapesToGraphDef(sess, out_node=["v/Softmax", "v/stack"])

mod, params = relay.frontend.from_tensorflow(graph_def, layout=layout)
```
Can anyone help, please?