[TensorFlow] Why is AddShapesToGraphDef('softmax') needed?

Hi, I’m new to TVM.

I followed an example to compile TensorFlow model:
https://docs.tvm.ai/tutorials/nnvm/from_tensorflow.html#sphx-glr-tutorials-nnvm-from-tensorflow-py

with tf.gfile.FastGFile(os.path.join("./", model_name), 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    graph = tf.import_graph_def(graph_def, name='')
    # Call the utility to import the graph definition into default graph.
    graph_def = nnvm.testing.tf.ProcessGraphDefParam(graph_def)
    # Add shapes to the graph.
    graph_def = nnvm.testing.tf.AddShapesToGraphDef('softmax')

I have questions for the above:

  1. Why is the last line, with AddShapesToGraphDef('softmax'), needed? Did the original InceptionNet have no softmax?
  2. If I want to import AlexNet, do I also need AddShapesToGraphDef('softmax')?

By the way,
I'm not sure why my post was flagged.
Could someone tell me the reason?

Thanks!

@sagi1210 I'm not very familiar with this part of the library, but does https://github.com/tensorflow/tensorflow/issues/3903#issuecomment-240776254 help? Note that 'softmax' here names the output node of the graph; the call doesn't add a softmax layer. It seems the importer uses the shape information this function attaches to the graph to tell NNVM how to construct the symbols correctly (e.g. https://github.com/dmlc/tvm/blob/3bfa5fc03feb58958feaab40f9734f176f7082e8/nnvm/python/nnvm/frontend/tensorflow.py#L122, etc.). So if your serialized AlexNet TensorFlow protobuf doesn't already have this information stored in the graph, you will need this call as well.
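
To check whether a call like this is needed for your own model, you can inspect the serialized GraphDef for the `_output_shapes` attribute the frontend reads. Here is a minimal sketch (the helper name and usage are illustrative, assuming TensorFlow 1.x-style `GraphDef` protobufs):

```python
# Sketch: detect whether a GraphDef already carries shape information.
# A tf.GraphDef holds a list of NodeDef messages in `graph_def.node`,
# and each NodeDef has an `attr` map that may contain `_output_shapes`.

def has_output_shapes(graph_def):
    """Return True if every node in the GraphDef has an `_output_shapes` attr."""
    return all('_output_shapes' in node.attr for node in graph_def.node)

# Hypothetical usage with TensorFlow 1.x (not executed here):
#
#   import tensorflow as tf
#   graph_def = tf.GraphDef()
#   with tf.gfile.FastGFile('model.pb', 'rb') as f:
#       graph_def.ParseFromString(f.read())
#   if not has_output_shapes(graph_def):
#       # Re-serialize the default graph with shapes attached, which is
#       # essentially what nnvm.testing.tf.AddShapesToGraphDef does.
#       tf.import_graph_def(graph_def, name='')
#       graph_def = tf.get_default_graph().as_graph_def(add_shapes=True)
```

If the check passes, the extra call should be a no-op for your model; if it fails, the importer has no shape hints to work with and the call is required.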