I downloaded the quantized TF SSD model `ssd_mobilenet_v1_quantized_coco` from the TensorFlow Model Zoo. The zip file contains `tflite_graph.pb`, which I converted to a `model.tflite` file with the `tflite_convert` utility:
```shell
tflite_convert \
  --graph_def_file=/home/ubuntu/ssd_mobilenet_v1_quantized_300x300_coco14_sync_2018_07_18/tflite_graph.pb \
  --output_file=/home/ubuntu/ssd_mobilenet_v1_quantized_300x300_coco14_sync_2018_07_18/model.tflite \
  --output_format=TFLITE \
  --input_arrays="normalized_input_image_tensor" \
  --input_shapes="1,300,300,3" \
  --inference_type=QUANTIZED_UINT8 \
  --mean_values=128 \
  --std_dev_values=128 \
  --output_arrays="TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3" \
  --allow_custom_ops
```
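As a side note on the `--mean_values=128 --std_dev_values=128` pair: with `--inference_type=QUANTIZED_UINT8` these define the affine mapping between real input values and uint8 tensor values, `real = (quant - mean) / std_dev`. A minimal sketch in plain Python (the helper names are mine, for illustration only):

```python
# With --mean_values=128 and --std_dev_values=128, a uint8 input
# in [0, 255] maps to real values in roughly [-1.0, 1.0).
MEAN = 128
STD_DEV = 128

def dequantize(q):
    """Map a uint8 input value to its real-valued equivalent."""
    return (q - MEAN) / STD_DEV

def quantize(r):
    """Map a real value back to the nearest uint8 value, clamped."""
    q = round(r * STD_DEV + MEAN)
    return max(0, min(255, q))

print(dequantize(0))    # -1.0
print(dequantize(128))  # 0.0
print(quantize(0.0))    # 128
```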
SSD MobileNet `model.tflite` summary:

### Ops:

```
# id: name - count
2: CONCATENATION - 2
3: CONV_2D - 34
4: DEPTHWISE_CONV_2D - 13
14: LOGISTIC - 1
22: RESHAPE - 13
32: CUSTOM - 1
```
When I tried to compile `model.tflite` with TVM Relay, I got the following error:

```
Traceback (most recent call last):
  File "./compile.py", line 62, in <module>
    dtype_dict={input_tensor: input_dtype})
  File "/usr/local/lib/python3.5/dist-packages/tvm-0.6.dev0-py3.5-linux-x86_64.egg/tvm/relay/frontend/tflite.py", line 709, in from_tflite
    op_converter.check_unsupported_ops()
  File "/usr/local/lib/python3.5/dist-packages/tvm-0.6.dev0-py3.5-linux-x86_64.egg/tvm/relay/frontend/tflite.py", line 84, in check_unsupported_ops
    raise tvm.error.OpNotImplemented(msg.format(ops))
tvm.error.OpNotImplemented: The following operators are not supported in frontend TFLite: 'LOGISTIC', 'CUSTOM'
```
Relay does not support 2 operators - 1 built-in and 1 custom:

- built-in: `LOGISTIC` (op_code_id = 14)
- custom: `TFLite_Detection_PostProcess` (op_code_id = 32)
It looks like the `tf.sigmoid` op in `tflite_graph.pb` is mapped to the TFLite `LOGISTIC` op, which is the element-wise logistic (sigmoid) function. `TFLite_Detection_PostProcess` is a custom op which produces the final 4 outputs: classes, scores, bboxes and num_detections.
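To illustrate how little a Relay converter for `LOGISTIC` would need to do, here is a minimal float sketch of what the op computes (plain Python; the quantized uint8 variant additionally rescales inputs and outputs):

```python
import math

def logistic(xs):
    """TFLite LOGISTIC: element-wise sigmoid, 1 / (1 + e^-x)."""
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

out = logistic([-2.0, 0.0, 2.0])
print(out)  # ~ [0.119, 0.5, 0.881]
```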
Questions:

My first questions are:

- Does it make sense to add the TFLite built-in `LOGISTIC` operator to Relay? It should be similar to `tf.sigmoid`.
- The same question for the custom operator `TFLite_Detection_PostProcess`. This operator is the last operator in the TFLite graph in all SSD models from the TF model zoo.
BTW, the custom operator `TFLite_Detection_PostProcess` is optional. `export_tflite_ssd_graph.py`, which converts a checkpoint to a TFLite-compatible pb file, has a parameter `add_postprocessing_op=true/false`. It is possible to create `tflite_graph.pb` without `TFLite_Detection_PostProcess`; in that case the graph has two outputs:

- `raw_outputs/box_encodings`: a float32 tensor of shape [1, num_anchors, 4] containing the encoded box predictions.
- `raw_outputs/class_predictions`: a float32 tensor of shape [1, num_anchors, num_classes] containing the class scores for each anchor after applying score conversion.
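For context, `TFLite_Detection_PostProcess` essentially decodes `raw_outputs/box_encodings` against the anchor boxes and then runs class-wise non-max suppression. A sketch of the box-decoding step only (NumPy; the 10/10/5/5 scale factors are the TF Object Detection API defaults and are an assumption here, not taken from this model):

```python
import numpy as np

def decode_boxes(encodings, anchors,
                 scale_y=10.0, scale_x=10.0, scale_h=5.0, scale_w=5.0):
    """Decode SSD box encodings [ty, tx, th, tw] against anchors
    [ycenter, xcenter, height, width] -> [ymin, xmin, ymax, xmax]."""
    ty, tx, th, tw = np.moveaxis(encodings, -1, 0)
    ya, xa, ha, wa = np.moveaxis(anchors, -1, 0)

    ycenter = ty / scale_y * ha + ya
    xcenter = tx / scale_x * wa + xa
    h = np.exp(th / scale_h) * ha
    w = np.exp(tw / scale_w) * wa

    return np.stack([ycenter - h / 2.0, xcenter - w / 2.0,
                     ycenter + h / 2.0, xcenter + w / 2.0], axis=-1)

# Zero encodings should reproduce the anchors themselves.
anchors = np.array([[0.5, 0.5, 0.2, 0.2]])
boxes = decode_boxes(np.zeros((1, 4)), anchors)
print(boxes)  # [[0.4 0.4 0.6 0.6]]
```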
### Resnet

I also tried to compile the SSD ResNet 50 model (`ssd_resnet_50_fpn_coco`). It complains about the same two operators plus two additional operators: `MUL` and `PAD`.

SSD ResNet 50 tflite summary:
```
0: ADD - 58
2: CONCATENATION - 2
3: CONV_2D - 110
14: LOGISTIC - 1
17: MAX_POOL_2D - 4
18: MUL - 42
22: RESHAPE - 14
32: CUSTOM - 1
34: PAD - 1
```
Visualisations of the TFLite graphs are below.

TFLite model graph for `ssd_mobilenet_v1_quantized_coco` (quantized, uint8): ssd_mobilenet_v1_quantized_300x300_coco14_sync_2018_07_18.tflite.pdf

I also tried to compile the non-quantized tflite model; the same 2 operators are missing.

TFLite model graph for `ssd_mobilenet_v1_coco_2018_01_28` (non-quantized, float): ssd_mobilenet_v1_coco_2018_01_28.tflite.pdf

SSD ResNet 50 FPN COCO model graph: ssd_resnet_50_fpn_coco.tflite.pdf