How can I use TVM with my own Keras CNN model?

I trained a Keras model with a 48x48x3 input and 43 output classes. I load the "model.h5" file the same way the 'Compile Keras Models' tutorial does, but when I execute it with TVM I get the errors below. They are about the model and seem to point at a conflict with the library. How can I get TVM to work with my own model? Thanks.
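For context, the tutorial feeds the model NCHW data even though Keras itself stores images as NHWC. A minimal sketch of the preprocessing step (the input name `main_input` and the exact transpose are assumptions based on the model summary below, not taken from the asker's script):

```python
import numpy as np

# Keras sees one 48x48 RGB image as HWC:
img = np.random.rand(48, 48, 3).astype("float32")

# TVM's Keras frontend works in NCHW, so transpose HWC -> CHW
# and add a batch dimension before feeding the module:
data = img.transpose((2, 0, 1))[np.newaxis, :]
assert data.shape == (1, 3, 48, 48)
```

With the data in that layout, the import step is then something like `func, params = relay.frontend.from_keras(keras_model, {'main_input': (1, 3, 48, 48)})` (the return values match the TVM version in the traceback; newer versions return a module instead of a function).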

The error log is:
TVMError Traceback (most recent call last)
in
1 dtype = 'float32'
2 stime = time()
----> 3 tvm_out = executor.evaluate(func)(tvm.nd.array(data.astype(dtype)), **params)
4 top1_tvm = np.argmax(tvm_out.asnumpy()[0])
5 print("time cost %s" % (time() - stime))

~/research/tvm/python/tvm/relay/backend/interpreter.py in evaluate(self, expr, binds)
209
210 if isinstance(expr, (Function, GlobalVar)):
--> 211 return self._make_executor(expr)
212
213 # normal expression evaluated by running a function.

~/research/tvm/python/tvm/relay/build_module.py in _make_executor(self, func)
398
399 def _make_executor(self, func):
--> 400 graph_json, mod, params = build(func, target=self.target)
401 gmodule = _graph_rt.create(graph_json, mod, self.ctx)
402 if params:

~/research/tvm/python/tvm/relay/build_module.py in build(func, target, target_host, params)
260
261 with tophub_context:
--> 262 func = optimize(func, target, params)
263 # Annotate the ops for heterogeneous execution.
264 if isinstance(target, dict):

~/research/tvm/python/tvm/relay/build_module.py in optimize(func, target, params)
159
160 if cfg.pass_enabled("SimplifyInference"):
--> 161 func = ir_pass.infer_type(func)
162 func = ir_pass.simplify_inference(func)
163

~/research/tvm/python/tvm/relay/ir_pass.py in infer_type(expr, mod)
43 The checked expression.
44 """
--> 45 return _ir_pass.infer_type(expr, mod)
46
47

~/research/tvm/python/tvm/_ffi/_ctypes/function.py in __call__(self, *args)
183 check_call(_LIB.TVMFuncCall(
184 self.handle, values, tcodes, ctypes.c_int(num_args),
--> 185 ctypes.byref(ret_val), ctypes.byref(ret_tcode)))
186 _ = temp_args
187 _ = args

~/research/tvm/python/tvm/_ffi/base.py in check_call(ret)
69 """
70 if ret != 0:
--> 71 raise TVMError(py_str(_LIB.TVMGetLastError()))
72
73

TVMError: [17:36:32] /home/shaonanl/research/tvm/src/relay/ir/error.cc:112:
Error(s) have occurred. We have annotated the program with them:

In main :
fn (%main_input,
%v_param_1: Tensor[(32, 3, 3, 3), float32],
%v_param_2: Tensor[(32,), float32],
%v_param_3: Tensor[(32, 32, 3, 3), float32],
%v_param_4: Tensor[(32,), float32],
%v_param_5: Tensor[(64, 32, 3, 3), float32],
%v_param_6: Tensor[(64,), float32],
%v_param_7: Tensor[(64, 64, 3, 3), float32],
%v_param_8: Tensor[(64,), float32],
%v_param_9: Tensor[(128, 64, 3, 3), float32],
%v_param_10: Tensor[(128,), float32],
%v_param_11: Tensor[(128, 128, 3, 3), float32],
%v_param_12: Tensor[(128,), float32],
%v_param_13: Tensor[(512, 2048), float32],
%v_param_14: Tensor[(512,), float32],
%v_param_15: Tensor[(43, 512), float32],
%v_param_16: Tensor[(43,), float32]) {
%0 = nn.conv2d(%main_input, %v_param_1, padding=[1, 1], channels=32, kernel_size=[3, 3]) #
%1 = nn.bias_add(%0, %v_param_2) #
%2 = nn.relu(%1) #
%3 = nn.conv2d(%2, %v_param_3, channels=32, kernel_size=[3, 3]) #
%4 = nn.bias_add(%3, %v_param_4) #
%5 = nn.relu(%4) #
%6 = nn.max_pool2d(%5, pool_size=[2, 2], strides=[2, 2]) # an internal invariant was violated while typechecking your program [17:36:32] /home/shaonanl/research/tvm/src/relay/op/nn/pooling.cc:53: Check failed: data != nullptr

Stack trace returned 10 entries:
[bt] (0) /home/shaonanl/research/tvm/build/libtvm.so(+0x1a4c13d) [0x7fb64239b13d]
[bt] (1) /home/shaonanl/research/tvm/build/libtvm.so(+0x1a4cd8d) [0x7fb64239bd8d]
[bt] (2) /home/shaonanl/research/tvm/build/libtvm.so(+0x1e4fd74) [0x7fb64279ed74]
[bt] (3) /home/shaonanl/research/tvm/build/libtvm.so(+0x1df3014) [0x7fb642742014]
[bt] (4) /home/shaonanl/research/tvm/build/libtvm.so(+0x1f5f3b7) [0x7fb6428ae3b7]
[bt] (5) /home/shaonanl/research/tvm/build/libtvm.so(+0x1f46d43) [0x7fb642895d43]
[bt] (6) /home/shaonanl/research/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Function const&, tvm::relay::Module const&, tvm::relay::GlobalVar const&)+0x32b) [0x7fb642896aab]
[bt] (7) /home/shaonanl/research/tvm/build/libtvm.so(+0x1db18c0) [0x7fb6427008c0]
[bt] (8) /home/shaonanl/research/tvm/build/libtvm.so(+0x1db2476) [0x7fb642701476]
[bt] (9) /home/shaonanl/research/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Expr const&, tvm::relay::Module const&)+0x41d) [0x7fb6428963ed]

;
%7 = nn.conv2d(%6, %v_param_5, padding=[1, 1], channels=64, kernel_size=[3, 3]) #
%8 = nn.bias_add(%7, %v_param_6) #
%9 = nn.relu(%8) #
%10 = nn.conv2d(%9, %v_param_7, channels=64, kernel_size=[3, 3]) #
%11 = nn.bias_add(%10, %v_param_8) #
%12 = nn.relu(%11) #
%13 = nn.max_pool2d(%12, pool_size=[2, 2], strides=[2, 2]) # an internal invariant was violated while typechecking your program [17:36:32] /home/shaonanl/research/tvm/src/relay/op/nn/pooling.cc:53: Check failed: data != nullptr


;
%14 = nn.conv2d(%13, %v_param_9, padding=[1, 1], channels=128, kernel_size=[3, 3]) #
%15 = nn.bias_add(%14, %v_param_10) #
%16 = nn.relu(%15) #
%17 = nn.conv2d(%16, %v_param_11, channels=128, kernel_size=[3, 3]) #
%18 = nn.bias_add(%17, %v_param_12) #
%19 = nn.relu(%18) #
%20 = nn.max_pool2d(%19, pool_size=[2, 2], strides=[2, 2]) # an internal invariant was violated while typechecking your program [17:36:32] /home/shaonanl/research/tvm/src/relay/op/nn/pooling.cc:53: Check failed: data != nullptr


;
%21 = transpose(%20, axes=[0, 2, 3, 1]) #
%22 = nn.batch_flatten(%21) #
%23 = nn.dense(%22, %v_param_13, units=512) #
%24 = nn.bias_add(%23, %v_param_14) #
%25 = nn.relu(%24) #
%26 = nn.dense(%25, %v_param_15, units=43) #
%27 = nn.bias_add(%26, %v_param_16) #
%28 = nn.softmax(%27, axis=1) #
%28
}

Stack trace returned 10 entries:
[bt] (0) /home/shaonanl/research/tvm/build/libtvm.so(+0x1a4c13d) [0x7fb64239b13d]
[bt] (1) /home/shaonanl/research/tvm/build/libtvm.so(+0x1a4cd8d) [0x7fb64239bd8d]
[bt] (2) /home/shaonanl/research/tvm/build/libtvm.so(+0x1d853f5) [0x7fb6426d43f5]
[bt] (3) /home/shaonanl/research/tvm/build/libtvm.so(+0x1f46f7a) [0x7fb642895f7a]
[bt] (4) /home/shaonanl/research/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Function const&, tvm::relay::Module const&, tvm::relay::GlobalVar const&)+0x32b) [0x7fb642896aab]
[bt] (5) /home/shaonanl/research/tvm/build/libtvm.so(+0x1db18c0) [0x7fb6427008c0]
[bt] (6) /home/shaonanl/research/tvm/build/libtvm.so(+0x1db2476) [0x7fb642701476]
[bt] (7) /home/shaonanl/research/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Expr const&, tvm::relay::Module const&)+0x41d) [0x7fb6428963ed]
[bt] (8) /home/shaonanl/research/tvm/build/libtvm.so(+0x1f47637) [0x7fb642896637]
[bt] (9) /home/shaonanl/research/tvm/build/libtvm.so(TVMFuncCall+0x5e) [0x7fb642aa67ee]

My Keras model summary is:

Layer (type) Output Shape Param #

main_input (InputLayer) (None, 48, 48, 3) 0

conv1 (Conv2D) (None, 48, 48, 32) 896

conv2 (Conv2D) (None, 46, 46, 32) 9248

pool1 (MaxPooling2D) (None, 23, 23, 32) 0

do1 (Dropout) (None, 23, 23, 32) 0

conv3 (Conv2D) (None, 23, 23, 64) 18496

conv4 (Conv2D) (None, 21, 21, 64) 36928

pool2 (MaxPooling2D) (None, 10, 10, 64) 0

do2 (Dropout) (None, 10, 10, 64) 0

conv5 (Conv2D) (None, 10, 10, 128) 73856

conv6 (Conv2D) (None, 8, 8, 128) 147584

pool3 (MaxPooling2D) (None, 4, 4, 128) 0

do3 (Dropout) (None, 4, 4, 128) 0

flatten (Flatten) (None, 2048) 0

dense1 (Dense) (None, 512) 1049088

do4 (Dropout) (None, 512) 0

softmax (Dense) (None, 43) 22059

Total params: 1,358,155
Trainable params: 1,358,155
Non-trainable params: 0


Looks like the input data for certain pooling layers is None.

Does that mean my own model's pooling layers lack input data?
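As a sanity check, the layer shapes in the summary above can be walked by hand. The arithmetic below (a sketch, assuming 'valid' convolutions with the paddings shown in the Relay dump, stride 1, and 2x2 pooling) reproduces the 2048-element flatten that the 512x2048 dense weight expects, so every pooling layer does receive a well-defined input when the network is fed 48x48x3 data. The failure is in type inference at import time, not in the model itself:

```python
def conv_out(n, kernel=3, pad=0):
    # 'valid' convolution with optional padding, stride 1
    return n + 2 * pad - kernel + 1

def pool_out(n, size=2, stride=2):
    # max pooling floors the output size
    return (n - size) // stride + 1

n = 48
n = conv_out(n, pad=1)   # conv1: 48
n = conv_out(n)          # conv2: 46
n = pool_out(n)          # pool1: 23
n = conv_out(n, pad=1)   # conv3: 23
n = conv_out(n)          # conv4: 21
n = pool_out(n)          # pool2: 10
n = conv_out(n, pad=1)   # conv5: 10
n = conv_out(n)          # conv6: 8
n = pool_out(n)          # pool3: 4
flat = n * n * 128
print(flat)              # 2048, matching the flatten layer
```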

You should check your input, e.g. 'input_a'…

I agree with @yangglebin, and just want to share a bit of my short experience with TVM: this kind of error often (always?) happened to me when I gave TVM the wrong information about the model, either the input shape or the input name. Most of the time it was the input shape (sometimes the input shape needed a permutation, but I think that was just a bug in the tflite frontend that has since been corrected).

If the model is not one you built yourself, you can use Netron to retrieve the input layer name and shape.
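Once the input name and NHWC shape are known (from Netron or from the summary), building the NCHW entry that TVM's Keras frontend expects is mechanical. A hypothetical helper (the function name `keras_shape_to_tvm` is illustrative, and `main_input` is taken from the summary above):

```python
def keras_shape_to_tvm(nhwc, batch=1):
    """Turn the (None, H, W, C) shape Keras reports into the
    (N, C, H, W) tuple TVM's Keras frontend expects."""
    _, h, w, c = nhwc
    return (batch, c, h, w)

# The summary reports main_input as (None, 48, 48, 3):
shape_dict = {"main_input": keras_shape_to_tvm((None, 48, 48, 3))}
print(shape_dict)  # {'main_input': (1, 3, 48, 48)}
```

Passing a `shape_dict` like this to `relay.frontend.from_keras` (with a matching input name) is what prevents the type-inference failure above, since the frontend then knows the real extent of every intermediate tensor.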

Okay, thank you so much.