Keras Concatenate/Conv2D/Flatten/Dense - bad calculations on weight shapes? Relay confused and fails?

I’m trying to convert a working ‘channel stacked’ Keras model with relay. Conversion dies on the first Conv2D after a Concatenate, and again on a Dense after a Flatten; relay seems to compute the weight shapes incorrectly.

Most interesting line (1/2) in the error: %2 = nn.conv2d(%1, %v_param_1, padding=[1, 1, 1, 1], channels=64, kernel_size=[3, 3]) in particular dimension 1 conflicts 100 does not match 2; unable to unify: `Tensor[(64, 100, 3, 3), float32]` and `Tensor[(64, 2, 3, 3), float32]`

Line (2/2) in the error, later: %6 = nn.dense(%5, %v_param_3, units=256) in particular dimension 1 conflicts (int64)3200 does not match 160000; unable to unify: `Tensor[(256, 3200), float32]` and `Tensor[(256, 160000), float32]`
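As far as I can tell, the odd numbers in those two messages are self-consistent if the concatenate runs on axis 1 of my NHWC data, which matches the concatenate(%0, axis=1) in the annotated program below. My back-of-envelope check:

import numpy as np

a = np.zeros([1, 50, 50, 1])                  # one NHWC input
print(np.concatenate([a, a], axis=1).shape)   # (1, 100, 50, 1) -> the "100"
print(np.concatenate([a, a], axis=-1).shape)  # (1, 50, 50, 2)  -> the "2" in the Keras weights
print(50 * 50 * 64)   # 160000: features Keras' Flatten actually feeds the Dense
print(50 * 1 * 64)    # 3200: features the relay graph flattens after its conv/transpose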

Btw, I also tried this with my inputs as (1, 50, 50) (channels-first, as opposed to the (50, 50, 1) used below) and the errors were similar.
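(For that variant I built the inputs roughly along these lines — I don’t have that exact version handy, so treat this as approximate:)

from keras.layers import Input, Concatenate, Conv2D

X1 = Input(shape=(1, 50, 50), name="X1")
X2 = Input(shape=(1, 50, 50), name="X2")
x = Concatenate(axis=1)([X1, X2])
x = Conv2D(64, kernel_size=(3, 3), padding='same', data_format='channels_first')(x)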

This seemed vaguely similar to another Keras error I had seen reported here, so I used that post’s format to reproduce it…

Thanks in advance!!! p

Here’s my example:

import numpy as np
import keras
from keras.layers import Input, Concatenate, Conv2D, Flatten, Dense
from keras.models import Model

# Simple Keras model
X1 = Input(shape=(50, 50, 1), name="X1")
X2 = Input(shape=(50, 50, 1), name="X2")
x = Concatenate()([X1, X2])
x = Conv2D(64, kernel_size=(3, 3), padding='same')(x)  # 64 filters
# normally more of the model here...
x = Flatten()(x)
x = Dense(256, activation='relu')(x)
model = Model(inputs=[X1, X2], outputs=x, name='generator')  # built after all layers so Flatten/Dense are part of the graph

# The Keras model itself works fine:
d1 = np.zeros([1, 50, 50, 1])
d2 = np.zeros([1, 50, 50, 1])
P = model.predict([d1, d2])
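Sanity checks on the Keras side (layer index 3 should be the Conv2D in this model; Keras stores conv kernels as HWIO, so 2 input channels is exactly what I expect):

print(P.shape)                                 # (1, 256)
print(model.layers[3].get_weights()[0].shape)  # (3, 3, 2, 64): kernel sees 2 channels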

import tvm
import tvm.relay as relay

shape_dict = {'X1': [1,50,50,1], 'X2': [1,50,50,1] }
mod, params = relay.frontend.from_keras(model, shape_dict)

Output of last:

---------------------------------------------------------------------------
TVMError                                  Traceback (most recent call last)
<ipython-input> in <module>
      3
      4 shape_dict = {'X1': [1,50,50,1], 'X2': [1,50,50,1] }
----> 5 mod, params = relay.frontend.from_keras(model, shape_dict)

~/repos/tvm/python/tvm/relay/frontend/keras.py in from_keras(model, shape, layout)
    917     func = _expr.Function(analysis.free_vars(outexpr), outexpr)
    918     params = {k:_nd.array(np.array(v, dtype=np.float32)) for k, v in etab.params.items()}
--> 919     return IRModule.from_expr(func), params

~/repos/tvm/python/tvm/ir/module.py in from_expr(expr, functions, type_defs)
    221         funcs = functions if functions is not None else {}
    222         defs = type_defs if type_defs is not None else {}
--> 223         return _ffi_api.Module_FromExpr(expr, funcs, defs)
    224 
    225     def _import(self, file_to_import):

~/repos/tvm/python/tvm/_ffi/_ctypes/packed_func.py in __call__(self, *args)
    211                 self.handle, values, tcodes, ctypes.c_int(num_args),
    212                 ctypes.byref(ret_val), ctypes.byref(ret_tcode)) != 0:
--> 213             raise get_last_ffi_error()
    214         _ = temp_args
    215         _ = args

TVMError: Traceback (most recent call last):
  [bt] (8) /home/ace/repos/tvm/build/libtvm.so(TVMFuncCall+0x65) [0x7f4ba7023715]
  [bt] (7) /home/ace/repos/tvm/build/libtvm.so(+0x2971ba) [0x7f4ba68481ba]
  [bt] (6) /home/ace/repos/tvm/build/libtvm.so(tvm::IRModule::FromExpr(tvm::RelayExpr const&, tvm::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&)+0x23b) [0x7f4ba684447b]
  [bt] (5) /home/ace/repos/tvm/build/libtvm.so(tvm::IRModuleNode::Add(tvm::GlobalVar const&, tvm::BaseFunc const&, bool)+0x43e) [0x7f4ba684405e]
  [bt] (4) /home/ace/repos/tvm/build/libtvm.so(tvm::RunTypeCheck(tvm::IRModule const&, tvm::GlobalVar const&, tvm::relay::Function)+0x376) [0x7f4ba683e756]
  [bt] (3) /home/ace/repos/tvm/build/libtvm.so(tvm::relay::InferType(tvm::relay::Function const&, tvm::IRModule const&, tvm::GlobalVar const&)+0x1dc) [0x7f4ba6e9c59c]
  [bt] (2) /home/ace/repos/tvm/build/libtvm.so(tvm::relay::TypeInferencer::Infer(tvm::RelayExpr)+0x73) [0x7f4ba6e9bd63]
  [bt] (1) /home/ace/repos/tvm/build/libtvm.so(tvm::ErrorReporter::RenderErrors(tvm::IRModule const&, bool)+0x1e16) [0x7f4ba6832b16]
  [bt] (0) /home/ace/repos/tvm/build/libtvm.so(+0x23ef61) [0x7f4ba67eff61]
  File "/home/ace/repos/tvm/src/ir/error.cc", line 133
TVMError: 
Error(s) have occurred. The program has been annotated with them:

In `main`: 
v0.0.4
fn (%X1: Tensor[(1, 50, 50, 1), float32], %X2: Tensor[(1, 50, 50, 1), float32], %v_param_1: Tensor[(64, 2, 3, 3), float32], %v_param_2: Tensor[(64), float32], %v_param_3: Tensor[(256, 160000), float32], %v_param_4: Tensor[(256), float32]) {
  %0 = (%X1, %X2);
  %1 = concatenate(%0, axis=1);
  --> %2 = nn.conv2d(%1, %v_param_1, padding=[1, 1, 1, 1], channels=64, kernel_size=[3, 3]) in particular dimension 1 conflicts 100 does not match 2; unable to unify: `Tensor[(64, 100, 3, 3), float32]` and `Tensor[(64, 2, 3, 3), float32]`; ;
  %3 = nn.bias_add(%2, %v_param_2);
  %4 = transpose(%3, axes=[0, 2, 3, 1]);
  %5 = nn.batch_flatten(%4);
  --> %6 = nn.dense(%5, %v_param_3, units=256) in particular dimension 1 conflicts (int64)3200 does not match 160000; unable to unify: `Tensor[(256, 3200), float32]` and `Tensor[(256, 160000), float32]`; ;
  %7 = nn.bias_add(%6, %v_param_4);
  nn.relu(%7)
}
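One more data point: from_keras in my checkout also takes a layout argument (its signature shows up in the traceback above), but I haven’t found docs saying whether the Keras frontend supports NHWC. This is the call I would have guessed at, in case someone can confirm it’s the intended route:

mod, params = relay.frontend.from_keras(model, shape_dict, layout='NHWC')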