Support for TensorFlow dynamic_rnn

Hi, I am very new to Apache TVM and deep learning, so please bear with my novice questions.

The page https://tvm.apache.org/docs/dev/frontend/tensorflow.html#best-practices suggests using static_rnn instead of dynamic_rnn when importing TensorFlow models into Apache TVM.
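
For intuition on why static_rnn is friendlier to an importer: static_rnn unrolls the recurrence into a fixed number of ops at graph-construction time, so every intermediate tensor has a known shape, while dynamic_rnn emits a tf.while_loop with TensorArray ops whose shapes only become known at run time. Below is a minimal NumPy sketch of the "static unrolling" idea (the cell here is a hypothetical plain tanh RNN for illustration, not the GRU from the trace, and `static_rnn_unrolled` is my own name, not a TF or TVM API):

```python
import numpy as np

def static_rnn_unrolled(x, W_x, W_h, b, seq_len):
    """Unrolled RNN: the loop bound `seq_len` is a Python constant,
    so the number of steps (and every tensor shape) is fixed at
    build time -- roughly what tf.nn.static_rnn gives an importer."""
    batch = x.shape[0]
    h = np.zeros((batch, W_h.shape[0]))
    outputs = []
    for t in range(seq_len):  # unrolled: seq_len is known up front
        h = np.tanh(x[:, t, :] @ W_x + h @ W_h + b)
        outputs.append(h)
    return np.stack(outputs, axis=1), h

# Toy usage: batch=2, seq_len=3, input dim=4, hidden dim=5
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 3, 4))
W_x = rng.standard_normal((4, 5))
W_h = rng.standard_normal((5, 5))
b = np.zeros(5)
outs, last = static_rnn_unrolled(x, W_x, W_h, b, seq_len=3)
print(outs.shape)  # (2, 3, 5)
```

With dynamic_rnn the equivalent loop is a graph-level while-loop whose trip count depends on a runtime tensor, which is exactly the kind of construct TVM's type inference can struggle with.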

I wanted to see for myself whether Apache TVM v0.8 supports dynamic_rnn with TensorFlow models. I trained a sample dynamic RNN model and saved the frozen inference graph. When I ran tvmc compile on the model, tvmc failed with the following error (8 times): “The type inference pass was unable to infer a type for this expression. This usually occurs when an operator call is under constrained in some way, check other reported errors for hints of what may of happened”

Could you tell me whether Apache TVM supports dynamic_rnn or not?

Note: I used TensorFlow v1.13.

Thanks in advance

I am also compiling a network that uses tf dynamic_rnn with TVM.

I hit a different error when the frontend converts this op:

name: "rnn_1/gru1/while/TensorArrayWrite/TensorArrayWriteV3"
op: "TensorArrayWriteV3"
input: "rnn_1/gru1/while/TensorArrayWrite/TensorArrayWriteV3/Enter"
input: "rnn_1/gru1/while/Identity"
input: "rnn_1/gru1/while/Select"
input: "rnn_1/gru1/while/Identity_1"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}

It fails with this traceback:

Traceback (most recent call last):
  File "/tvm_test_pb.py", line 44, in <module>
    mod, params = relay.frontend.from_tensorflow(graph_def, layout=None, shape=shape_dict)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 1264, in from_tensorflow
    mod, params = g.from_tensorflow(graph, layout, shape, outputs)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 659, in from_tensorflow
    func = self._get_relay_func(graph, layout=layout, shape=shape, outputs=outputs)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 619, in _get_relay_func
    self._backtrack_construct(node.name)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 1130, in _backtrack_construct
    node, [], attr, self._control_flow_node_map
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 895, in _convert_control_flow_operator
    op = self._licm_construct(plname, node.input[0])
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 1068, in _licm_construct
    actual_expr = self._backtrack_construct(node_name)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 1183, in _backtrack_construct
    op = self._convert_operator(node.op, node.name, inputs, attr)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow.py", line 1023, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params, self._prelude)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/tensorflow_ops.py", line 1660, in _impl
    input_t_shape = _infer_shape(inputs[2], prelude.mod)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/common.py", line 512, in infer_shape
    out_type = infer_type(inputs, mod=mod)
  File "/tvm-2021.8.13/tvm/python/tvm/relay/frontend/common.py", line 479, in infer_type
    mod = _transform.InferType()(mod)
  File "/tvm-2021.8.13/tvm/python/tvm/ir/transform.py", line 161, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "/tvm-2021.8.13/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  7: TVMFuncCall
  6: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::transform::Pass, tvm::IRModule)>::AssignTypedLambda<tvm::transform::{lambda(tvm::transform::Pass, tvm::IRModule)#7}>(tvm::transform::{lambda(tvm::transform::Pass, tvm::IRModule)#7}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  5: tvm::transform::Pass::operator()(tvm::IRModule) const
  4: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  3: tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  2: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::IRModule, tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::InferType()::{lambda(tvm::IRModule, tvm::transform::PassContext const&)#1}>(tvm::relay::transform::InferType()::{lambda(tvm::IRModule, tvm::transform::PassContext const&)#1})::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  1: tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)
  0: tvm::relay::TypeSolver::Solve()
  10: TVMFuncCall
  9: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::transform::Pass, tvm::IRModule)>::AssignTypedLambda<tvm::transform::{lambda(tvm::transform::Pass, tvm::IRModule)#7}>(tvm::transform::{lambda(tvm::transform::Pass, tvm::IRModule)#7}, std::string)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  8: tvm::transform::Pass::operator()(tvm::IRModule) const
  7: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  6: tvm::transform::ModulePassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  5: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::IRModule, tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::InferType()::{lambda(tvm::IRModule, tvm::transform::PassContext const&)#1}>(tvm::relay::transform::InferType()::{lambda(tvm::IRModule, tvm::transform::PassContext const&)#1})::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  4: tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)
  3: tvm::relay::TypeSolver::Solve()
  2: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::TypedPackedFunc<bool (tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)>::AssignTypedLambda<bool (*)(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)>(bool (*)(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&))::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  1: bool tvm::relay::ConcatenateRel<tvm::relay::ConcatenateAttrs>(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&, tvm::TypeReporter const&)
  0: tvm::TensorType tvm::runtime::Downcast<tvm::TensorType, tvm::Type>(tvm::Type)
  File "/tvm-2021.8.13/tvm/src/relay/analysis/type_solver.cc", line 624
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (false) is false: [10:08:17] /tvm-2021.8.13/tvm/include/tvm/runtime/object.h:886: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (ref->template IsInstance<typename SubRef::ContainerType>()) is false: Downcast from TypeCall to relay.TensorType failed.


Process finished with exit code 1

Are there any updates on this issue? I am also facing something similar…