One possibility to support these is to enhance relay.Tuple. relay.Tuple is dynamic, but it lacks the ability to update its items, and its size can't be obtained at runtime.
type dynamic_tensor =
Tensor0 of TensorType(shape=())
| Tensor1 of TensorType(shape=(Any))
| Tensor2 of TensorType(shape=(Any, Any))
| Tensor3 of TensorType(shape=(Any, Any, Any))
| Tensor4 of TensorType(shape=(Any, Any, Any, Any))
| Tensor5 of TensorType(shape=(Any, Any, Any, Any, Any))
| Tensor6 of TensorType(shape=(Any, Any, Any, Any, Any, Any))
type tensor_array = dynamic_tensor list
We define a data type dynamic_tensor that supports tensors up to rank 6 (we can grow the rank of course, but it might not be necessary). A tensor array is then just a dynamic_tensor list.
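To make the idea concrete, here is a minimal plain-Python sketch of the proposal (illustrative only, not TVM's actual API): each constructor fixes the rank statically while the dimensions stay unknown until runtime, and the tensor array is an ordinary list, so its size is a runtime value and its items can be replaced, which is exactly what relay.Tuple lacks. The class names here are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DynamicTensor:
    """Models `dynamic_tensor`: the rank is the static tag (Tensor0..Tensor6),
    the concrete shape is only known at runtime (every dim is `Any` statically)."""
    rank: int
    shape: Tuple[int, ...]
    data: list  # flat payload, stands in for the tensor values

    def __post_init__(self):
        assert 0 <= self.rank <= 6 and len(self.shape) == self.rank

# `tensor_array` is just a list of dynamic tensors.
TensorArray = List[DynamicTensor]

ta: TensorArray = []
ta.append(DynamicTensor(1, (3,), [1, 2, 3]))        # like tensor_array_write
ta.append(DynamicTensor(2, (2, 2), [1, 2, 3, 4]))   # items may differ in rank
size = len(ta)                                      # like tensor_array_size
ta[0] = DynamicTensor(1, (2,), [5, 6])              # items can be updated
```

Note that the list holds values of a single ADT, so differently-shaped (and differently-ranked) tensors can coexist in one array without violating the type system.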
Then we can implement the TensorArray ops as Relay functions. Most of them are trivial to implement. Some are tricky (but I think doable with expand_dims):
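As a sketch of one of the tricky ops, something like tensor_array_stack could be expressed as expand_dims on each element followed by a fold that concatenates along the new axis. The Python below only models the data flow with nested lists (function names and shapes are illustrative assumptions, not Relay's actual op signatures):

```python
# Hypothetical sketch: stack a list of rank-1 tensors into one rank-2 tensor.

def expand_dims(t):
    # Tensor1(shape=(n,)) -> Tensor2(shape=(1, n)): add a leading axis.
    return [t]

def concat_axis0(a, b):
    # Concatenate two rank-2 tensors along axis 0.
    return a + b

def tensor_array_stack(ta):
    # expand_dims lifts each element, then a fold concatenates the results.
    expanded = [expand_dims(t) for t in ta]
    out = expanded[0]
    for t in expanded[1:]:
        out = concat_axis0(out, t)
    return out

result = tensor_array_stack([[1, 2], [3, 4], [5, 6]])
# result is [[1, 2], [3, 4], [5, 6]], a single rank-2 value
```

In Relay this fold would be written as a recursive function over the dynamic_tensor list, with pattern matching selecting the right rank constructor.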
@ydy Any is not complete yet. Right now we are able to represent models with dynamic shapes in Relay. We still need to finish the codegen and runtime changes in order to execute such models.