I was testing the dynamic nature of Relax via TVMScript and came across this situation: I wanted to create a tensor with a dynamic size. For simplicity, I first tried `R.ones` (I later wrote a TIR prim_func, but hit the same issue), where the size is a parameter variable given at runtime:
```python
from tvm.script import ir as I
from tvm.script import relax as R


@I.ir_module
class Module:
    @R.function
    def main(size: R.Prim("int64")):
        with R.dataflow():
            lv = R.ones((size,), "float32")
            R.output(lv)
        return lv
```
This raises an error:
```
error: TVMError: In function relax.ShapeExpr(0: Array<PrimExpr>, 1: Span) -> relax.expr.ShapeExpr: error while converting argument 0: [16:14:52] ~/tvm/include/tvm/runtime/packed_func.h:2056: InternalError: Check failed: (!checked_type.defined()) is false: Expected Array[PrimExpr], but got Array[index 0: relax.expr.Var]
--> ~/TE-test/dynamic_ones.py:9:18
  |
9 | lv = R.ones((size,), "float32")
  |      ^^^^^^^^^^^^^^^^^^^^^^^^^^
```
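My reading of the error, as a minimal sketch using the constructors named in the traceback (`relax.ShapeExpr` only accepts `tir.PrimExpr` entries, while a Relax function parameter is a `relax.expr.Var`):

```python
import tvm
from tvm import relax, tir

# A Relax parameter like `size` is a relax.expr.Var ...
rx_var = relax.Var("size", relax.PrimStructInfo("int64"))
# ... while shapes are built from tir.PrimExpr, e.g. tir.Var.
tir_var = tir.Var("n", "int64")

relax.ShapeExpr([tir_var])    # fine: tir.Var is a PrimExpr
# relax.ShapeExpr([rx_var])   # uncommenting this reproduces the same
#                             # "Expected Array[PrimExpr]" check failure
```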
However, I know the problem is not the dynamic nature of the call itself, because the following works:
```python
from tvm.script import ir as I
from tvm.script import tir as T
from tvm.script import relax as R


@I.ir_module
class Module:
    @R.function
    def main(size: R.Prim("int64"), tensor: R.Tensor(("n",), "float32")):
        n = T.int64()
        with R.dataflow():
            lv = R.ones((n,), "float32")
            R.output(lv)
        return lv
```
This works correctly, and the size of the generated tensor is set dynamically (here it follows the shape of `tensor`).
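For reference, this is roughly how I exercised it (a hedged sketch: I am assuming the usual `LegalizeOps` + `relax.build` + `VirtualMachine` flow, and that a plain Python int is accepted for the `R.Prim` argument; `arr` and its size 5 are arbitrary):

```python
import numpy as np
import tvm
from tvm import relax

mod = relax.transform.LegalizeOps()(Module)   # lower R.ones to TIR
ex = relax.build(mod, target="llvm")
vm = relax.VirtualMachine(ex, tvm.cpu())

arr = tvm.nd.array(np.zeros(5, dtype="float32"))
res = vm["main"](7, arr)   # n = 5 comes from arr's shape; size is unused here
print(res.numpy())         # [1. 1. 1. 1. 1.]
```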
Since `R.call_tir` can accept `R.Prim` values as arguments and convert them to `T.var`s correctly, I'm guessing there is a lowering path for them somewhere. What would be the correct way to convert an `R.Prim` to a `T.var` manually?
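To make the question concrete, this is the kind of spelling I was hoping would exist (an untested guess, not something I know to be supported; the binding via `R.match_cast` and `R.Prim(value=...)` is pure speculation on my part):

```python
from tvm.script import ir as I
from tvm.script import tir as T
from tvm.script import relax as R


@I.ir_module
class Module:
    @R.function
    def main(size: R.Prim("int64")):
        n = T.int64()
        # Guess: bind the runtime value of `size` to the TIR var `n`,
        # so `n` can then appear in a shape expression.
        s = R.match_cast(size, R.Prim(value=n))
        with R.dataflow():
            lv = R.ones((n,), "float32")
            R.output(lv)
        return lv
```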
As mentioned, I also tried writing a TIR prim_func that replicates `R.ones`, on the off chance that the conversion would be handled there, but `R.call_tir` requires the `out_sinfo` parameter, where I can again use the `n` variable but not the `size` parameter.
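For completeness, here is roughly what that attempt looked like (reconstructed from memory; `tir_ones`, `out_handle`, and the `cls` alias are my names, not anything prescribed by the API):

```python
from tvm.script import ir as I
from tvm.script import tir as T
from tvm.script import relax as R


@I.ir_module
class Module:
    @T.prim_func
    def tir_ones(n: T.int64, out_handle: T.handle):
        # Fill a dynamically sized buffer with ones.
        Out = T.match_buffer(out_handle, (n,), "float32")
        for i in range(n):
            with T.block("fill"):
                vi = T.axis.spatial(n, i)
                Out[vi] = T.float32(1)

    @R.function
    def main(size: R.Prim("int64"), tensor: R.Tensor(("n",), "float32")):
        cls = Module
        n = T.int64()
        with R.dataflow():
            # out_sinfo accepts the symbolic n (bound by tensor's shape), but I
            # found no way to write `size` here instead; at runtime this only
            # makes sense when size happens to equal tensor's length.
            lv = R.call_tir(cls.tir_ones, (size,), out_sinfo=R.Tensor((n,), "float32"))
            R.output(lv)
        return lv
```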