Create a `relay.Expr` from an `auto_scheduler.SearchTask`

I am working within the auto-scheduler, and am trying to build a SearchTask manually, rather than the standard approach of using a function decorator for a Relay function.

To begin with, I am trying to construct a new SearchTask from an old one, so I can understand the low level operations involved (and build some functionality). My ultimate goal is to be able to modify tasks.

Let’s say I have an existing SearchTask from ResNet50 for 'fused_nn.dense_add' called my_task.

I want to achieve this:

from tvm.auto_scheduler import SearchTask
from tvm.relay import Function

params = my_vars  # I think I know how to create these
body = None       # unsure how to create this
args = None       # assuming these are optional?
new_task = SearchTask(Function(params=params, body=body), args=args)

assert my_task.compute_dag.workload_key() == new_task.compute_dag.workload_key()

To create a new search task, I need a func: Function, and args: Union[Tuple[Any, ...], List[Any]].

To create the func, I need a Function, which is created with params: List[tvm.relay.Var] and a body: tvm.relay.Expr.

Given my_task, I can create the params with:

tensors = my_task.compute_dag.tensors

from tvm.relay.expr import var

my_vars = []
for t in tensors:
    # ComputeDAG.tensors is a list of TE tensors; placeholders are inputs
    if t.op.name == 'placeholder':
        my_vars.append(var('placeholder', shape=[int(d) for d in t.shape]))

However, to create the body: tvm.relay.Expr, I am unsure of the correct approach.

I can get the operation from the final tensor which computes the function, as well as several useful attributes with:

print(tensors[-1].op)
print(tensors[-1].op.body)
print(tensors[-1].op.axis)
print(tensors[-1].op.reduce_axis)
print(tensors[-1].op.num_outputs)
print(tensors[-1].op.input_tensors)

This prints:

compute(T_add, body=[(T_dense[ax0, ax1] + placeholder[ax1])], axis=[iter_var(ax0, range(min=0, ext=1)), iter_var(ax1, range(min=0, ext=1000))], reduce_axis=[], tag=broadcast, attrs={})
[(T_dense[ax0, ax1] + placeholder[ax1])]
[iter_var(ax0, range(min=0, ext=1)), iter_var(ax1, range(min=0, ext=1000))]
[]
1
[Tensor(shape=[1, 1000], op.name=T_dense), Tensor(shape=[1000], op.name=placeholder)]

This information should be sufficient to construct a tvm.relay.Expr. However, I am unsure how to do so. Expr is just an alias of RelayExpr, and neither of them seems to have a constructor __init__() function in Python.

Is this approach the best way to create a new task given an existing one? And if so, how do I create a tvm.relay.Expr given the information I have?

All of the code for this question is available at this gist.

This is impossible. The tensors you are referring to are already lowered from Relay to TE, so you cannot reverse them back to Relay.

Thanks for the reply @comaniac.

I don’t need these exact tensors, just to recreate them from the available properties (e.g. their shapes, expressions of how they are generated). I’m wondering if it’s possible to recreate them from scratch at the Relay level, just by reading the properties at the TE level.

E.g. for the placeholder tensors I store in my_vars, I make them from scratch, only copying the shape property from the lowered TE tensors to make new tensors at the Relay level.

The final tensor, which is not a placeholder, is made from the expression T_dense[ax0, ax1] + placeholder[ax1]. I am not looking to convert tensors[-1].op to a tvm.relay.Expr. Instead, I am wondering if I can define a fresh expression that takes this information (e.g. the body, axis, etc.) to make a new tvm.relay.Expr.

I’m not sure I understand your question. Relay expression and TE expression are at different levels. For example T_dense is a TE tensor compute, which corresponds to one Relay op (e.g., nn.dense). On the other hand, TE compute can also be a result of a sequence of Relay ops. For example T_dense[ax0, ax1] + placeholder[ax1] can be lowered from nn.bias_add(nn.dense(x, w), b).

In short, automatically generating nn.bias_add(nn.dense(x, w), b) from T_dense[ax0, ax1] + placeholder[ax1] seems impossible to me. You can probably only do it manually.