Hi, does anyone know the difference between @reg.register_strategy and reg.register_compute?

In my opinion, the two expressions both create the link between a Relay op and TOPI. Did I understand wrongly?

Hey,

Take this with a grain of salt since I am not an official voice of the people who developed those things.

The compute should register the naive, non-optimized, hardware-independent computation rule of a Relay operator. You can think of it as a golden reference.

Now, there can be many algorithmic implementations of this computation, especially for different HW backends. These different implementations are normally called schedules, and the schedules are gathered into TOPI.

At least to my understanding, a strategy is somewhat higher-order than a schedule, but it still describes an implementation variant of a compute. There can be cases where, given the parameters of the operator and your HW, you can already decide on an algorithm, and then tune (or use an already-optimized schedule from) the appropriate strategy.

If I had to give an example (warning I am not sure it’s like this in the repo):

  • Compute == the naive conv2d implementation
  • Strategy == {winograd, direct convolution, im2col transformation}
    • although all of these give the same numerical result, their implementations differ considerably
  • Schedule == {winograd({target_0, target_1}), direct convolution({target_0,…,target_n-1}), im2col({target_3, target_n})}
    • as you can see, not every target has a schedule (template) implementation for each of the strategies
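The hierarchy above can be sketched as a plain-Python model. Note this is a conceptual illustration only; the names `compute`, `strategies`, `schedules`, and `search_space` are made up for this sketch and are not actual TVM API:

```python
# Conceptual model of the compute/strategy/schedule hierarchy
# (hypothetical structure for illustration; not real TVM code).

# One compute: the golden-reference definition of the operator.
compute = "naive conv2d"

# Several strategies: algorithmically different implementations
# that all produce the same numerical result.
strategies = ["winograd", "direct", "im2col"]

# Schedules: per-target tuned variants of each strategy. Not every
# target implements a schedule for every strategy.
schedules = {
    "winograd": ["target_0", "target_1"],
    "direct":   ["target_0", "target_1", "target_2"],
    "im2col":   ["target_1", "target_2"],
}

# A target's candidates are the strategies that have a schedule for it.
def search_space(target):
    return [s for s, targets in schedules.items() if target in targets]

print(search_space("target_1"))  # ['winograd', 'direct', 'im2col']
print(search_space("target_0"))  # ['winograd', 'direct']
```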

Hope this helps


Thank you for your explanation. I still don't fully understand and have some questions.

  1. In the example below, does the strategy bundle the topi compute and the schedule, or did I understand wrong? In your explanation, TOPI contains the schedules, so I'm confused about the concepts.

    @override_native_generic_func("cumsum_strategy")
    def cumsum_strategy(attrs, inputs, out_type, target):
        """cumsum generic strategy"""
        strategy = _op.OpStrategy()
        strategy.add_implementation(
            wrap_compute_scanop(topi.cumsum),
            wrap_topi_schedule(topi.generic.schedule_extern),
            name="cumsum.generic",
        )
        return strategy
    
  2. There is another example.

    @reg.register_compute("image.resize2d")
    def compute_resize2d(attrs, inputs, out_type):
        """compute definition for resize2d op"""
        size = attrs.size
        layout = attrs.layout
        method = attrs.method
        coord_trans = attrs.coordinate_transformation_mode
        rounding_method = attrs.rounding_method
        cubic_alpha = attrs.cubic_alpha
        cubic_exclude = attrs.cubic_exclude
        out_dtype = attrs.out_dtype
        return [
            topi.image.resize2d(
                inputs[0],
                size,
                layout,
                method,
                coord_trans,
                rounding_method,
                cubic_alpha,
                cubic_exclude,
                out_dtype,
            )
        ]

    reg.register_injective_schedule("image.resize2d")

I think both examples arrange a topi compute and a schedule. Please forgive my ignorance.

A strategy is a combination of compute + schedule which can be used to implement an operation efficiently. For example, for conv2d we register both a regular conv2d compute + schedule and a winograd conv2d compute + schedule as appropriate.

When autotuning, the list of strategies for an operation will then define our search space (e.g. we will search both winograd and regular conv2d implementations for conv2d).
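The "register multiple implementations, search them all" pattern can be mimicked with a small mock. The `OpStrategy` class below is a simplified stand-in, not the real TVM class, and the attribute names and condition are hypothetical:

```python
# A mock of the OpStrategy pattern (simplified; not the real TVM class).
class OpStrategy:
    def __init__(self):
        self.implementations = []

    def add_implementation(self, compute, schedule, name):
        # Each entry pairs a compute rule with a schedule for it.
        self.implementations.append((compute, schedule, name))


def conv2d_strategy(attrs, inputs, out_type, target):
    strategy = OpStrategy()
    # The regular compute + schedule pair is always available.
    strategy.add_implementation(
        lambda *a: "direct compute", lambda *a: "direct schedule",
        name="conv2d_nchw.generic",
    )
    # Winograd is only profitable for some shapes (here: 3x3 kernels),
    # so it is added conditionally; the tuner then searches both.
    if attrs.get("kernel_size") == (3, 3):
        strategy.add_implementation(
            lambda *a: "winograd compute", lambda *a: "winograd schedule",
            name="conv2d_nchw_winograd.generic",
        )
    return strategy


s = conv2d_strategy({"kernel_size": (3, 3)}, None, None, "generic")
print([name for _, _, name in s.implementations])
# ['conv2d_nchw.generic', 'conv2d_nchw_winograd.generic']
```

The list of implementations collected this way is exactly the search space the autotuner enumerates for the op.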

I think the remaining confusion might be about the difference between compute and schedule. I am not an expert on this, but they are similar to Halide computes and schedules (take what I say with a grain of salt). The compute is a declarative math expression that can be naively lowered into a set of for loops. The schedule is then a transformation of that naive loop nest which adds things like tiling, reordering axes, etc. The schedule's tunable parameters are what actually define the search space for that strategy in autotuning. Of course, a schedule only makes sense for the compute it was designed for.
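The compute/schedule split can be illustrated with plain Python (a toy analogy, not Halide or TVM code): the compute says *what* to calculate, while a schedule changes *how* the loops run without changing the numbers.

```python
def matmul_compute(A, B, n):
    # "Compute": the declarative rule C[i][j] = sum_k A[i][k]*B[k][j],
    # naively lowered into three nested loops.
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C


def matmul_tiled(A, B, n, tile=2):
    # "Schedule": the same math, but with the loops tiled for cache
    # locality. The tile size is the kind of knob an autotuner searches.
    C = [[0] * n for _ in range(n)]
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for kk in range(0, n, tile):
                for i in range(ii, min(ii + tile, n)):
                    for j in range(jj, min(jj + tile, n)):
                        for k in range(kk, min(kk + tile, n)):
                            C[i][j] += A[i][k] * B[k][j]
    return C


A = [[1, 2, 0, 1], [0, 1, 3, 2], [2, 0, 1, 1], [1, 1, 1, 0]]
B = [[0, 1, 2, 0], [1, 0, 1, 3], [2, 2, 0, 1], [0, 1, 1, 2]]
# Both variants produce identical results; only loop structure differs.
assert matmul_compute(A, B, 4) == matmul_tiled(A, B, 4)
```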


Thanks @AndrewZhaoLuo, that really helps.