Incorrect generated function after PartitionGraph pass

Hi @zhiics @comaniac,

I am using BYOC to offload transformer blocks to an external codegen tool. These transformers are composite functions. This had been working well with my manually written annotation passes, but after merging the latest changes and switching to the AnnotateTarget -> PartitionGraph flow, I found that codegen fails because the generated function is wrong.

The transformer outputs a single value, and this value is used in three places in the model. However, the generated function returns this value as a 3-tuple:

    ...
        add(%268, %output_layernorm_bias2) /* ty=Tensor[(1, 64, 512), float32] */
      };
      %270 = %269(meta[relay.Constant][32] /* ty=Tensor[(512), float32] */ ...;
      (%270, %270, %270)
    }

The return value should just be %270.

After inspecting the output of AnnotateTarget, I found that a new CompilerEnd annotation is added each time this output is used. For example:

    %395 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][105])
    ...
    %444 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][140])
    ...
    %475 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][162])
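The duplication pattern above can be sketched in plain Python (a toy illustration, not real TVM code: `CompilerEnd`, `annotate_per_use`, and `annotate_memoized` are hypothetical names). An annotator that fires once per *use site* creates a fresh wrapper for every consumer, while memoizing on the annotated value yields one shared wrapper per value, which is presumably what the partitioner intends:

```python
class CompilerEnd:
    """Toy stand-in for the annotation.compiler_end wrapper."""
    def __init__(self, value):
        self.value = value

def annotate_per_use(value, n_uses):
    """Buggy behavior: one fresh wrapper per consumer of the value."""
    return [CompilerEnd(value) for _ in range(n_uses)]

def annotate_memoized(value, n_uses):
    """Intended behavior: one shared wrapper per distinct value."""
    memo = {}
    out = []
    for _ in range(n_uses):
        if value not in memo:
            memo[value] = CompilerEnd(value)
        out.append(memo[value])
    return out

# %394 is consumed three times, as in the listing above.
per_use = annotate_per_use("%394", 3)
memoized = annotate_memoized("%394", 3)
print(len({id(w) for w in per_use}))   # 3 distinct wrappers
print(len({id(w) for w in memoized}))  # 1 shared wrapper
```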

This definitely seems like a bug, and it breaks my codegen because the body of the generated function is a tuple node rather than a call node. Is there a good workaround, or an easy way to fix this?
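As a stopgap until the pass itself is fixed, one option is to collapse the degenerate tuple before emitting code. A minimal sketch over a toy IR (the `Call`/`Tuple` stand-ins and `collapse_duplicate_outputs` are hypothetical names, not TVM's actual node classes): if the function body is a tuple whose fields are all the same expression, emit that single expression instead.

```python
class Call:
    """Toy stand-in for a Relay call node."""
    def __init__(self, name):
        self.name = name

class Tuple:
    """Toy stand-in for a Relay tuple node."""
    def __init__(self, fields):
        self.fields = fields

def collapse_duplicate_outputs(body):
    """If body is a tuple of N references to one expression, return
    that single expression; otherwise return the body unchanged."""
    if isinstance(body, Tuple) and len({id(f) for f in body.fields}) == 1:
        return body.fields[0]
    return body

# Mirrors the (%270, %270, %270) body from the listing above.
call = Call("%270")
print(collapse_duplicate_outputs(Tuple([call, call, call])) is call)  # True
```

Note that callers in the main function would still index the result with TupleGetItem, so a complete workaround would also have to rewrite the consumers accordingly.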

Thanks!

This should be resolved by this PR: [BYOC] Prevent duplicate outputs in subgraph Tuple by trevor-m · Pull Request #5320 · apache/tvm · GitHub :slight_smile:

Wow, perfect timing! Thanks :slight_smile: