Cannot export Model Library Format (MLF) file when the whole graph is partitioned to another compiler

Hi, I constructed a simple example like this:

import tvm
import tvm.micro
from tvm import relay
from tvm.relay.backend import Runtime
from tvm.relay.testing import byoc  # CcompilerAnnotator helper (assumed to come from tvm.relay.testing.byoc)


def test_export_mlf():
    x = relay.var("x", shape=(10, 10))
    w0 = relay.var("w0", shape=(10, 10))
    w1 = relay.var("w1", shape=(10, 10))
    w2 = relay.var("w2", shape=(10, 10))

    z0 = relay.add(x, w0)
    p0 = relay.subtract(z0, w1)
    q0 = relay.multiply(p0, w2)

    f = relay.Function([x, w0, w1, w2], q0)
    mod = tvm.IRModule()
    ann = byoc.CcompilerAnnotator()
    mod["main"] = ann.visit(f)
    mod = tvm.relay.transform.PartitionGraph()(mod)
    mod = tvm.relay.transform.InferType()(mod)

    rt_mod = relay.build(
        mod, target="c", runtime=Runtime("crt", {"system-lib": True}))
    tvm.micro.export_model_library_format(rt_mod, "./module.tar")

The resulting IRModule looks like this:

def @main(%x: Tensor[(10, 10), float32], %w0: Tensor[(10, 10), float32], %w1: Tensor[(10, 10), float32], %w2: Tensor[(10, 10), float32]) -> Tensor[(10, 10), float32] {
@tvmgen_default_ccompiler_main_0(%x, %w0, %w1, %w2) /* ty=Tensor[(10, 10), float32] */
}

def @tvmgen_default_ccompiler_main_0(%ccompiler_0_i0: Tensor[(10, 10), float32], %ccompiler_0_i1: Tensor[(10, 10), float32], %ccompiler_0_i2: Tensor[(10, 10), float32], %ccompiler_0_i3: Tensor[(10, 10), float32], Inline=1, global_symbol="tvmgen_default_ccompiler_main_0", Compiler="ccompiler", Primitive=1) -> Tensor[(10, 10), float32] {
%0 = add(%ccompiler_0_i0, %ccompiler_0_i1) /* ty=Tensor[(10, 10), float32] */;
%1 = subtract(%0, %ccompiler_0_i2) /* ty=Tensor[(10, 10), float32] */;
multiply(%1, %ccompiler_0_i3) /* ty=Tensor[(10, 10), float32] */
}

In this example, an error is raised when exporting module.tar:

assert dso_mod.format in ["c", "cc", "cpp"]
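
To see which module trips this assertion, I dumped the type key and format of the compiled library and its imports right before the export call. The helper below is my own addition (not TVM API); it only relies on the standard runtime.Module type_key, format, and imported_modules properties:

def dump_module_formats(rt_mod):
    """Print type_key and format for the compiled library and all its imports."""
    def walk(mod, depth=0):
        print("  " * depth + f"{mod.type_key}: format={mod.format!r}")
        for imp in mod.imported_modules:
            walk(imp, depth + 1)

    walk(rt_mod.get_lib())

Calling dump_module_formats(rt_mod) just before tvm.micro.export_model_library_format should show a C-source module whose format is an empty string, which matches the observation below.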

But when I change the target from "c" to "llvm", everything works fine. I also checked the code: it seems that when the list of lowered functions is empty, an empty CSourceModule is created without a format:

ret_.mod = tvm::codegen::CSourceModuleCreate(";", "", Array<String>{});
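
Given that, what I would have expected the export path to do is simply skip such an empty placeholder module instead of asserting on its missing format. The sketch below is purely illustrative and not the actual TVM code; the helper name is made up, and the surrounding loop is only inferred from the failing assertion:

def skip_empty_c_modules(dso_modules):
    """Hypothetical behaviour sketch: ignore C-source modules that carry no
    code (format == ""), and only check the ones that will actually be
    written out as source files."""
    kept = []
    for dso_mod in dso_modules:
        if dso_mod.type_key != "c":
            kept.append(dso_mod)
            continue
        if dso_mod.format == "":
            continue  # placeholder created from an empty lowered-function list
        assert dso_mod.format in ["c", "cc", "cpp"]
        kept.append(dso_mod)
    return kept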

I wonder whether this is a bug or whether there is a reason for this behaviour?

Thank you.