[Android Deploy] Error: "Binary was created using {GraphExecutorFactory} but a loader of that name is not registered. Available loaders are ." when running custom model on TVM v0.11.0 (Android CPU)

Summary

I’m trying to deploy a custom model (deploy_lib.so, .json, .params) using the Android Deploy app on TVM v0.11.0, targeting ARM64 CPU.

Running the provided model demo works fine, but when I replace it with my own Relay-compiled PyTorch model, I get the following error:

TVMError: Binary was created using {GraphExecutorFactory} but a loader of that name is not registered. Available loaders are .

Context

I understand that android_deploy is deprecated on the main branch, but I’m working with v0.11.0 because:

- Relay and MXNet are still available (not yet replaced by Relax).
- My setup already runs fine through Android RPC (verified with PyTorch and MXNet models).
- To avoid submodule mismatches, I downloaded the source tarball directly from tvm.apache.org/download.

Steps That Work

- The example Darknet model runs successfully in the Android app.
- PyTorch and MXNet models exported via Relay work correctly when executed through Android RPC on a Samsung Galaxy A56 (Android 16).

Steps to Reproduce

1. Model Export


```python
import torch
import tvm
from tvm import relay
from tvm.contrib import ndk

# 1. Load pretrained model
model = torch.hub.load("pytorch/vision:v0.10.0", "mobilenet_v2", pretrained=True).eval()

# 2. Trace model
input_shape = [1, 3, 224, 224]
dummy = torch.randn(input_shape)
scripted = torch.jit.trace(model, dummy).eval()

# 3. Convert to Relay
mod, params = relay.frontend.from_pytorch(scripted, [("input0", tuple(input_shape))])

# 4. Build for ARM64 Android
target = "llvm -mtriple=aarch64-linux-android"
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# 5. Export artifacts
lib.export_library("deploy_lib.so", fcompile=ndk.create_shared)
with open("deploy_graph.json", "w") as f:
    f.write(lib.get_graph_json())
with open("deploy_param.params", "wb") as f:
    f.write(relay.save_param_dict(lib.get_params()))
```
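Before copying the three artifacts to the device, a quick structural check on the exported graph can rule out a truncated or mis-written JSON file. `check_graph_json` below is a hypothetical helper (not part of TVM); the key set reflects the top-level layout of graph-executor JSON:

```python
import json

# Hypothetical helper (not part of TVM): sanity-check the exported graph JSON
# before pushing it to the device. These top-level keys are fixed by the
# graph-executor JSON format.
REQUIRED_KEYS = {"nodes", "arg_nodes", "heads", "attrs", "node_row_ptr"}

def check_graph_json(text):
    """Return a sorted list of required top-level keys missing from the graph JSON."""
    graph = json.loads(text)
    return sorted(REQUIRED_KEYS - set(graph))

# Usage against the file exported above:
# with open("deploy_graph.json") as f:
#     assert check_graph_json(f.read()) == [], "graph JSON incomplete"
```

This does not catch runtime mismatches like the one in this report, but it cheaply confirms the JSON side of the export is intact.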

2. CMake Build for Runtime


```shell
cmake ../ \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DANDROID_ABI="arm64-v8a" \
  -DUSE_CPP_RPC=ON \
  -DUSE_GRAPH_EXECUTOR=ON \
  -DANDROID_NATIVE_API_LEVEL=android-28 \
  -DANDROID_TOOLCHAIN=clang++ \
  -DANDROID_STL=c++_static

make tvm_runtime tvm_rpc -j4
```

3. Where the Error Occurs in the Android App

```java
// tvm module for compiled functions
Module modelLib = Module.load(libCacheFilePath);
```

The exact error is:

```
Binary was created using {GraphExecutorFactory} but a loader of that name is not registered. Available loaders are .
```
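For anyone hitting the same message: it means the serialized module declares its type as `GraphExecutorFactory`, but the runtime linked into the app never registered a loader for that type, so even the "available loaders" list is empty. The plain-Python sketch below is conceptual only (not TVM's actual code) and just illustrates that dispatch:

```python
# Plain-Python sketch (not TVM's actual code) of the loader dispatch: the
# runtime deserializes a module by looking up a globally registered loader
# named after the module's type_key. If the translation unit that performs
# that registration was never compiled into libtvm_runtime.so, the registry
# is empty, which is why the error ends with "Available loaders are ."
LOADERS = {}  # stands in for TVM's global function registry

def register_loader(type_key, fn):
    LOADERS[type_key] = fn

def load_from_binary(type_key):
    if type_key not in LOADERS:
        raise RuntimeError(
            "Binary was created using {%s} but a loader of that name is not "
            "registered. Available loaders are %s." % (type_key, ", ".join(LOADERS))
        )
    return LOADERS[type_key]()

# Nothing registered -> reproduces the failure mode seen in the app.
try:
    load_from_binary("GraphExecutorFactory")
except RuntimeError as err:
    print(err)
```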

What I have tried

- Confirmed the model works via RPC (so the export pipeline is valid).
- Rebuilt libtvm_runtime.so with USE_GRAPH_EXECUTOR=ON.
- Verified file paths and assets in the android_deploy app.
- Checked known issues (e.g., this thread).

I ended up solving the problem myself. Going through commits in the TVM GitHub repository, I found a change showing that you need to add `#include "../src/runtime/graph_executor/graph_executor_factory.cc"` to the app's `tvm_runtime.h`, and the error disappears.

tvm_runtime.h commit
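For context on why that include fixes it: the app's jni `tvm_runtime.h` builds the whole runtime by directly `#include`-ing runtime `.cc` files, and a module type's binary loader is registered by a static initializer inside its `.cc` file. Roughly (abridged; the exact include list varies by version, so check it against your checkout and the commit above):

```cpp
// tvm_runtime.h (abridged sketch; paths relative to the jni directory)
#include "../src/runtime/graph_executor/graph_executor.cc"
// graph_executor_factory.cc contains the static registration of the
// "GraphExecutorFactory" binary loader; without it, Module.load() fails
// with the error reported above.
#include "../src/runtime/graph_executor/graph_executor_factory.cc"
```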