Hello!
After installing TVM from source, as described in the tutorial on the site, I get an error when trying to load an ONNX model:
mod, params = relax.frontend.onnx.from_onnx(onnx_model, shape_dict)
AttributeError: module 'tvm.relax.frontend' has no attribute 'onnx'
What could I have missed? Can ONNX support in relax.frontend be turned off or on? There's nothing about ONNX in config.cmake, in contrast to the USE_TFLITE option.
tvm.__version__ shows 0.21.dev0, and the OS is Ubuntu 24.04.2 LTS.
import onnx
from tvm.relax.frontend.onnx import from_onnx
onnx_model = onnx.load(onnx_path)
mod = from_onnx(onnx_model, shape_dict=shape_dict) # type: ignore
This works.
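That matches how Python imports work in general: a package's __init__.py only binds the submodules it imports itself, but explicitly importing a submodule also binds it as an attribute on the parent package. A minimal sketch using the stdlib xml package purely as a stand-in for tvm.relax.frontend:

```python
import sys

# Start clean in case xml submodules were imported earlier in this process.
for name in [n for n in sys.modules if n == "xml" or n.startswith("xml.")]:
    del sys.modules[name]

import xml
bound_before = hasattr(xml, "dom")  # False: xml/__init__.py does not import xml.dom

import xml.dom                      # importing the submodule binds it on the parent
bound_after = hasattr(xml, "dom")   # True

print(bound_before, bound_after)  # → False True
```

So "from tvm.relax.frontend.onnx import from_onnx" both works on its own and, as a side effect, makes relax.frontend.onnx reachable afterwards.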
I opened the source code at python/tvm/relax/frontend/__init__.py and saw these lines:
"""Frontends for constructing Relax programs, with the model importers"""
from . import nn
from .common import detach_params
So I added a similar line:
from . import onnx
After that, the call
mod = relax.frontend.onnx.from_onnx(onnx_model, shape_dict)
succeeded.
Is this expected? The main branch on GitHub also contains no onnx import in the __init__.py file:
https://github.com/apache/tvm/blob/main/python/tvm/relax/frontend/__init__.py
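If the onnx frontend were re-exported eagerly from __init__.py, it would normally go behind a guarded import, so that a missing third-party dependency does not break a plain "import tvm". A hypothetical sketch of that pattern (not the actual TVM code; the missing-module name below is deliberately fake):

```python
# Guarded optional import (hypothetical pattern, not actual TVM code):
# expose the submodule only if its third-party dependency is installed.
try:
    import a_dependency_that_is_not_installed  # stand-in for the onnx package
    HAS_ONNX = True
except ImportError:
    HAS_ONNX = False

print(HAS_ONNX)  # → False, since the stand-in module does not exist
```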
That's not so different: it would just throw "module 'tvm.relax.frontend' has no attribute 'onnx'" on the line "from tvm.relax.frontend.onnx import from_onnx". The fix that I described above helped me, though at the moment I've commented out that line in __init__.py and frontend.onnx imports OK. I also have version 0.18.0 on the same machine, in a separate conda env, installed the same way from source. Maybe they interfere with each other somehow; I don't know.
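One quick way to rule out interference between the two installations is to check which files Python would actually resolve, without importing anything. A sketch using stdlib importlib (json is used here only as a stand-in so the snippet runs anywhere; swap in "tvm" in your env):

```python
import importlib.util
import sys

print(sys.executable)  # which interpreter (and hence which conda env) is running

# Locate the module file without importing it; replace "json" with "tvm" to debug.
spec = importlib.util.find_spec("json")
print(spec.origin)  # path to the __init__.py that would be loaded
```

If spec.origin points into the 0.18.0 env's site-packages, the two installs are shadowing each other.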
Great. It works either when importing as "from tvm.relax.frontend.onnx import from_onnx" (even if later I do "from tvm import relax" and call "relax.frontend.onnx.from_onnx(onnx_model, shape_dict)"), or when putting "from . import onnx" in python/tvm/relax/frontend/__init__.py.
It looks like there is some trouble with importing the onnx module. Should I report this in the GitHub issues?