PyTorch model fails to compile when TVM is imported

Hi, I want to compile a PyTorch model with TVM. I am following this documentation:

Compile PyTorch Models — tvm 0.12.dev0 documentation (apache.org)

Here is how the pretrained model is loaded and scripted_model is created in that doc:

import tvm
from tvm import relay
import numpy as np
from tvm.contrib.download import download_testdata

# PyTorch imports
import torch
import torchvision
model_name = "resnet18"
model = getattr(torchvision.models, model_name)(pretrained=True)
model = model.eval()
# We grab the TorchScripted model via tracing
input_shape = [1, 3, 224, 224]
input_data = torch.randn(input_shape)
scripted_model = torch.jit.trace(model, input_data).eval()
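
For context, the step I ultimately want to reach is the Relay import that follows the trace in that same tutorial, roughly like this (the input name "input0" is the tutorial's convention):

input_name = "input0"
shape_list = [(input_name, input_shape)]
# Convert the TorchScript module into a Relay module and parameter dict
mod, params = relay.frontend.from_pytorch(scripted_model, shape_list)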

For me, the code fails at this line:
scripted_model = torch.jit.trace(model, input_data).eval()

With this error:

libc++abi: terminating with uncaught exception of type pybind11::stop_iteration:
Aborted

The error seems to be caused by tvm: when I remove all tvm-related imports from this code, the scripted_model is generated without any error.
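
If it helps with debugging, this is the kind of narrowed script I plan to test next (a sketch, not something I have run yet; a toy module instead of resnet18, to rule out torchvision itself):

import tvm  # suspected trigger, based on the resnet18 reproduction above
import torch

# Minimal module instead of resnet18
class Tiny(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x) + 1

# Expectation: this trace aborts with pybind11::stop_iteration while tvm is imported,
# and succeeds once the tvm import is removed
traced = torch.jit.trace(Tiny().eval(), torch.randn(1, 4))
print(traced.graph)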

I face the same problem when trying to run tests inside the tests/python/frontend/pytorch/ folder. Here is an example:

tests/python/frontend/pytorch/qnn_test.py::test_quantized_modules libc++abi: terminating with uncaught exception of type pybind11::stop_iteration:
Fatal Python error: Aborted

Current thread 0x00007fb0056fc3c0 (most recent call first):
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/torch/jit/_trace.py", line 417 in graph_diagnostic_info
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/torch/jit/_trace.py", line 560 in _check_trace
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115 in decorate_context
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/torch/jit/_trace.py", line 1084 in trace_module
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/torch/jit/_trace.py", line 794 in trace
  File "/local/workspace/fateme/Projects/tvm/tests/python/frontend/pytorch/qnn_test.py", line 330 in test_quantized_modules
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/python.py", line 195 in pytest_pyfunc_call
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/python.py", line 1789 in runtest
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 167 in pytest_runtest_call
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 260 in <lambda>
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 339 in from_call
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 259 in call_runtest_hook
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 220 in call_and_report
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 131 in runtestprotocol
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/runner.py", line 112 in pytest_runtest_protocol
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/main.py", line 349 in pytest_runtestloop
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/main.py", line 324 in _main
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/main.py", line 270 in wrap_session
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/main.py", line 317 in pytest_cmdline_main
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/config/__init__.py", line 167 in main
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/_pytest/config/__init__.py", line 190 in console_main
  File "/local/workspace/fateme/venv_torch_tvm/lib/python3.8/site-packages/pytest/__main__.py", line 5 in <module>
  File "/usr/local/lib/python3.8/runpy.py", line 87 in _run_code
  File "/usr/local/lib/python3.8/runpy.py", line 194 in _run_module_as_main

I’ve tried installing torch 1.7 and 2.0 (both via pip and by building from source), but the error persists.
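
One workaround I'm considering, assuming the crash depends on tvm being loaded before torch.jit.trace runs, is to produce the trace first and import tvm only afterwards (a sketch, untested):

import torch
import torchvision

# Trace first, while tvm is not yet imported (tracing works in that case, see above)
model = torchvision.models.resnet18(pretrained=True).eval()
input_shape = [1, 3, 224, 224]
scripted_model = torch.jit.trace(model, torch.randn(input_shape)).eval()

# Only now import tvm and hand the traced module to Relay
import tvm
from tvm import relay
mod, params = relay.frontend.from_pytorch(scripted_model, [("input0", input_shape)])

I'd still like to understand the root cause rather than rely on import order, though.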

I’d appreciate any help with this issue,

–Fateme