Tutorial errors

I want to run the tutorial "Deploy the Pretrained Model on Raspberry Pi", but when I ran this code:

local_demo = False

if local_demo:
    target = tvm.target.Target("llvm")
else:
    target = tvm.target.arm_cpu("rasp3b")

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(func, target, params=params)

tmp = util.tempdir()
lib_fname = tmp.relpath("net.tar")
lib.export_library(lib_fname)

I got this:

> "-target" is deprecated, use "-mtriple" instead.
> "-target" is deprecated, use "-mtriple" instead.
> "-target" is deprecated, use "-mtriple" instead.
> "-target" is deprecated, use "-mtriple" instead.
> "-target" is deprecated, use "-mtriple" instead.
> "-target" is deprecated, use "-mtriple" instead.
> Cannot find config for target=llvm -keys=arm_cpu,cpu -device=arm_cpu -mattr=+neon -model=bcm2837 -mtriple=armv7l-linux-gnueabihf, workload=('dense_nopack.x86', ('TENSOR', (1, 512), 'float32'), ('TENSOR', (1000, 512), 'float32'), None, 'float32'). A fallback configuration is used, which may bring great performance regression.

Could someone help me? Thanks very much!

This is not a problem. TVM has handled it automatically (it replaces -target with -mtriple) for you.
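
For reference, tvm.target.arm_cpu("rasp3b") is roughly shorthand for the explicit target string below (the tutorial itself shows this expansion further down), which already uses -mtriple rather than the deprecated -target. A minimal sketch:

# Expanded form of tvm.target.arm_cpu("rasp3b"), matching the tutorial's own comment
target = tvm.target.Target(
    "llvm -device=arm_cpu -model=bcm2837 -mtriple=armv7l-linux-gnueabihf -mattr=+neon"
)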

Thanks for your reply. I have run the tutorial (https://tvm.apache.org/docs/tutorials/frontend/deploy_model_on_rasp.html) step by step. I changed the versions of TVM and LLVM, but the error below still exists:

RPCError: Traceback (most recent call last):
  [bt] (8) /home/ljs/tvm/build/libtvm.so(std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::RPCModuleNode::WrapRemoteFunc(void*)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)+0x33) [0x7ff0e9e88b43]
  [bt] (7) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCWrappedFunc::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const+0x3c5) [0x7ff0e9e885a5]
  [bt] (6) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCClientSession::CallFunc(void*, TVMValue const*, int const*, int, std::function<void (tvm::runtime::TVMArgs)> const&)+0x57) [0x7ff0e9e7c397]
  [bt] (5) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::CallFunc(void*, TVMValue const*, int const*, int, std::function<void (tvm::runtime::TVMArgs)>)+0x215) [0x7ff0e9e73dd5]
  [bt] (4) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::HandleUntilReturnEvent(bool, std::function<void (tvm::runtime::TVMArgs)>)+0x1ab) [0x7ff0e9e72d8b]
  [bt] (3) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::EventHandler::HandleNextEvent(bool, bool, std::function<void (tvm::runtime::TVMArgs)>)+0xd7) [0x7ff0e9e7c0a7]
  [bt] (2) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::EventHandler::HandleProcessPacket(std::function<void (tvm::runtime::TVMArgs)>)+0x126) [0x7ff0e9e7be86]
  [bt] (1) /home/ljs/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::EventHandler::HandleReturn(tvm::runtime::RPCCode, std::function<void (tvm::runtime::TVMArgs)>)+0x13f) [0x7ff0e9e7b3af]
  [bt] (0) /home/ljs/tvm/build/libtvm.so(+0x17ab042) [0x7ff0e9e71042]
  [bt] (8) /home/ubuntu/tvm/build/libtvm.so(+0x103b100) [0xffff9d1bc100]
  [bt] (7) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCServerLoop(int)+0xac) [0xffff9d1bb7b4]
  [bt] (6) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::ServerLoop()+0xe8) [0xffff9d19f0c8]
  [bt] (5) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::HandleUntilReturnEvent(bool, std::function<void (tvm::runtime::TVMArgs)>)+0x258) [0xffff9d19eb40]
  [bt] (4) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::EventHandler::HandleNextEvent(bool, bool, std::function<void (tvm::runtime::TVMArgs)>)+0x1e4) [0xffff9d1a7184]
  [bt] (3) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCEndpoint::EventHandler::HandleProcessPacket(std::function<void (tvm::runtime::TVMArgs)>)+0x168) [0xffff9d1a6ea0]
  [bt] (2) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::RPCSession::AsyncCallFunc(void*, TVMValue const*, int const*, int, std::function<void (tvm::runtime::RPCCode, tvm::runtime::TVMArgs)>)+0x58) [0xffff9d1ba368]
  [bt] (1) /home/ubuntu/tvm/build/libtvm.so(tvm::runtime::LocalSession::CallFunc(void*, TVMValue const*, int const*, int, std::function<void (tvm::runtime::TVMArgs)> const&)+0x74) [0xffff9d1ad9c4]
  [bt] (0) /home/ubuntu/tvm/build/libtvm.so(+0xfbf140) [0xffff9d140140]
  File "/home/ubuntu/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 81, in cfun
    rv = local_pyfunc(*pyargs)
  File "/home/ubuntu/tvm/python/tvm/rpc/server.py", line 69, in load_module
    m = _load_module(path)
  File "/home/ubuntu/tvm/python/tvm/runtime/module.py", line 411, in load_module
    _cc.create_shared(path + ".so", files)
  File "/home/ubuntu/tvm/python/tvm/contrib/cc.py", line 43, in create_shared
    _linux_compile(output, objects, options, cc, compile_shared=True)
  File "/home/ubuntu/tvm/python/tvm/contrib/cc.py", line 205, in _linux_compile
    raise RuntimeError(msg)
  File "/home/ljs/tvm/src/runtime/rpc/rpc_endpoint.cc", line 370
RPCError: Error caught from RPC call:
RuntimeError: Compilation error:
/tmp/tmpvh_4_v0b/net/lib0.o: error adding symbols: File in wrong format
collect2: error: ld returned 1 exit status

Command line: g++ -shared -fPIC -o /tmp/tmpvh_4_v0b/net.tar.so /tmp/tmpvh_4_v0b/net/lib0.o /tmp/tmpvh_4_v0b/net/devc.o

I have replied on your issue before. This happens because the target you created and the compiler used for linking do not match. If you use TVM to compile for ARM but link with g++ on your host x86 machine, or you compile for x86 but link with g++ on the remote ARM machine, you will hit this kind of error.
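
A rough way to confirm the mismatch (a sketch, assuming the net.tar / lib_fname produced by export_library in the snippet above) is to unpack the tar on the machine that built it, run file on the object inside, and compare that with what the board expects:

import os
import subprocess
import tarfile
import tempfile

# Unpack the exported net.tar and inspect the architecture of the object file.
extract_dir = tempfile.mkdtemp()
with tarfile.open(lib_fname) as tar:
    tar.extractall(extract_dir)
print(subprocess.run(
    ["file", os.path.join(extract_dir, "lib0.o")],
    capture_output=True, text=True,
).stdout)
# e.g. "ELF 32-bit LSB relocatable, ARM, ..." for an armv7l build, while a
# 64-bit Raspberry Pi OS expects an "ELF 64-bit ... ARM aarch64" object.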


Thanks! I’m new to AI, so this is difficult for me. To run the tutorial, do I need to rebuild TVM and modify config.cmake, setting “USE_ARM_COMPUTE_LIB” and “USE_ARM_COMPUTE_LIB_GRAPH_RUNTIME” to ON?

Ignore this reply:

No, these two options are not for your case. For your case:

tmp = util.tempdir()
lib_fname = tmp.relpath("net.tar")
# lib.export_library(lib_fname) # remove it

I’m sorry to bother you again. I removed the line and then got a new error. I tried to debug it by myself but failed. If you have some spare time, please help me.

Sorry, my previous reply was wrong. Please add that line back. Even though it is a tar file, we still need export_library. Would you mind pasting all of your code? There is something you have done wrong.
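
To clarify why the export step is still needed even for a ".tar" name, here is a minimal sketch (the names follow the tutorial code pasted below): with a .tar suffix, export_library only packs the generated objects instead of linking them on the host, and the Raspberry Pi's RPC server re-links them with its own local compiler when load_module is called.

tmp = util.tempdir()
lib_fname = tmp.relpath("net.tar")
lib.export_library(lib_fname)         # packs lib0.o and devc.o into net.tar;
                                      # no host-side g++ link happens here
remote.upload(lib_fname)              # the board's RPC server receives the tar
rlib = remote.load_module("net.tar")  # ...unpacks it and links a .so with its own g++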

One more thing I need you to tell me:

  1. /home/ubuntu/tvm/build/ —> Is this the ARM machine or the x86 machine?
  2. /home/ljs/tvm/build/ —> Is this the ARM machine or the x86 machine?

Thank you very much for your reply.

/home/ubuntu/tvm/build/ --> this is my ARM machine (Raspberry Pi 3B). I have installed Ubuntu 18.04.5 64-bit and TVM 0.7 on it.

/home/ljs/tvm/build/ --> this is my x64 machine (the CPU is an AMD® Ryzen 5 4600H with Radeon Graphics × 12). I have built LLVM 12.0.0 and TVM 0.7 on it.

My code is from the TVM tutorial (https://tvm.apache.org/docs/tutorials/frontend/deploy_model_on_rasp.html):

%matplotlib inline

import tvm
from tvm import te
import tvm.relay as relay
from tvm import rpc
from tvm.contrib import util, graph_runtime as runtime
from tvm.contrib.download import download_testdata

from mxnet.gluon.model_zoo.vision import get_model
from PIL import Image
import numpy as np
# one line to get the model
block = get_model("resnet18_v1", pretrained=True)

img_url = "https://github.com/dmlc/mxnet.js/blob/master/data/cat.png?raw=true"
img_name = "cat.png"
img_path = download_testdata(img_url, img_name, module="data")
image = Image.open(img_path).resize((224, 224))

def transform_image(image):
    image = np.array(image) - np.array([123.0, 117.0, 104.0])
    image /= np.array([58.395, 57.12, 57.375])
    image = image.transpose((2, 0, 1))
    image = image[np.newaxis, :]
    return image
x = transform_image(image)

synset_url = "".join(
    [
        "https://gist.githubusercontent.com/zhreshold/",
        "4d0b62f3d01426887599d4f7ede23ee5/raw/",
        "596b27d23537e5a1b5751d2b0481ef172f58b539/",
        "imagenet1000_clsid_to_human.txt",
    ]
)
synset_name = "imagenet1000_clsid_to_human.txt"
synset_path = download_testdata(synset_url, synset_name, module="data")
with open(synset_path) as f:
    synset = eval(f.read())

shape_dict = {"data": x.shape}
mod, params = relay.frontend.from_mxnet(block, shape_dict)
# we want a probability so add a softmax operator
func = mod["main"]
func = relay.Function(func.params, relay.nn.softmax(func.body), None, func.type_params, func.attrs)

batch_size = 1
num_classes = 1000
image_shape = (3, 224, 224)
data_shape = (batch_size,) + image_shape

local_demo = False

if local_demo:
    target = tvm.target.create('llvm')
else:
    target = tvm.target.arm_cpu('rasp3b')
    # The above line is a simple form of
    # target = tvm.target.create('llvm -device=arm_cpu -model=bcm2837 -mtriple=armv7l-linux-gnueabihf -mattr=+neon')

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(func, target, params=params)

# After `relay.build`, you will get three return values: graph,
# library and the new parameter, since we do some optimization that will
# change the parameters but keep the result of model as the same.

# Save the library at local temporary directory.
tmp = util.tempdir()
lib_fname = tmp.relpath('net.tar')
lib.export_library(lib_fname)


# obtain an RPC session from remote device.
if local_demo:
    remote = rpc.LocalSession()
else:
    # The following is my environment, change this to the IP address of your target device
    host = '192.168.43.166'
    port = 9090
    remote = rpc.connect(host, port)

# upload the library to remote device and load it
remote.upload(lib_fname)
rlib = remote.load_module('net.tar')

# create the remote runtime module
ctx = remote.cpu(0)
module = runtime.GraphModule(rlib['default'](ctx))
# set input data
module.set_input('data', tvm.nd.array(x.astype('float32')))
# run
module.run()
# get output
out = module.get_output(0)
# get top1 result
top1 = np.argmax(out.asnumpy())
print('TVM prediction top-1: {}'.format(synset[top1]))

OK, thanks for the information. Please change this line:

target = tvm.target.arm_cpu('rasp3b')

--->
target = tvm.target.create('llvm -device=arm_cpu -model=bcm2837 -mtriple=aarch64-linux-gnu -mattr=+neon -mcpu=cortex-a53')

That should get you past the error.

The issue is that TVM assumes the Raspberry Pi 3B runs a 32-bit OS, but yours is running a 64-bit OS.
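
If you want to double-check which triple a board needs, one quick way (a sketch, not part of the tutorial) is to print the machine architecture in a Python shell on the Pi itself:

import platform

print(platform.machine())
# 'aarch64' -> 64-bit OS: use -mtriple=aarch64-linux-gnu as above
# 'armv7l'  -> 32-bit OS: the default tvm.target.arm_cpu("rasp3b") target is fine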


Thank you so much!!! I ran the tutorial successfully using your advice!!! :blush: :blush: