Calling tvmc tune with Android RPC causes the RPC server on the phone to restart

Hello, I have built the TVM runtime for aarch64 and created tvm4j. Then I installed the TVM RPC app on the phone, started the RPC tracker on my local computer, started the RPC app on the phone, and entered the correct values, so a simple test that requests an RPC connection works fine. However, when I call the tune command, the APK switches from RpcActivity back to MainActivity, and when I watch query_rpc I can see that the RPC connections keep restarting, so the tuning cannot complete. In the Android logcat I can see the following messages:


W/System.err: relaunching RPC activity...
W/System.err: updating preferences...
W/System.err: MainActivity onResume...
W/System.err: relaunching RPC activity...
W/System.err: updating preferences...
W/System.err: MainActivity onResume...
W/System.err: relaunching RPC activity...
W/System.err: updating preferences...
W/System.err: MainActivity onResume...
W/System.err: relaunching RPC activity...
W/System.err: updating preferences...
W/System.err: MainActivity onResume...
W/System.err: relaunching RPC activity...

My command to tune:

TVM_LIBRARY_PATH=/home/piotr/projects/odai/tvm/tvm/build-sys-llvm12 python3 \
    -m tvm.driver.tvmc tune \
    --number 10 \
    --repeat 10 \
    --rpc-key android \
    --rpc-tracker 0.0.0.0:9190 \
    --output output_austin.log \
    --target llvm \
    --timeout 60 \
    --trials 47 \
    --enable-autoscheduler \
    --input-shapes "input_tensor:[1,288,512,3]" \
    --desired-layout NHWC \
    model_288_512.pb

I also get the following stack trace:

tvm/src/auto_scheduler/compute_dag.cc:1375: Warning: InferBound fails on the state:
Placeholder: placeholder, placeholder, placeholder
parallel n.0@oho.0@owo.0@oco.0@ohi.0@owi.0@oci.0@ (0,16)
  conv.local auto_unroll: 512
  for n_c.0 (None)
    for oho_c.0 (None)
      for owo_c.0 (None)
        for oco_c.0 (None)
          for ohi_c.0 (None)
            for owi_c.0 (None)
              for oci_c.0 (None)
                for n_c.1 (None)
                  for oho_c.1 (None)
                    for owo_c.1 (None)
                      for oco_c.1 (None)
                        for n (None)
                          for oho (None)
                            for owo (None)
                              for ohi (None)
                                for owi (None)
                                  vectorize ic (None)
                                    data_vec = ...
                        for ohi_c.1 (None)
                          for owi_c.1 (None)
                            for oci_c.1 (None)
                              for i0 (None)
                                for i1 (None)
                                  for i2 (None)
                                    vectorize i3 (None)
                                      PadInput = ...
                              for ic.0 (None)
                                for kh.0 (None)
                                  for kw.0 (None)
                                    for oco (None)
                                      for kh (None)
                                        for kw (None)
                                          vectorize ic@oci@ (None)
                                            kernel_vec = ...
                                    for n_c.2 (None)
                                      for oho_c.2 (None)
                                        for owo_c.2 (None)
                                          for oco_c.2 (None)
                                            for ohi_c.2 (None)
                                              for owi_c.2 (None)
                                                for oci_c.2 (None)
                                                  for ic.1 (None)
                                                    for kh.1 (None)
                                                      for kw.1 (None)
                                                        for n_c.3 (None)
                                                          for oho_c.3 (None)
                                                            for owo_c.3 (None)
                                                              for oco_c.3 (None)
                                                                for ohi_c.3 (None)
                                                                  for owi_c.3 (None)
                                                                    vectorize oci_c.3 (None)
                                                                      conv.local = ...
  for oho.1 (0,18)
    for owo.1 (0,2)
      for oco.1 (0,16)
        conv = ...
parallel ax0@ax1@ax2@ (0,576)
  vectorize ax3 (0,16)
    T_relu = ...

with: [13:04:16] /home/piotr/projects/odai/tvm/tvm/src/te/schedule/bound.cc:175: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (found_attach || stage_attach.size() == 0) is false: Invalid Schedule, cannot find the producer compute(PadInput, body=[tir.if_then_else(((((i1 >= 0) && (i1 < 36)) && (i2 >= 0)) && (i2 < 64)), placeholder[i0, i1, i2, i3], 0f)], axis=[iter_var(i0, range(min=0, ext=1)), iter_var(i1, range(min=0, ext=37)), iter_var(i2, range(min=0, ext=65)), iter_var(i3, range(min=0, ext=16))], reduce_axis=[], tag=injective,pad, attrs={}) along the loop nest specified by compute_at of consumer compute(data_vec, body=[PadInput[n, ((oho*2) + ohi), ((owo*2) + owi), ic]], axis=[iter_var(n, range(min=0, ext=1)), iter_var(oho, range(min=0, ext=18)), iter_var(owo, range(min=0, ext=32)), iter_var(ohi, range(min=0, ext=3)), iter_var(owi, range(min=0, ext=3)), iter_var(ic, range(min=0, ext=16))], reduce_axis=[], tag=, attrs={})
Stack trace:
  0: tvm::te::InferRootBound(tvm::te::Stage const&, tvm::te::GraphContext const&, std::unordered_map<tvm::tir::IterVar, tvm::Range, std::hash<tvm::tir::IterVar>, std::equal_to<tvm::tir::IterVar>, std::allocator<std::pair<tvm::tir::IterVar const, tvm::Range> > >*)
  1: tvm::te::InferBound(tvm::te::Schedule const&)
  2: tvm::auto_scheduler::ComputeDAG::InferBound(tvm::auto_scheduler::State const&) const
  3: tvm::auto_scheduler::ComputeDAG::InferBound(tvm::runtime::Array<tvm::auto_scheduler::State, void> const&) const::{lambda(int)#1}::operator()(int) const
  4: _ZNSt17_Function_handlerIFSt10unique_ptrINSt13__future_base12_Result_baseENS2_8_DeleterEEvENS1_12_Task_setterIS0_INS1_7_ResultIvEES3_EZNS1_11_Task_stateIZN3tvm7support12parallel_for
  5: std::__future_base::_State_baseV2::_M_do_set(std::function<std::unique_ptr<std::__future_base::_Result_base, std::__future_base::_Result_base::_Deleter> ()>*, bool*)
  6: __pthread_once_slow
        at /build/glibc-eX1tMB/glibc-2.31/nptl/pthread_once.c:116
  7: void std::call_once<void (std::__future_base::_State_baseV2::*)(std::function<std::unique_ptr<std::__future_base::_Result_base, std::__future_base::_Result_base::_Deleter> ()>*, bool*), std::__future_base::_State_baseV2*, std::function<std::unique_ptr<std::__future_base::_Result_base, std::__future_base::_Result_base::_Deleter> ()>*, bool*>(std::once_flag&, void (std::__future_base::_State_baseV2::*&&)(std::function<std::unique_ptr<std::__future_base::_Result_base, std::__future_base::_Result_base::_Deleter> ()>*, bool*), std::__future_base::_State_baseV2*&&, std::function<std::unique_ptr<std::__future_base::_Result_base, std::__future_base::_Result_base::_Deleter> ()>*&&, bool*&&)
  8: std::thread::_State_impl<std::thread::_Invoker<std::tuple<std::packaged_task<void (std::vector<int, std::allocator<int> > const&, std::function<void (int)> const&)>, std::vector<int, std::allocator<int> >, std::function<void (int)> > > >::_M_run()
  9: 0x00007feb1942ede3
  10: start_thread
        at /build/glibc-eX1tMB/glibc-2.31/nptl/pthread_create.c:477
  11: __clone
  12: 0xffffffffffffffff

But I think this warning should be an error: `WARNING:root:Could not find any valid schedule for task Task`

When I turn off the auto-scheduler, I can review the log file:

Traceback (most recent call last):
  48: 0xffffffffffffffff
  47: _start
  46: __libc_start_main
  45: Py_BytesMain
  44: Py_RunMain
  43: 0x00000000006b6fa1
  42: PyObject_Call
  41: _PyFunction_Vectorcall
  40: _PyEval_EvalCodeWithName
  39: _PyEval_EvalFrameDefault
  38: _PyFunction_Vectorcall
  37: _PyEval_EvalCodeWithName
  36: _PyEval_EvalFrameDefault
  35: 0x00000000005c552f
  34: 0x0000000000600f53
  33: PyEval_EvalCode
  32: _PyEval_EvalCodeWithName
  31: _PyEval_EvalFrameDefault
  30: _PyFunction_Vectorcall
  29: _PyEval_EvalCodeWithName
  28: _PyEval_EvalFrameDefault
  27: PyObject_Call
  26: _PyFunction_Vectorcall
  25: _PyEval_EvalCodeWithName
  24: _PyEval_EvalFrameDefault
  23: 0x000000000050ad7b
  22: _PyFunction_Vectorcall
  21: _PyEval_EvalFrameDefault
  20: 0x00000000005c552f
  19: 0x00000000004f5162
  18: 0x0000000000500992
  17: _PyEval_EvalFrameDefault
  16: _PyFunction_Vectorcall
  15: _PyEval_EvalFrameDefault
  14: _PyFunction_Vectorcall
  13: _PyEval_EvalFrameDefault
  12: _PyFunction

My model contains image.resize2d, which produces the following warning before autotuning:

[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d
[14:23:26] /home/piotr/projects/odai/tvm/tvm/src/relay/transforms/convert_layout.cc:99: Warning: Desired layout(s) not specified for op: image.resize2d

I think this is the main problem.