[torch] slice with expression range fails

Slicing works fine for fixed ranges (e.g. a[0:2]). If I use an expression instead of a literal number (e.g. a[0:L]), then relay.build fails.

Do we support torch.slice only for fixed ranges?

The code to reproduce the issue:

import torch
import tvm
from tvm import relay
import numpy as np


class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, values, length):
        # Slice with a dynamic upper bound taken from a tensor input.
        return values[0:length]


net = Net()
a = torch.tensor([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
length = torch.tensor(2)
net(a, length)

traced_net = torch.jit.trace(net, (a, length))

ctx = tvm.cpu(0)
target = "llvm"

shape_list = [("input0", [3, 3]), ("input1", [])]
mod, params = relay.frontend.from_pytorch(traced_net, shape_list)

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

func = mod["main"]
intrp = relay.create_executor("graph", ctx=ctx, target=target)
ff = intrp.evaluate(func)
ff(np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]], dtype="int64"),
   np.array(2, dtype="int64"))

Error:

>>> with tvm.transform.PassContext(opt_level=3):
...     lib = relay.build(mod, target=target, params=params)
... 
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/build_module.py", line 269, in build
    graph_json, mod, params = bld_mod.build(mod, target, target_host, params)
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/build_module.py", line 132, in build
    self._build(mod, target, target_host)
  File "/Users/pivovaa/workspace/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  [bt] (8) 9   libtvm.dylib                        0x00000001268bdfa7 tvm::relay::StorageAllocaBaseVisitor::GetToken(tvm::RelayExpr const&) + 23
  [bt] (7) 8   libtvm.dylib                        0x0000000126967388 tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&) + 344
  [bt] (6) 7   libtvm.dylib                        0x00000001266c2a8d tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&) + 173
  [bt] (5) 6   libtvm.dylib                        0x00000001266c2d80 tvm::NodeFunctor<void (tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*)>::operator()(tvm::runtime::ObjectRef const&, tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>*) const + 288
  [bt] (4) 5   libtvm.dylib                        0x00000001268bd3cb tvm::relay::StorageAllocator::VisitExpr_(tvm::relay::CallNode const*) + 491
  [bt] (3) 4   libtvm.dylib                        0x00000001268bdc92 tvm::relay::StorageAllocator::CreateToken(tvm::RelayExprNode const*, bool) + 1010
  [bt] (2) 3   libtvm.dylib                        0x00000001268bf29c tvm::relay::StorageAllocator::Request(tvm::relay::StorageToken*) + 28
  [bt] (1) 2   libtvm.dylib                        0x00000001268bfb73 tvm::relay::StorageAllocator::GetMemorySize(tvm::relay::StorageToken*) + 451
  [bt] (0) 1   libtvm.dylib                        0x0000000125a2c9ff dmlc::LogMessageFatal::~LogMessageFatal() + 111
  File "/Users/pivovaa/workspace/tvm/src/relay/backend/graph_plan_memory.cc", line 292
TVMError: 
---------------------------------------------------------------
An internal invariant was violated during the execution of TVM.
Please read TVM's error reporting guidelines.
More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
---------------------------------------------------------------
  Check failed: pval != nullptr == false: Cannot allocate memory symbolic tensor shape [?, ?]

This is the error you get when you try to run dynamic models with the graph runtime. Try the VM compiler / runtime.
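For reference, a minimal sketch of that switch using the same module (I'm assuming the "vm" executor kind and the ctx-based create_executor signature from the snippet above; treat it as a sketch, not a verified recipe):

# Sketch: run the dynamically shaped model through the Relay VM
# instead of the graph runtime (assumes mod/params from the repro above).
with tvm.transform.PassContext(opt_level=3):
    vm_exec = relay.create_executor("vm", mod=mod, ctx=ctx, target=target)

run = vm_exec.evaluate()
out = run(
    np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]], dtype="int64"),
    np.array(2, dtype="int64"),
)
print(out)

The VM runtime can execute outputs with symbolic shapes like [?, ?], which the graph runtime's static memory planner rejects.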

I updated the TVM code and rebuilt it. Now I'm getting another error:

>>> mod, params = relay.frontend.from_pytorch(traced_net, shape_list)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/frontend/pytorch.py", line 3194, in from_pytorch
    ret = converter.convert_operators(_get_operator_nodes(graph.nodes()), outputs, ret_name)[0]
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/frontend/pytorch.py", line 2615, in convert_operators
    relay_out = relay_op(
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/frontend/pytorch.py", line 403, in slice
    if target_begin == 0 and target_end >= index_size_limit and stride == 1:
  File "/Users/pivovaa/workspace/tvm/python/tvm/relay/expr.py", line 84, in __ge__
    raise TypeError('convert "%s" with `const` first' % str(other))
TypeError: convert "9223372036854775807" with `const` first

Can you try printing the type of target_end?

inputs[3]: free_var %input1: int64;
%input1
inputs[3] type: <class 'tvm.relay.expr.Var'>
input_types[3]: 'int32'

target_end: free_var %input1: int64;
%input1

target_end type: <class 'tvm.relay.expr.Var'>
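So target_end is a relay.expr.Var rather than a Python int, which is why the >= comparison against index_size_limit (2**63 - 1) raises inside the converter. A minimal, self-contained illustration of that failure mode (the variable names just mirror the traceback; this is not the converter code itself):

from tvm import relay

index_size_limit = 2**63 - 1  # the value shown in the TypeError above

# Static bound: a plain Python int compares fine.
static_end = 2
print(static_end >= index_size_limit)   # False

# Dynamic bound: a Relay variable, like %input1 in the trace.
dynamic_end = relay.var("input1", shape=(), dtype="int64")
print(type(dynamic_end))                 # <class 'tvm.relay.expr.Var'>
# dynamic_end >= index_size_limit       # would raise:
# TypeError: convert "9223372036854775807" with `const` first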

Related PR: #7479

Can you open an issue with a repro?