import tvm

# array holds the per-element reduction lengths, so it needs an integer dtype
# to be usable as a reduce-axis bound
array = tvm.placeholder((10,), name='array', dtype='int32')
A = tvm.placeholder((10,), name='A')

def fcompute(i):
    # reduce axis whose extent depends on a tensor value
    r = tvm.reduce_axis((0, array[i]), name='r')
    return tvm.sum(A[r], axis=[r])

ss = tvm.compute((10,), fcompute, name='s')
s = tvm.create_schedule(ss.op)
AS = s.cache_read(array, 'shared', [ss])
tvm.lower(s, [A, array, ss], simple_mode=True)
Here the extent of the reduce axis depends on a tensor value (array[i]), and I want to cache array in a shared-memory tensor.
(I haven't written the rest of the schedule for the shared tensor yet, but that's not the point.)
TVM fails to lower it with the following traceback:
Traceback (most recent call last):
  File "reduce_test.py", line 11, in <module>
    tvm.lower(s, [A, array, ss], simple_mode=True)
  File "/home/kirlia/tvm/python/tvm/build_module.py", line 362, in lower
    stmt = form_body(sch)
  File "/home/kirlia/tvm/python/tvm/build_module.py", line 308, in form_body
    bounds = schedule.InferBound(sch)
  File "/home/kirlia/tvm/python/tvm/_ffi/_ctypes/function.py", line 185, in __call__
    ctypes.byref(ret_val), ctypes.byref(ret_tcode)))
  File "/home/kirlia/tvm/python/tvm/_ffi/base.py", line 71, in check_call
    raise TVMError(py_str(_LIB.TVMGetLastError()))
tvm._ffi.base.TVMError: [14:44:54] /home/kirlia/tvm/src/schedule/bound.cc:158: Check failed: it != rmap->end()
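For reference, the only workaround I can think of is to give the reduce axis a fixed extent and push the data-dependent bound into a predicate, so InferBound never has to reason about a tensor-valued range. This is just a sketch (it assumes the where argument that comm_reducer-based reducers like tvm.sum accept), not what I actually want:

import tvm

n = 10
array = tvm.placeholder((n,), name='array', dtype='int32')
A = tvm.placeholder((n,), name='A')

def fcompute(i):
    # Fixed extent: InferBound sees a constant range, and the
    # data-dependent bound becomes a predicate on the reduction.
    r = tvm.reduce_axis((0, n), name='r')
    return tvm.sum(A[r], axis=[r], where=r.var < array[i])

ss = tvm.compute((n,), fcompute, name='s')
s = tvm.create_schedule(ss.op)
AS = s.cache_read(array, 'shared', [ss])
print(tvm.lower(s, [A, array, ss], simple_mode=True))

But this always loops over the full extent n, which is exactly the waste I'm trying to avoid by bounding the reduction with array[i].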
Any ideas? Thanks in advance.