Hello,
I see in the sources a special device type kDLCPUPinned for pinned (page-locked) host memory, but there are no examples or explanations. I tried to set the device_type when creating a module, like this:
ctx = tvm.gpu(0)
ctx.device_type = 3  # kDLCPUPinned = kDLCPU | kDLGPU
module = runtime.create(graph, lib, ctx)
But then module.run() fails as follows:
File "./test.py", line 314, in test_main
module.run()
File "/mnt/tvm/python/tvm/contrib/graph_runtime.py", line 206, in run
self._run()
File "/mnt/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
[bt] (4) /mnt/tvm/build/libtvm.so(TVMFuncCall+0x65) [0x7f4fcf159c25]
[bt] (3) /mnt/tvm/build/libtvm.so(tvm::runtime::GraphRuntime::Run()+0x37) [0x7f4fcf1e2607]
[bt] (2) /mnt/tvm/build/libtvm.so(+0x178b587) [0x7f4fcf1e2587]
[bt] (1) /mnt/tvm/build/libtvm.so(+0x1715695) [0x7f4fcf16c695]
[bt] (0) /mnt/tvm/build/libtvm.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x82) [0x7f4fce530b82]
File "/mnt/tvm/src/runtime/library_module.cc", line 78
TVMError:
---------------------------------------------------------------
An internal invariant was violated during the execution of TVM.
Please read TVM's error reporting guidelines.
More details can be found here: https://discuss.tvm.ai/t/error-reporting/7793.
---------------------------------------------------------------
Check failed: ret == 0 (-1 vs. 0) : Assert fail: (2 == tir.tvm_struct_get(arg0, 0, 10)), Argument arg0.device_type has an unsatisfied constraint: (2 == tir.tvm_struct_get(arg0, 0, 10))
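For what it's worth, my reading of the failed check is that the fused GPU kernels were compiled to assert arg0.device_type == 2 (kDLGPU), so overriding the context's device_type to 3 makes every argument trip that assert. What I was hoping for is something like the sketch below: keep the module on the GPU context the kernels were compiled for, and allocate only the input NDArray on a pinned context (device type 3, kDLCPUPinned in DLPack). I don't know whether tvm.context(3, 0) and set_input actually support this; that is essentially my question. The input name "data" and the shape are placeholders for my real network.

import numpy as np
import tvm
from tvm.contrib import graph_runtime

# graph and lib come from the same build step as in my failing script.
# The module itself stays on the plain GPU context (device_type 2).
gpu_ctx = tvm.gpu(0)
module = graph_runtime.create(graph, lib, gpu_ctx)

# Only the input is allocated in pinned host memory; device type 3 is
# kDLCPUPinned in DLPack, and tvm.context() accepts the raw integer.
pinned_ctx = tvm.context(3, 0)
x = tvm.nd.array(np.random.rand(1, 3, 224, 224).astype("float32"),
                 ctx=pinned_ctx)

module.set_input("data", x)
module.run()
out = module.get_output(0)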
I’d very much appreciate any hints, ideas, suggestions, or examples. Thank you very much.
Lev.