Hi All,
I’ve followed the deployment tutorials as well as the sample code provided at apps/howto_deploy successfully, and everything works well. I develop and build on linux/x86 and deploy to aarch64/android natively as a C++ module compiled to a .so, together with the necessary libtvm_runtime.so.
However, the example provided in apps/howto_deploy/cpp_deploy.cc does not show how to read any metadata about the neural network I deployed. I want to check input and output shapes and data types, and count the number of ops, tensors, etc. This is required by the larger software context into which I want to integrate TVM.
I’ve also saved the execution graph as a JSON file, but couldn’t find a hard spec for this JSON.
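For concreteness, here is roughly what I mean by reading metadata. This is a sketch in Python against an illustrative fragment of the graph JSON as I understand its layout from the files I saved (keys like "nodes", "arg_nodes", "heads", and the "attrs"/"shape"/"dltype" lists); since I couldn’t find a spec, I can’t tell whether these field names are stable across TVM versions:

```python
import json

# Illustrative fragment of a graph-executor graph JSON. The structure below is
# my guess from inspecting a saved graph; field names are NOT from any spec.
graph_json = """
{
  "nodes": [
    {"op": "null", "name": "data", "inputs": []},
    {"op": "tvm_op", "name": "fused_nn_dense",
     "attrs": {"func_name": "fused_nn_dense", "num_inputs": "2", "num_outputs": "1"},
     "inputs": [[0, 0, 0], [2, 0, 0]]},
    {"op": "null", "name": "weight", "inputs": []}
  ],
  "arg_nodes": [0, 2],
  "heads": [[1, 0, 0]],
  "attrs": {
    "shape": ["list_shape", [[1, 64], [1, 10], [10, 64]]],
    "dltype": ["list_str", ["float32", "float32", "float32"]]
  }
}
"""

g = json.loads(graph_json)
nodes = g["nodes"]
# One shape/dtype entry per tensor; for single-output nodes this aligns with
# the node index (multi-output nodes would need the node_row_ptr mapping).
shapes = g["attrs"]["shape"][1]
dtypes = g["attrs"]["dltype"][1]

# Inputs and parameters appear to be the "null" nodes listed in arg_nodes.
for idx in g["arg_nodes"]:
    print("input:", nodes[idx]["name"], shapes[idx], dtypes[idx])

# Outputs seem to be referenced by "heads" as [node_id, output_index, version].
for node_id, out_idx, _ in g["heads"]:
    print("output:", nodes[node_id]["name"], shapes[node_id], dtypes[node_id])

num_ops = sum(1 for n in nodes if n["op"] != "null")
print("ops:", num_ops, "tensors:", len(shapes))
```

Parsing the JSON like this works for my current model, but I’d much rather query this information through a supported runtime API from C++ than rely on an undocumented format.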
Can you suggest some way to do this?
Thanks