Runtime memory calculation

Hi @tvm_user, I want to determine the exact runtime memory required by my models to perform inference on a single sample. I expect this would help me judge whether a given model can run inference on an edge platform or not.

As per my understanding, there are three types of memory requirements:

  1. To store the model weights
  2. Input and output variables
  3. Intermediate calculations

Can you suggest some tools with which I can obtain these memory requirements for models performing inference through the C++ API?