Standalone execution is possible after all?

Thank you for visiting my question.

I’m trying to run inference on bare-metal devices. More specifically, I’d like to use microTVM (or the outputs from microTVM) on MCUs.

However, I went through some of the tutorials and notebooks created by the developers:

and found that in all of them the MCU has to stay connected to the host machine.

Is it possible for me to flash a binary onto an MCU and run inference without connecting it to a host PC, just as I can with TFLite Micro?

I’m not sure in which cases an MCU has to communicate with the host PC to run inference, and in which cases it can run inference standalone, without being connected to the host PC.

Could anyone help me understand the mechanism?

Hi @sho ,

If you’re looking for a demo of a standalone application running on an MCU, you could take a look at . Although this demo shows how to use TVM to run a model and offload operators to the microNPU, it should be a good starting point for running standalone on bare metal.

@grant-arm Thanks a lot for the link. I’ll have a look at the repo and see whether standalone execution is possible for other MCUs as well.