Does TVM only support forward propagation?
As far as I know, TVM is not focused on training right now, though in theory it could be adapted for it.
You can see earlier threads discussing the topic [thread 1, thread 2].
Facebook was working on a PyTorch+TVM integration for a while that could accelerate training, but the project is archived now.
As thread 2 shows, OctoML is working on something, but as far as I know it isn't polished and generally available yet.
You can also see that there are some talks at this year's TVM Conference 2021 (Dec 15th-17th) which discuss training.
If I want to use PyTorch to train a simple net (e.g. LeNet), should I import a model that includes back-propagation operators, and then add the new operators in Relay?
Right now I think that the gradients would be ignored by the Relay PyTorch importer. Additionally, adding the operations in Relay might be non-trivial: you'd need to treat the weights as non-parameters (so they can be updated between runs) and have some way of storing the gradients.
You can see various pull requests with the tag [Training] that are currently trying to add nicer support for this at a low level of TVM, but it may be a while until it's easy to add custom trainable ops.
Thank you for your advice. By the way, do you know how to import a model that needs to be trained?