It seems that most forward operators are implemented in TOPI. Are there any plans to support backward operators as well?
With backward operators implemented, both training and inference could benefit from TVM’s optimizations.
For sure! The key, though, is to keep the TOPI operators generic and composable. So, for instance, instead of making both an _avg_pool2d_grad operator and a _conv2d_transpose operator, one would make only the latter and allow specifying a filter: the gradient of average pooling is just a transposed convolution with a uniform filter.
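To make that concrete, here is a minimal NumPy sketch (not actual TOPI code; `avg_pool2d_grad_via_transpose` is a made-up helper) showing that the backward of avg_pool2d is exactly a transposed convolution with a constant 1/(k*k) filter:

```python
# Illustration only: distributing each output gradient uniformly over its
# pooling window is a transposed convolution with a uniform filter.
import numpy as np

def avg_pool2d_grad_via_transpose(out_grad, kernel, stride):
    """Gradient of a 2D average pool w.r.t. its input (single channel,
    no padding), computed as a strided transposed convolution with a
    uniform kernel -- the 'specified filter' from the post above."""
    k, s = kernel, stride
    oh, ow = out_grad.shape
    in_grad = np.zeros((s * (oh - 1) + k, s * (ow - 1) + k))
    w = np.full((k, k), 1.0 / (k * k))   # uniform averaging filter
    for i in range(oh):
        for j in range(ow):
            # scatter-add: each output grad spreads over its input window
            in_grad[i*s:i*s+k, j*s:j*s+k] += out_grad[i, j] * w
    return in_grad

print(avg_pool2d_grad_via_transpose(np.ones((2, 2)), kernel=2, stride=2))
# every input element receives 0.25, as expected for a 2x2 average pool
```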
It’s also “in the works” to allow expanding NNVM nodes into primitive ops that already have gradients defined, which would yield a sort of “autograd” for free.
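Here is a toy Python sketch of that expansion idea (this is not NNVM’s actual mechanism; the `GRAD` and `EXPAND` tables and the `grad` helper are invented for illustration): a composite op with no registered gradient is expanded into primitives that do have gradients, and differentiating those gives the composite’s gradient for free.

```python
# Hypothetical sketch: autograd-by-expansion into primitive ops.
import numpy as np

# primitive gradient rules: (inputs, output, out_grad) -> input grads
GRAD = {
    "mul": lambda ins, out, og: [og * ins[1], og * ins[0]],
    "exp": lambda ins, out, og: [og * out],
}
FORWARD = {"mul": lambda a, b: a * b, "exp": np.exp}

# composite ops expand into primitives instead of defining their own grad
EXPAND = {
    "square": lambda ins: ("mul", [ins[0], ins[0]]),  # square(x) = mul(x, x)
}

def grad(op, inputs, out_grad):
    if op in GRAD:                       # primitive: use its registered rule
        out = FORWARD[op](*inputs)
        return GRAD[op](inputs, out, out_grad)
    prim_op, prim_ins = EXPAND[op](inputs)   # composite: expand, then recurse
    prim_grads = grad(prim_op, prim_ins, out_grad)
    # x appears twice in mul(x, x), so its partial grads accumulate
    return [sum(g for pi, g in zip(prim_ins, prim_grads) if pi is inputs[0])]

x = np.array(3.0)
print(grad("square", [x], np.array(1.0)))  # -> [6.0], i.e. d(x^2)/dx = 2x
```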
For most operators, the backward pass is implemented as a gradient attribute that composes other existing operators, rather than as a standalone backward operator. An example: https://github.com/dmlc/nnvm/blob/master/src/top/tensor/elemwise.cc#L49
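For readers who don’t want to dig through the C++, here is a rough Python paraphrase of the pattern at that link (the real registration is C++ and the `OPS`/`FGRADIENT` tables here are invented for illustration): since d/dx exp(x) = exp(x), the gradient of exp is expressed as elemwise_mul of the output gradient and the forward output, so no dedicated exp_grad operator is needed.

```python
# Hypothetical paraphrase of the gradient-as-attribute pattern.
import numpy as np

OPS = {"exp": np.exp, "elemwise_mul": np.multiply}

# the "attribute": backward built entirely from ops that already exist
FGRADIENT = {
    "exp": lambda node_out, out_grad: OPS["elemwise_mul"](out_grad, node_out),
}

x = np.array([0.0, 1.0])
y = OPS["exp"](x)
print(FGRADIENT["exp"](y, np.ones_like(y)))  # equals exp(x), reusing existing ops
```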