[Question] Is Batch Normalization necessary during inference?

I understand that TVM does not currently support training deep learning networks. However, TOPI and Relay seem to contain many layers that are mainly needed for training, such as Batch Norm and Dropout.

In the case of Batch Norm, does it affect the inference result? Or can inference be computed without it with no problem?
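For context on what Batch Norm computes at inference time: it applies a fixed affine transform using the running mean and variance recorded during training, so it does change the output values and generally cannot simply be dropped (though it can be folded into a preceding conv/dense layer's weights). A minimal NumPy sketch, where `batch_norm_inference` is a hypothetical helper and the parameter values are illustrative:

```python
import numpy as np

def batch_norm_inference(x, gamma, beta, running_mean, running_var, eps=1e-5):
    # At inference, BN uses stored running statistics, so it is a fixed
    # per-channel affine transform:
    #   y = gamma * (x - running_mean) / sqrt(running_var + eps) + beta
    return gamma * (x - running_mean) / np.sqrt(running_var + eps) + beta

# Illustrative values: with mean=2 and var=1 the inputs are shifted
# and (almost) unit-scaled.
x = np.array([1.0, 2.0, 3.0])
y = batch_norm_inference(x, gamma=1.0, beta=0.0,
                         running_mean=2.0, running_var=1.0)
```

Because this is just a scale-and-shift with constants known at compile time, inference frameworks commonly fuse it into the preceding layer rather than executing a separate BatchNorm op.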