opt_level=1 silently ignores the divide-by-zero error

Hi all,

When I compiled a computational graph containing a division by zero with opt_level=2 or 3, the compilation crashed and threw the "TVMError: !b.is_const(0): divide by zero" error message, as expected.

However, with opt_level=1 the compilation passed without any error message, which seems strange.

The graph is as follows:

fn (%p0: int64, Primitive=1) -> int64 {
  divide(%p0, 0 /* ty=int64 */) /* ty=int64 */
}

Reproducible script:

import tvm
from tvm import relay

var_0 = relay.var("var_0", dtype="int64", shape=())
const_1 = relay.const(0, dtype="int64")
var_2 = relay.divide(var_0, const_1)           # division by a constant zero
out = relay.Tuple([var_2])                     # renamed to avoid shadowing the built-in `tuple`
F = relay.Function([var_0], out)
mod = tvm.IRModule()
mod["main"] = F

with tvm.transform.PassContext(opt_level=1):   # opt_level=1: runs fine; opt_level=2,3: crash
    ex = relay.create_executor("vm", mod=mod, target="llvm")
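To check whether the divide by zero is merely unreported at compile time under opt_level=1 and only shows up when the function is actually run, one can execute the compiled function. This is a minimal sketch continuing the script above; the input value 5 is an arbitrary choice, and I have not confirmed what the run-time behavior is.

import numpy as np

# Continue from the module built above; actually run the function once
# to see whether the division by zero surfaces at run time instead.
with tvm.transform.PassContext(opt_level=1):
    ex = relay.create_executor("vm", mod=mod, target="llvm")
    result = ex.evaluate()(np.array(5, dtype="int64"))   # arbitrary scalar input
    print(result)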

The following script has a similar problem:

opt_level=1: runs fine, with no warning or crash message. Why?

opt_level=2,3: crashes with "Floating point exception (core dumped)".

import tvm
from tvm import relay

const_1 = relay.const(1, dtype="int32")
const_2 = relay.const(0, dtype="int32")
const_3 = relay.divide(const_1, const_2)       # constant 1 / 0
F = relay.Function([], const_3)
mod = tvm.IRModule()
mod["main"] = F
mod = relay.transform.InferType()(mod)
print(mod)

with tvm.transform.PassContext(opt_level=1):   # opt_level=1: runs fine; opt_level=2,3: crash
    ex = relay.create_executor("vm", mod=mod, target="llvm")
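My guess, which I have not verified against the pass registry, is that the difference comes from constant folding: if FoldConstant is only enabled at opt_level >= 2, then the constant 1/0 is only evaluated (and the process killed by the integer-division SIGFPE) when that pass runs. A minimal sketch of how one might test this guess is below; the assumption is that invoking the pass directly hits the same division at fold time.

# Sketch (assumption: FoldConstant is the opt_level>=2 pass that evaluates the
# constant division; invoking it directly should hit the same divide by zero).
import tvm
from tvm import relay

const_1 = relay.const(1, dtype="int32")
const_2 = relay.const(0, dtype="int32")
F = relay.Function([], relay.divide(const_1, const_2))
mod = tvm.IRModule.from_expr(F)
mod = relay.transform.InferType()(mod)
mod = relay.transform.FoldConstant()(mod)   # expected to trigger the crash here
print(mod)

If that does crash regardless of the PassContext, it would suggest opt_level=1 is not "ignoring" the error so much as never reaching the code path that detects it.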