Hi There,
I have the following test network for troubleshooting an accuracy issue. The strange thing is that after relay.build, some weight values in params change very slightly: before relay.build a value is 2.1122932, and after relay.build it becomes 2.1122828. If I remove nn.batch_norm from my network, or add 'annotation.stop_fusion', the issue goes away. A related observation: if I lower opt_level to 2, the issue also goes away. Is this expected behavior? And if yes, what is the purpose of making such a small change to the input params when using opt_level == 3?
Regards
Hua
import tvm
from tvm import relay
from tvm.relay import testing

# Placeholder setup (the original post omits these definitions;
# shapes here are assumed for illustration)
data = relay.var("data", shape=(1, 16, 32, 32))
weight1 = relay.var("weight1", shape=(16, 16, 3, 3))
bn_gamma = [relay.var("bn_gamma", shape=(16,))]
bn_beta = [relay.var("bn_beta", shape=(16,))]
bn_mmean = [relay.var("bn_mean", shape=(16,))]
bn_mvar = [relay.var("bn_var", shape=(16,))]

simple_net = relay.nn.conv2d(data=data, weight=weight1,
                             padding=(1, 1), strides=[1, 1],
                             out_dtype="float32")
simple_net = relay.nn.batch_norm(simple_net, bn_gamma[0], bn_beta[0],
                                 bn_mmean[0], bn_mvar[0])[0]
simple_net = relay.nn.max_pool2d(simple_net, pool_size=[1, 1], strides=[2, 2])
simple_net = relay.Function(relay.analysis.free_vars(simple_net), simple_net)
net, origin_params = testing.create_workload(simple_net)

target = 'llvm'
target_host = 'llvm'
ctx = tvm.cpu(0)

with relay.build_config(opt_level=3, disabled_pass={"AlterOpLayout"}):
    graph, lib, params = relay.build(net,
                                     target=target,
                                     target_host=target_host,
                                     params=origin_params)
# print params and origin_params
(0, 0, 0) origin_params     ['weight1'] [2.1122932 5.5735774 2.0870073]
(0, 0, 0) relay.build params ['weight1'] [2.1122828 5.5735493 2.0869968]
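For what it's worth, here is a plausible arithmetic explanation (my assumption, not confirmed anywhere in this post): at opt_level=3 the batch_norm scale gamma / sqrt(var + eps) can be folded into the convolution weight, and with create_workload's default initialization (gamma = 1, moving_var = 1, eps = 1e-5) that scale is 1/sqrt(1 + 1e-5) ≈ 0.999995, which in float32 reproduces the tiny shift printed above:

```python
import numpy as np

# Assumed batch_norm defaults: gamma = 1, moving_var = 1, eps = 1e-5
eps = np.float32(1e-5)
scale = np.float32(1.0) / np.sqrt(np.float32(1.0) + eps)

before = np.float32(2.1122932)   # weight value before relay.build
after = before * scale           # weight with the BN scale folded in
print(before, "->", after)       # the shift is on the order of 1e-5
```

Folding the scale this way changes each weight by roughly before * 5e-6, which matches the deltas in all three printed values (2.1122932 → 2.1122828, 5.5735774 → 5.5735493, 2.0870073 → 2.0869968), so the change would be an optimization artifact rather than data corruption.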