Avoid tuning pretuned layers that are in mali_v0.05.log

Hi,

I am trying to tune the ResNet50 (224, 224, 3) Keras model, created with keras.applications.resnet50.ResNet50(include_top=True, input_shape=(224, 224, 3), classes=1000).

However, AutoTVM is still tuning layers that were already tuned in mali_v0.05.log. Is this normal, and how can I avoid tuning the pretuned layers?

Thanks!

AutoTVM doesn’t check whether a task has already been tuned and is available on TopHub or elsewhere. As long as you run tuning on a task, AutoTVM will tune it.

In your case, if you know all ResNet-50 ops have already been tuned and are available on TopHub (which is actually true), you can skip AutoTVM entirely and just follow the tutorial you posted to build and deploy.
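For reference, a minimal sketch of what that looks like, assuming `mod` and `params` come from `relay.frontend.from_keras` and a reasonably recent TVM. Note that `relay.build` already falls back to TopHub automatically when no other dispatch context is active; the explicit `tophub.context` below just makes the behavior visible:

```python
import tvm
from tvm import relay, autotvm

# Assumed inputs: `mod` and `params` from relay.frontend.from_keras(...).
# The host triple is an assumption for an RK3399 board; adjust as needed.
target = tvm.target.Target(
    "opencl -device=mali -model=rk3399",
    host="llvm -mtriple=aarch64-linux-gnu",
)

# tophub.context() downloads the pretuned packages (e.g. mali_v0.05.log)
# and applies the best records during compilation.
with autotvm.tophub.context(target):
    with tvm.transform.PassContext(opt_level=3):
        lib = relay.build(mod, target=target, params=params)
```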

Hi @comaniac,

Thanks for the quick response. I actually got warnings for two layers:

```
Cannot find config for target=opencl -device=mali -model=rk3399, workload=('conv2d', (1, 3, 230, 230, 'float32'), (64, 3, 7, 7, 'float32'), (2, 2), (0, 0), (1, 1), 'NCHW', 'float32'). A fallback configuration is used, which may bring great performance regression.
Cannot find config for target=opencl -device=mali -model=rk3399, workload=('dense', (1, 2048, 'float32'), (1000, 2048, 'float32'), 0, 'float32'). A fallback configuration is used, which may bring great performance regression.
```

I’ve checked that the conv2d (1, 3, 230, 230) workload is not in mali_v0.05.log; however, the dense (1, 2048) workload is in mali_v0.05.log.

I’ve checked the Mali dense schedule and it does query TopHub, so I have no idea why it failed to find the dense config. This will need more investigation.
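If you want to verify locally which workloads the package actually contains, here is a quick sketch that dumps the recorded workloads. The cache path is an assumption; TopHub usually downloads packages to ~/.tvm/tophub/:

```python
import os
from tvm import autotvm

# Assumed cache location for the downloaded TopHub package.
log_path = os.path.expanduser("~/.tvm/tophub/mali_v0.05.log")

# Collect the distinct workloads that have tuning records in the log.
workloads = set()
for inp, _res in autotvm.record.load_from_file(log_path):
    workloads.add(inp.task.workload)

# Print them so you can grep for the missing dense/conv2d shapes.
for wkl in sorted(workloads, key=str):
    print(wkl)
```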


Got it, thanks for the help!