BFloat16 on Arm

Hi all, I see that TVM has BF16 support for GPUs. Does TVM also support BF16 on Arm CPUs? I have an Arm CPU that supports BF16, and a graph from TensorFlow whose weights are in FP32. Can I run this model in BF16 on the Arm CPU? With the Arm Compute Library, DNNL_DEFAULT_FPMATH_MODE can be set to BF16 to instruct it to dispatch FP32 workloads to BFloat16 kernels. How can I achieve something similar with TVM? Thanks!
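For context, here is a minimal sketch of how the oneDNN/ACL fast-math mode mentioned above is enabled today. DNNL_DEFAULT_FPMATH_MODE is oneDNN's runtime control (oneDNN can dispatch to Arm Compute Library kernels on AArch64); everything else is illustrative:

```shell
# Ask oneDNN (optionally backed by Arm Compute Library on AArch64) to execute
# FP32 primitives using BF16 math where the hardware supports it.
# This is an environment-level switch: the FP32 model itself is unchanged,
# only the kernel dispatch is affected.
export DNNL_DEFAULT_FPMATH_MODE=BF16
# ...then run the oneDNN/ACL-backed workload in this same environment.
```

I am looking for the TVM equivalent of this environment-level switch, i.e. a way to keep FP32 weights but lower the compute to BF16 kernels.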