I want to use MetaSchedule to tune the TOPI expressions, but I ran into this problem.
Here is my source code:
Why does `sch` become `NoneType`, and how can I solve this problem? Thanks!

Here is the TOPI code. The strangest thing is that when I comment out the qk calculation and just return q or k, it works fine.
So what's wrong with my code, or with MetaSchedule?
q_weight = topi.transpose(self.linear0_weight, axes=[1,0])
q_weight = topi.broadcast_to(q_weight, shape=[8,768,768]) # broadcast the transposed weight
q = topi.nn.batch_matmul(self.data, q_weight, transpose_b=False) #[8,128,768]
q = topi.reshape(q, [8,128,768])
#q = topi.nn.bias_add(q, self.linear0_bias, axis=-1)
q_bias = topi.expand_dims(self.linear0_bias, axis=0, num_newaxis=2)
q = topi.add(q, q_bias)
q = topi.reshape(q, [8,128,12,64])
q = topi.transpose(q, axes=[0,2,1,3]) #[8,12,128,64]
k_weight = topi.transpose(self.linear1_weight, axes=[1,0])
k_weight = topi.broadcast_to(k_weight, shape=[8,768,768]) # broadcast the transposed weight
k = topi.nn.batch_matmul(self.data, k_weight, transpose_b=False)
k = topi.reshape(k, [8,128,768])
#k = topi.nn.bias_add(k, self.linear1_bias, axis=-1)
k_bias = topi.expand_dims(self.linear1_bias, axis=0, num_newaxis=2)
k = topi.add(k, k_bias)
k = topi.reshape(k, [8,128,12,64])
k = topi.transpose(k, axes=[0,2,1,3])
k = topi.transpose(k, axes=[0,1,3,2]) #[8,12,64,128]
q = topi.reshape(q, newshape=[96,128,64]) #[96,128,64]
k = topi.reshape(k, newshape=[96,64,128]) #[96,64,128]
qk = topi.nn.batch_matmul(q, k, transpose_b=False) #[96,128,128]
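For reference, here is a minimal NumPy sketch of the same Q/K shape flow, which confirms the shapes annotated in the comments above. It assumes `topi.nn.batch_matmul(A, B, transpose_b=False)` computes a plain batched `[B, M, K] @ [B, K, N]` product; the zero-filled tensors are placeholders for `self.data`, `self.linear0_weight`, and `self.linear0_bias`.

```python
import numpy as np

# Placeholder inputs matching the shapes in the TOPI code above.
data = np.zeros((8, 128, 768), dtype="float32")    # self.data
weight = np.zeros((768, 768), dtype="float32")     # self.linear0_weight
bias = np.zeros((768,), dtype="float32")           # self.linear0_bias

# Transpose the weight, then broadcast it over the batch dimension.
q_weight = np.broadcast_to(weight.T, (8, 768, 768))
q = data @ q_weight                                # [8, 128, 768]
q = q + bias[None, None, :]                        # bias add on the last axis
q = q.reshape(8, 128, 12, 64).transpose(0, 2, 1, 3)  # [8, 12, 128, 64]

k = q.copy()                                       # K follows the same pipeline
k = k.transpose(0, 1, 3, 2)                        # [8, 12, 64, 128]

# Fold (batch, heads) into one batch axis, then take the QK product.
q = q.reshape(96, 128, 64)                         # [96, 128, 64]
k = k.reshape(96, 64, 128)                         # [96, 64, 128]
qk = q @ k                                         # [96, 128, 128]
print(qk.shape)
```

So the shapes themselves are consistent; the failure is not a shape mismatch in the qk matmul.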
According to your log, no valid schedule was found during tuning, so the output can be None.
The current MetaSchedule rules do not support multiple matmuls in a single PrimFunc.
Thanks a lot!
But I tried Ansor later, and Ansor doesn't work either…