Hi, I have a question about the batch size when running VTA.
Is it possible to run with a batch size of more than 1?
Thank you in advance.
Yes, it is; we are working on a follow-up Relay pass (@MarisaKirisame) that will let us perform batched inference, and I will update you when it's ready.
With the new patch, I will also update the tutorial so that it runs batched inference out of the box.
Please see the following patch to run batched inference: https://github.com/dmlc/tvm/pull/3661
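For reference, here is a minimal sketch of the idea (not the code from the linked patch): the batch size is simply set in the input shape before the Relay module is built. It uses the `tvm.relay.testing` ResNet workload as a placeholder model and a plain LLVM CPU target as assumptions; the actual VTA flow additionally applies VTA-specific graph packing and targets the VTA hardware, which is omitted here.

```python
import tvm
from tvm import relay
from tvm.relay import testing

batch_size = 4  # assumed batch size > 1; any value the target can handle

# Placeholder workload: ResNet-18 with a batched input of shape
# (batch_size, 3, 224, 224). With a real model, the same effect comes from
# passing a batched input shape to the relay.frontend importer.
mod, params = testing.resnet.get_workload(num_layers=18, batch_size=batch_size)

# Compile for a generic CPU target here; the linked patch routes the
# equivalent batched graph through the VTA-specific lowering instead.
lib = relay.build(mod, target="llvm", params=params)
```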