Model inference within the browser

I tried the demo of Web Stable Diffusion, and it works great in my browser. However, when I looked into the code, I noticed that its tvmjs.bundle.js is quite different from mine. For example, their code calls a createVirtualMachine(dev) method, but I can't find it in the TVM repo. I am wondering what the right way is to do model inference within the browser using WebGPU after creating a tvmjs instance.

You can find these implementations in the unity branch.
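
As a rough sketch of the flow the Web Stable Diffusion demo follows, assuming the unity-branch tvmjs bundle (dist/tvmjs.bundle.js) and its WASI helper (dist/tvmjs_runtime.wasi.js) are loaded via script tags so `tvmjs` and `EmccWASI` are globals; the wasm path, input shape, and "main" entry name below are illustrative placeholders, not something from this thread:

```typescript
// Sketch only: names marked as placeholders depend on your own build.
declare const tvmjs: any;     // from dist/tvmjs.bundle.js (unity branch)
declare const EmccWASI: any;  // from dist/tvmjs_runtime.wasi.js

async function runModel() {
  // Instantiate the TVM runtime from the wasm module emitted by the web build.
  const wasmSource = await (await fetch("dist/model.wasm")).arrayBuffer();
  const tvm = await tvmjs.instantiate(new Uint8Array(wasmSource), new EmccWASI());

  // Ask the browser for a WebGPU device and hand it to the runtime.
  const gpuDetect = await tvmjs.detectGPUDevice();
  if (gpuDetect === undefined) {
    throw new Error("This browser does not support WebGPU");
  }
  tvm.initWebGPU(gpuDetect.device);
  const dev = tvm.webgpu();

  // createVirtualMachine is the unity-branch API the demo uses: it wraps the
  // compiled Relax module as a virtual machine bound to the WebGPU device.
  tvm.beginScope();
  const vm = tvm.createVirtualMachine(dev);
  const main = vm.getFunction("main"); // entry name depends on your build

  // Allocate an input on the device and run one forward pass
  // (shape/dtype are placeholders for whatever your model expects).
  const input = tvm.empty([1, 3, 224, 224], "float32", dev);
  const output = tvm.detachFromCurrentScope(main(input));
  await dev.sync(); // wait for the GPU queue to finish before reading results
  tvm.endScope();
  return output;
}
```

For a complete working example of this pattern, including loading model parameters, the Web Stable Diffusion repository's runtime code and the web/ directory of the unity branch are good references.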
