And that's where I get an error that target 'wasm' is undefined. So my question is: how do I build TVM for wasm correctly? How do I set up the target properly?
My OS: Windows 10
TVM without LLVM, because apparently it's not trivial at all to build TVM with LLVM: LLVM does not ship LLVMConfig.cmake on Windows for some reason. For the time being I just want to make it work even without LLVM; LLVM I can leave for later. Or can I?
Thank you in advance for your response and thank you for such a beautiful library!
We do need to rely on LLVM, unfortunately. You can, however, try the Windows pre-built package that comes with LLVM support, or build with conda, which provides an LLVM environment.
I finally figured out which LLVM to build, and I managed to create a .dll file from my model successfully. On a side note: I would advise adding a tutorial on how to build LLVM for Windows, because on Windows the pre-built binaries will not work. Actually, I can volunteer and create this tutorial for you, if you don't mind.
On this URL - I went through it, and it looks like an out-of-the-box solution for creating a full WASM library from the model, but this is not what I need. I have a big C++ library that is compiled to WASM, and inside this library I need to run inference on my model, and I want to use TVM for that. What is unclear to me is how to load a TVM model when I am in a WASM environment.
Can you provide some other examples of successful use of TVM in a WASM environment?
For building things through wasm, you likely need to build the libtvm runtime into a .bc file, compile the original model (also into a .bc file), and link them together with the other .bc files in your library. Then you can call into TVM's C++ runtime. The Web Stable Diffusion (WebSD) project can still serve as a good reference, as the built libtvm runtime and compiled model are the same; you just need to link in extra things from your project.
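A minimal sketch of the model-to-bitcode step described above. The target triple mirrors what tvm/web's prepare_test_libs.py uses; whether `Module.save` with a `.bc` name emits LLVM bitcode should be verified against your TVM version, so treat this as an assumption, not the definitive recipe.

```python
# Hedged sketch: compile a Relay model for wasm and save LLVM bitcode.
# Assumes a TVM build whose LLVM has the WebAssembly backend; the target
# string below is what tvm/web's prepare_test_libs.py uses.

WASM_TARGET = "llvm -mtriple=wasm32-unknown-emscripten"

def compile_model_to_bitcode(relay_mod, params, out_path="model.bc"):
    import tvm
    from tvm import relay

    with tvm.transform.PassContext(opt_level=3):
        lib = relay.build(relay_mod, target=WASM_TARGET, params=params)
    # Saving with a .bc extension asks the LLVM codegen to emit bitcode,
    # which can later be linked with the runtime's .bc via llvm-link.
    lib.get_lib().save(out_path)
    return out_path
```

The resulting model.bc is the "original compiled model (also in .bc file)" mentioned in the reply above.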
I have this line in my code, and lib_path points to the compiled model library. In a web environment there is no such thing as a local path like we have on desktop; it is either a URL or a byte stream. Unfortunately, there is no load function that could load the library from a byte stream. So my question is: is "LoadFromFile" capable of working with URLs as well as with local paths?
This is also how we build models for the web stable diffusion model; you can follow the Web Stable Diffusion example for how to build something with the system lib.
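For context, the "system lib" mechanism registers the compiled functions into a global table at static-initialization time, so the C++ runtime can find them without dlopen (which does not exist in WASM). A hedged sketch follows; the exact spelling differs across TVM versions (older releases accept "--system-lib" in the target string, newer ones take a Runtime object), so check your version's API.

```python
# Hedged sketch: build a Relay model as a "system lib" so its functions
# are registered at static-init time and callable from C++ without dlopen.
# The Runtime("cpp", {"system-lib": True}) spelling is an assumption for
# newer TVM versions; older ones use "--system-lib" in the target string.

def build_as_system_lib(relay_mod, params):
    import tvm
    from tvm import relay
    from tvm.relay.backend import Runtime

    target = "llvm -mtriple=wasm32-unknown-emscripten"
    with tvm.transform.PassContext(opt_level=3):
        return relay.build(
            relay_mod,
            target=target,
            params=params,
            runtime=Runtime("cpp", {"system-lib": True}),
        )
```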
I understand it might be because the libtvm and libtvm_runtime I compiled previously are not WASM-compatible. Although I was under the impression that through tvm/web I would be able to compile a libtvm that knows this target, apparently I was mistaken. So my question is: how do I actually compile/install TVM in a way that lets me produce a WASM library from my model?
I am continuing to look into Web Stable Diffusion, but so far I cannot find information on how to actually make this wasm target available.
Yes, I got this. I have a tvm_runtime.bc file that I will "merge" with my model library file, but I need to compile my model with the wasm configuration. I have noticed that I was building TVM not on the "unity" branch, my mistake. Can that be an issue? Thanks!
I installed TVM with LLVM using a Conda environment, went to the "web" folder and ran the "make" command successfully, then went to the "python" folder and successfully installed the TVM Python package.
After that I tried to run prepare_test_libs.py and I got this error:
RuntimeError: Compilation error:
wasm-ld: warning: Linking two modules of different data layouts: '/tmp/tmpbbb6nyj3/lib0.bc' is 'e-m:e-p:32:32-p10:8:8-p20:8:8-i64:64-n32:64-S128-ni:1:10:20' whereas 'ld-temp.o' is 'e-m:e-p:32:32-i64:64-n32:64-S128'
wasm-ld: warning: Linking two modules of different target triples: '/tmp/tmpbbb6nyj3/lib0.bc' is 'wasm32-unknown-unknown-wasm' whereas 'ld-temp.o' is 'wasm32-unknown-emscripten'
The text of the error is self-explanatory, so I changed the target in prepare_test_libs.py, in the prepare_tir_lib function, from
RuntimeError: Compilation error:
wasm-ld: warning: Linking two modules of different data layouts: '/tmp/tmp9xkhf6sz/lib0.bc' is 'e-m:e-p:32:32-p10:8:8-p20:8:8-i64:64-f128:64-n32:64-S128-ni:1:10:20' whereas 'ld-temp.o' is 'e-m:e-p:32:32-i64:64-n32:64-S128'
wasm-ld: /b/s/w/ir/cache/builder/emscripten-releases/llvm-project/llvm/lib/Bitcode/Reader/MetadataLoader.cpp:366: (anonymous namespace)::(anonymous namespace)::PlaceholderQueue::~PlaceholderQueue(): Assertion `empty() && "PlaceholderQueue hasn't been flushed before being destroyed"' failed.
Could you please guide me on how to solve this issue? Thank you very much!
I think I have figured out the issue. emsdk 2.0.15 ships Clang 13.0, and I believe the Clang version reflects the LLVM version in this case. I updated emsdk to 2.0.30; it was an educated guess, just to find an Emscripten release that has Clang 14.0.
After that, python prepare_test_libs.py worked. I think it would be great to reflect that in the docs; I looked through them multiple times and didn't find any advice to check that the emsdk LLVM and TVM LLVM versions are compatible. If I missed it, sorry; if not, please add it.
Although I ran prepare_test_libs.py successfully, I still have a question. It produces a .wasm file, but I need a static library that is WASM-compatible. For example, when I compile my code using Emscripten it produces a .wasm and a .a file, and this .a file I can later use in CMakeLists.txt, use its internals in C++ code, and build another WASM package that I can use in the web browser.
So my question is: how do I generate a WASM library for the TVM runtime rather than a .wasm file? Do I just need to change the file extension from ".wasm" to ".a"? Sorry if my questions are naive; I am new to WASM and TVM. Thank you for your response!
You can try mod.export_library("data.tar"), which should give you a tarball that contains the necessary .o files. You still need to link the libtvm runtime.
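A short sketch of the suggestion above: when the output name ends in ".tar", export_library packs the raw object files instead of invoking a host linker, so you can extract and link them with your own toolchain. The tarfile inspection is illustrative, not part of TVM's API.

```python
# Hedged sketch: export the compiled module's object files as a tarball
# (no host linker involved), then list what will need linking later.
import tarfile

def export_objects(lib, path="data.tar"):
    lib.export_library(path)          # a .tar name packs .o files
    with tarfile.open(path) as tar:   # inspect the archive contents
        return tar.getnames()
```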
Hello @tqchen! What I can't seem to find is an example of how to build my model into a .bc file. I have managed to build a .wasm library from my model and even a .a static library (thanks to this example - GitHub - kazum/tvm-wasm: Build pure WebAssembly from pre-trained DL model). I also didn't find an example of how to build the library into a .bc file in the Web Stable Diffusion repo.
Could you please advise how to do that?
So I understand that I need library.bc (still to build) and wasm_runtime.bc (already done); then I need to link those two using llvm-link, after which I can compile the result into a .a static library, link it into my project, and then init the model in C++ code like so:
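The link-then-archive flow described in that last step might be driven like the following sketch. The tool names (llvm-link, emcc, emar) and flags are assumptions to verify against your emsdk install; crucially, use the tools bundled with emsdk so the LLVM versions match (the data-layout and target-triple warnings earlier in this thread came from exactly such a version mismatch).

```python
# Hedged sketch: merge model and runtime bitcode, compile to a wasm
# object, and archive it into a .a that CMake can consume. Tool names
# and flags are assumptions; use the llvm-link/emcc/emar from emsdk.
import subprocess

def link_to_static_lib(model_bc="library.bc",
                       runtime_bc="wasm_runtime.bc",
                       out_a="libmodel.a"):
    merged = "merged.bc"
    obj = "merged.o"
    # 1) merge the model and runtime bitcode into one LLVM module
    subprocess.run(["llvm-link", model_bc, runtime_bc, "-o", merged],
                   check=True)
    # 2) compile the merged bitcode to a wasm object file
    subprocess.run(["emcc", "-O3", "-c", merged, "-o", obj], check=True)
    # 3) archive it so it can be linked like any other static library
    subprocess.run(["emar", "rcs", out_a, obj], check=True)
    return out_a
```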