Error using tvmc even after successfully installing tvm

I have recently started learning about TVM. Following the official mlc-llm documentation, I successfully installed TVM with "python3 -m pip install --pre -U -f https://mlc.ai/wheels mlc-ai-nightly" (in a conda environment on Windows) and validated the installation, but when I run "tvmc --h" I get the attached error: issue
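(For context, a minimal way to check that the wheel is picked up is to import it from Python, for example:

    python -c "import tvm; print(tvm.__file__)"

which should print the path of the installed package.)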

I have also tried a Windows virtual environment and referred to tlcpack.ai, but the error persists.

Is there an issue with the installation, or a command I am missing? I don’t know what’s causing this, but I’d appreciate some help from the community.

Hi @appnet, I suspect this is a Windows-specific issue, as I cannot reproduce it on my Linux machine. The error originates from here and seems to happen because no candidate path could be found for the JSON configs.

This feature doesn’t really seem to be in use, so as a workaround for now, could you try setting TVM_CONFIGS_JSON_DIR="." in your environment and running tvmc -h again?
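On Windows, a minimal sketch of that workaround in a cmd shell (assuming the conda environment with TVM is already activated) would be:

    REM Point TVM at the current directory for its JSON configs
    set TVM_CONFIGS_JSON_DIR=.
    REM Re-run the help command to check whether the error goes away
    tvmc -h

In PowerShell the equivalent assignment would be $env:TVM_CONFIGS_JSON_DIR = ".".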

Thank you @lhutton1. Setting TVM_CONFIGS_JSON_DIR="." really helped, but I also had to manually create an empty default.json file in the working directory. That missing config file was the main cause of the error, and I believe many Windows users installing TVM run into the same issue for the same reason.

After completing these steps, tvmc -h executed successfully.
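For other Windows users hitting this, a sketch of the full workaround in a cmd shell, assuming the config directory is the current working directory:

    REM Create an empty default.json in the working directory
    type nul > default.json
    REM Tell TVM to look for its JSON configs in the current directory
    set TVM_CONFIGS_JSON_DIR=.
    REM tvmc should now print its help text
    tvmc -h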

No worries. I don’t have access to a Windows machine myself, but it would be great if you could create an issue in github.com/apache/tvm to track this bug.