Hi,
A basic question: does TVM support any classical ML models, such as Random Forest? Is there a way to convert these models to a DNN and then use TVM?
Thank you
A great question! We do have support for classical ML models via the Hummingbird project. See their notebook hummingbird/tvm_and_pyt_graph.ipynb at main · microsoft/hummingbird · GitHub for an example.
They have a benchmark script (hummingbird/benchmarks/trees at main · microsoft/hummingbird · GitHub) you can use to compare the performance of sklearn, XGBoost, and LightGBM against the TVM-converted models. For example, here is how I compare the GPU performance of an XGBoost model against TVM:
hummingbird/benchmarks/trees$ python run.py -operator xgb -gpu -backend hb-tvm -niters 100 -batch_benchmark -batch_size 1000 -max_depth 8 -ntrees 500 -dataset fraud,epsilon,year,covtype,higgs
Let me know if you need help running this script.
I’ve also uploaded a simple usage demonstration here: hb_tvm_example/blog_examples.py at master · masahi/hb_tvm_example · GitHub
I’m working on a blog post on Hummingbird + TVM, which should be out soon.
@hmasoum The blog post on Hummingbird + TVM was just published: Compiling classical ML for performance gains (up to 16x) and hardware portability | by Jason Knight | OctoML | Feb, 2021 | Medium