How to deal with prim::DictConstruct

I am trying to compile a TorchScript module with TVM, but it contains an op named "prim::DictConstruct". How can I convert this kind of op to Relay IR? Any suggestion would be helpful.

Relay doesn't support dictionaries, so we cannot convert this op. If you are using Hugging Face Transformers, there is a return_dict option you can use; see Issue: Converting model from pytorch to relay model - #5 by popojames
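For instance, a minimal sketch assuming the Hugging Face transformers package is installed (the small config values are just for illustration); setting return_dict=False makes forward() return a plain tuple instead of a dict-like ModelOutput, so no prim::DictConstruct appears in the trace:

```python
import torch
from transformers import BertConfig, BertModel

# return_dict=False makes the model return a plain tuple instead of
# a dict-like ModelOutput, avoiding prim::DictConstruct when tracing.
config = BertConfig(num_hidden_layers=2, return_dict=False)
model = BertModel(config).eval()

input_ids = torch.randint(0, config.vocab_size, (1, 16))
with torch.no_grad():
    outputs = model(input_ids)  # a tuple, not a dict
```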

Otherwise, if the dictionary is the output of your model, you can manually turn it into a tuple, like this: https://github.com/masahi/torchscript-to-tvm/blob/master/transformers/bert_clean.py#L25-L32

Thanks for the reply!

Hello @masahi

I asked my questions and found a solution by myself in:

I am facing the prim::DictConstruct issue again, because I am customizing a BERT model via BertConfig, and there is no return_dict option I can use in this case. Can you explain what exactly you mean by "you can manually turn it into a tuple"?

Here is my setting:

import numpy as np
import torch
from tvm import relay
from transformers import BertConfig, BertModel

np_input = torch.tensor(np.random.uniform(size=[1, 128], low=0, high=128).astype("int32"))

# depth_multipliers is defined elsewhere in my script
BERTconfig = BertConfig(hidden_size=768, num_hidden_layers=round(12 * depth_multipliers), num_attention_heads=12, intermediate_size=3072)

model = BertModel(BERTconfig)
model = model.eval()
traced_script_module = torch.jit.trace(model, np_input, strict=False).eval()
shape_dict = [("input_ids", (1, 128))]
mod, params = relay.frontend.from_pytorch(traced_script_module, input_infos=shape_dict)

Thanks!

You can try something like this:
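A minimal sketch of such a wrapper, assuming the wrapped model's forward() returns a dict of tensors (the class name TraceWrapper and the reliance on dict insertion order are illustrative):

```python
import torch

class TraceWrapper(torch.nn.Module):
    """Wrap a model whose forward() returns a dict so that tracing
    sees a plain tuple of tensors, which Relay can represent."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, inp):
        out = self.model(inp)
        # Flatten the dict output into a tuple; the element order is
        # the dict's insertion order.
        return tuple(out.values())
```

You would then trace TraceWrapper(model) instead of model, and pass the traced module to relay.frontend.from_pytorch as before.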


Hello @masahi

I have tried using your TraceWrapper, and it works on the BertForSequenceClassification model.

Thanks 🙂