Argument mismatch (expected 2 but got 3) when converting PyTorch aten::dropout to a Relay op

The following error pops up when converting aten::dropout in our transformer-like model:

```
File "/icode-xtcl/baidu/xpu/xmir/python/tvm/relay/frontend/pytorch.py", line 1367, in dropout
    return _op.nn.dropout(data, rate)
File "/icode-xtcl/baidu/xpu/xmir/python/tvm/relay/op/nn/nn.py", line 1701, in dropout
    return expr.TupleWrapper(dropout_raw(data, rate), 2)[0]
File "/icode-xtcl/baidu/xpu/xmir/python/tvm/relay/op/nn/nn.py", line 1724, in dropout_raw
    return _make.dropout(data, rate, "upscale_in_train")
File "/icode-xtcl/baidu/xpu/xmir/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()

Check failed: nargs == args.size() (2 vs. 3) : Expect 2 arguments but get 3
```
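For context, the JIT node aten::dropout carries three inputs (the data tensor, the probability p, and a boolean training flag), while Relay's nn.dropout takes only two arguments (data, rate). A minimal sketch of how a frontend converter typically bridges that gap by discarding the training flag (a hypothetical standalone helper for illustration, not the actual TVM source):

```python
def convert_dropout(inputs):
    """Map the three JIT inputs of aten::dropout to a two-argument call.

    inputs: [data, p, training_flag] as recorded in the traced graph.
    The training flag carries no information at inference time, so it is
    intentionally ignored; only (data, rate) are forwarded.
    """
    data = inputs[0]           # the input tensor / Relay expression
    rate = float(inputs[1])    # dropout probability p
    # inputs[2] is the boolean training flag -- dropped on purpose,
    # since the Relay op only accepts (data, rate).
    return ("nn.dropout", data, rate)


print(convert_dropout(["x", 0.5, False]))  # -> ('nn.dropout', 'x', 0.5)
```

The error above arises because the internal branch passes an extra third argument ("upscale_in_train") to _make.dropout, whose registered signature only accepts two.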

I am using torch 1.8, and the code in the model looks like:

```python
def __init__(self, config, num_labels=2):
    super(BertForSequenceClassification, self).__init__(config)
    self.num_labels = num_labels
    self.bert = BertModel(config)
    self.dropout = torch.nn.Dropout(config.hidden_dropout_prob).eval()
```

It's aligned with test_forward_dropout in test_forward.py:

```python
@tvm.testing.uses_gpu
def test_forward_dropout():
    torch.set_grad_enabled(False)
    input_shape = [1, 3, 10, 10]
    input_data = torch.rand(input_shape).float()
    verify_model(torch.nn.Dropout(p=0.5).eval(), input_data=input_data[0, 0])
    verify_model(torch.nn.Dropout2d(p=0.5).eval(), input_data=input_data[0])
    verify_model(torch.nn.Dropout3d(p=0.5).eval(), input_data=input_data)
    verify_model(torch.nn.AlphaDropout(p=0.5).eval(), input_data=input_data[0, 0])
```

In my dumped PyTorch JIT trace graph, the call indeed has three arguments:

```
%mlp_output.22 : Tensor = aten::dropout(%input.201, %31, %11), scope: __module.transformer/__module.transformer.layers.21/__module.transformer.layers.21.mlp/__module.transformer.layers.21.mlp.dropout # /venv/lib/python3.6/site-packages/torch/nn/functional.py:1076:0
```
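The three-input shape of the node is easy to reproduce outside the full model. A minimal standalone repro (my own example, not the author's code) that traces a bare Dropout module and inspects the recorded graph:

```python
import torch

# Trace a single Dropout module in eval mode and dump its graph. The trace
# records an aten::dropout node with three inputs: the data tensor, the
# probability p, and the boolean training flag.
model = torch.nn.Dropout(p=0.5).eval()
example = torch.rand(1, 16)
traced = torch.jit.trace(model, example)

print(traced.graph)
for node in traced.graph.nodes():
    if node.kind() == "aten::dropout":
        print(node.kind(), "has", len(list(node.inputs())), "inputs")
```

This matches the dump above: the frontend receives three JIT inputs, even though Relay's nn.dropout only needs the first two.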

We root-caused this to an issue in our internal TVM branch. Sorry for the inconvenience.