AttributeError: module 'tvm.relay.transform._ffi_api' has no attribute 'Conv2dToSparse2'

In Sparse Conv2d Implementation for 3x3 kernels, there is a sample usage demonstrating the implementation.

I tried to use the conversion from the sample usage to convert conv2d to sparse_conv2d. The conversion I use is shown below:

And the error message says:

AttributeError: module 'tvm.relay.transform._ffi_api' has no attribute 'Conv2dToSparse2'

I have no idea why this error occurs. Can anyone help me fix it?
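
For reference, the conversion I am trying is roughly the following (a minimal sketch; the layout, kernel_size, blocksize, and sparsity_threshold values are placeholders, and mod is assumed to be the Relay module imported from my model):

from tvm import relay

# Rewrite eligible nn.conv2d ops to nn.sparse_conv2d
# (arguments: layout, kernel_size, blocksize, sparsity_threshold -- placeholder values)
sparse_pass = relay.transform.Conv2dToSparse2("NHWC", 3, (4, 1), 0.6)
mod = sparse_pass(mod)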

Did you try rebuilding?

@vinx13 Excuse me, what do you mean by rebuilding?

Pulling the latest code and then running cd build; make.

@vinx13 I had only pulled the latest code. After rebuilding, it works. Thanks.

But I have another question: are there any restrictions on the function relay.transform._ffi_api.Conv2dToSparse2 when doing the conversion?

I converted a simple PyTorch model that contains just a single conv2d with a 3x3 kernel to Relay and did the conversion shown below:

The conv2d was not converted to sparse_conv2d. My model looks like this:

import torch.nn as nn

class LeNet(nn.Module):
    def __init__(self):
        super(LeNet, self).__init__()
        # single 3x3 conv layer, no bias
        self.conv1 = nn.Conv2d(1, 1, 3, padding=1, bias=False)

    def forward(self, x):
        x = self.conv1(x)
        return x

Is anything wrong or missing in my work?

Did you pass in the weight params when you created the Relay model? The IR should look like nn.conv2d(%data, meta[relay.Constant][0]), where the weight is a constant.
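
You can check this by printing the imported module (assuming mod is the IRModule returned by from_pytorch):

print(mod["main"])
# If the conv weight appears as a free variable (e.g. something like %conv1.weight)
# rather than meta[relay.Constant], the pass has no constant weight to sparsify.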

@vinx13 I converted my model to a Relay graph following the document: Compile PyTorch Models.

And here is my code:

import torch
import tvm
from tvm import relay

device = torch.device("cpu")  # device was defined elsewhere; using CPU here for completeness

model = LeNet().to(device=device).eval()
example = torch.randn(1, 1, 28, 28)
traced_module = torch.jit.trace(model, example).eval()

# build the input shape list from the traced graph and import into Relay
shape_list = [(i.debugName().split('.')[0], i.type().sizes()) for i in list(traced_module.graph.inputs())[1:]]
mod, params = tvm.relay.frontend.from_pytorch(traced_module, shape_list)

Did I miss a step for passing the weight params?

You probably need to load the params when you create the model in PyTorch.

@vinx13 Sorry… I don't get the point. :cry: What do you mean by loading params when creating the model in PyTorch? Doesn't it already have its own weights when I create the model?

I can print them out with:

for name, param in model.named_parameters():
    print(f"Layer: {name} | Size: {param.size()} | Values : {param[:2]} \n")

I see, it actually has its own weights. You will need to call mod["main"] = relay.build_module.bind_params_by_name(mod["main"], params) after from_pytorch.
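
In context, that would look roughly like this (a minimal sketch, assuming mod and params are the values returned by from_pytorch above):

from tvm import relay

# Bind the exported PyTorch weights into the Relay function as constants,
# so the conv2d weight shows up as meta[relay.Constant] in the IR.
mod["main"] = relay.build_module.bind_params_by_name(mod["main"], params)
print(mod["main"])  # the weight should now be embedded as a constant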

@vinx13 It works! Thanks a lot!