Swin Transformer code: an unofficial implementation, but easier to follow.
Code to convert the trained .pth to ONNX:
import torch
from swin_transformer_pytorch import swin_t

pth_filename = './demo.pth'    # trained weights
onnx_filename = './demo.onnx'

net = swin_t()
weights = torch.load(pth_filename)
# net.load_state_dict(weights)
# this checkpoint keeps the state dict under the 'embedding' key and its keys carry a
# 'module.' prefix (saved from nn.DataParallel), so strip the prefix before loading
net.load_state_dict({k.replace('module.', ''): v for k, v in weights['embedding'].items()})
net.eval()

dummy_input = torch.randn(1, 3, 224, 224, device='cpu')
torch.onnx.export(net, dummy_input, onnx_filename,
                  input_names=['input'], output_names=['output'],
                  export_params=True, verbose=False, opset_version=12,
                  dynamic_axes={'input': {0: 'batch_size'},
                                'output': {0: 'batch_size'}})
print('save onnx succ')
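Once the export succeeds, it is worth checking that the ONNX graph gives the same numbers as the PyTorch model. A minimal sketch using onnxruntime (assuming it is installed; the names reuse the variables from the script above):
import numpy as np
import onnxruntime as ort

# run the same dummy input through both models and compare the outputs
sess = ort.InferenceSession(onnx_filename, providers=['CPUExecutionProvider'])
onnx_out = sess.run(None, {'input': dummy_input.numpy()})[0]
with torch.no_grad():
    torch_out = net(dummy_input).numpy()
print('max abs diff:', np.abs(onnx_out - torch_out).max())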
Errors encountered:
1. “Exporting the operator roll to ONNX opset version 12 is not supported.”
Replace torch.roll with torch.cat:
class CyclicShift(nn.Module):
    def __init__(self, displacement):
        super().__init__()
        self.displacement = displacement

    def forward(self, x):
        # return torch.roll(x, shifts=(self.displacement, self.displacement), dims=(1, 2))
        # torch.roll(x, s, dim) == torch.cat((x[-s:], x[:-s]), dim), so the cyclic shift can be
        # written as two concatenations, which opset 12 can export
        x = torch.cat((x[:, -self.displacement:, :, :], x[:, :-self.displacement, :, :]), dim=1)
        x = torch.cat((x[:, :, -self.displacement:, :], x[:, :, :-self.displacement, :]), dim=2)
        return x
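Outside the export path torch.roll is still available, so a quick sanity check can confirm that the cat version matches the original roll. This is only an illustrative snippet assuming the class above; the input shape is made up:
import torch

x = torch.randn(1, 14, 14, 96)           # toy (batch, height, width, channels)
for d in (-3, 3):                        # both shift directions used by the shifted blocks
    shifted = CyclicShift(d)(x)
    rolled = torch.roll(x, shifts=(d, d), dims=(1, 2))
    print(d, torch.equal(shifted, rolled))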
2. “RuntimeError: Expected node type 'onnx::Constant', got 'onnx::Cast'.”
Replace the in-place “+=” on tensor slices with torch.cat:
class WindowAttention(nn.Module):
    ...
    def forward(self, x):
        ...
        # if self.shifted:
        #     dots[:, :, -nw_w:] += self.upper_lower_mask
        #     dots[:, :, nw_w - 1::nw_w] += self.left_right_mask
        if self.shifted:
            # unflatten the window axis into its (n_h, n_w) grid
            dots = rearrange(dots, 'b c (n_h n_w) h w -> b c n_h n_w h w', n_h=nw_h, n_w=nw_w)
            # add the mask to the last row of windows and rebuild the tensor with cat
            dots = torch.cat((dots[:, :, :-1], dots[:, :, -1:] + self.upper_lower_mask), dim=2)
            # swap n_h and n_w so the last column of windows sits on dim 2, then mask it the same way
            dots = dots.permute(0, 1, 3, 2, 4, 5)
            dots = torch.cat((dots[:, :, :-1], dots[:, :, -1:] + self.left_right_mask), dim=2)
            # swap back and flatten the window grid again
            dots = dots.permute(0, 1, 3, 2, 4, 5)
            dots = rearrange(dots, 'b c n_h n_w h w -> b c (n_h n_w) h w')
        ...
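The workaround in isolation: instead of adding the mask into a slice in place, slice the tensor, add the mask to the last block, and concatenate the pieces back together. A toy check with made-up shapes, only to illustrate the pattern:
import torch

dots = torch.randn(2, 3, 4, 5, 5)    # toy (batch, heads, windows, w*w, w*w)
mask = torch.randn(5, 5)

ref = dots.clone()
ref[:, :, -1:] += mask               # the in-place form that breaks the export

out = torch.cat((dots[:, :, :-1], dots[:, :, -1:] + mask), dim=2)
print(torch.allclose(ref, out))      # True: same values, export-friendly graph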
Reference:
Pytorch转ONNX-实战篇2(实战踩坑总结)