Final ranking: 85/1159
Missed the final round. Too weak, I'm embarrassed to say I study denoising. The code will be open-sourced, though I doubt many people will look at it.
Models tried
Models that worked well
- CycleISP: modified the network (dropped the Variance input); validation 53.87 (epoch 160), test 51.46; with a warmup learning-rate schedule the best validation score reached 55.71
- MPRNet: used a warmup learning rate and tried enlarging the network (validation 57.01 (epoch 473), test 55.41)
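Both of the stronger runs used a warmup learning-rate schedule. A minimal sketch of one common variant (linear warmup followed by cosine decay; the step counts and rates below are hypothetical placeholders, not the values used in the competition):

```python
import math

def warmup_cosine_lr(step, base_lr=2e-4, warmup_steps=500,
                     total_steps=10000, min_lr=1e-6):
    """Linear warmup to base_lr, then cosine decay down to min_lr.

    All hyperparameters here are illustrative defaults.
    """
    if step < warmup_steps:
        # ramp linearly from ~0 up to base_lr over the warmup phase
        return base_lr * (step + 1) / warmup_steps
    # cosine decay over the remaining steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

In a training loop this would be evaluated once per step (or per epoch) and written into the optimizer's learning rate before each update.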
Network modifications
- Patch-based training plus data augmentation: tried training MPRNet on cropped patches; training took a long time and the results were poor
- Data augmentation alone: poor results
- Switched to L1 loss: no improvement
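The patch-based training above can be sketched as a random crop plus flip/rotate augmentation. This is a hypothetical numpy helper, not the actual competition pipeline; note that for Bayer raw data the crop offsets and flips would additionally need to preserve the CFA pattern:

```python
import numpy as np

def random_patch_augment(img, patch=128, rng=None):
    """Crop a random patch and apply a random flip/rotation.

    img: H x W (or H x W x C) array. Illustrative sketch only;
    Bayer-pattern alignment is deliberately ignored here.
    """
    if rng is None:
        rng = np.random.default_rng()
    h, w = img.shape[:2]
    y = rng.integers(0, h - patch + 1)
    x = rng.integers(0, w - patch + 1)
    out = img[y:y + patch, x:x + patch]
    if rng.random() < 0.5:
        out = np.flip(out, axis=1)                    # horizontal flip
    out = np.rot90(out, k=rng.integers(0, 4), axes=(0, 1))  # 0/90/180/270°
    return np.ascontiguousarray(out)
```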
Attention Mechanisms in Computer Vision: A Survey https://arxiv.org/abs/2111.07624
Channel attention:
- Squeeze-and-Excitation Networks
- Image Super-Resolution Using Very Deep Residual Channel Attention Networks (the channel attention MPRNet uses)
- ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
- Context Encoding for Semantic Segmentation
- Global Second-Order Pooling Convolutional Networks
- SRM: A Style-Based Recalibration Module for Convolutional Neural Networks
- Gated Channel Transformation for Visual Recognition
- FcaNet: Frequency Channel Attention Networks
- Spatio-Temporal Channel Correlation Networks for Action Classification
- You Look Twice: GaterNet for Dynamic Filter Selection in CNNs
- SpSequenceNet: Semantic Segmentation Network on 4D Point Clouds

Spatial attention: about 70 papers, too many to list
Channel-spatial attention: 19 papers, still a lot; the important ones:
- CBAM: Convolutional Block Attention Module — training was too slow, stopped
- Dual Attention Network for Scene Segmentation — ran out of GPU memory
- SCA-CNN: Spatial and Channel-wise Attention in Convolutional Networks for Image Captioning
- BAM: Bottleneck Attention Module — many errors
- SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks
- Coordinate Attention for Efficient Mobile Network Design — trains, but poor results
- SA-Net: Shuffle Attention for Deep Convolutional Neural Networks — trains, but poor results
- The DAU block from MIRNet — trains, but poor results
Future improvements
- Loss functions: L2+L1+FFT (in progress), MS-SSIM, the PSNR loss from NAFNet
- Deblurring networks: DeepRFT (errors out with signal number 32), MIMO-UNet
- Add skip connections
- NAFNet with different hyperparameters ([4,8] 8 [4,8], random crops and rotations)
- UNet + mixup
- Using GANs
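The L2+L1+FFT combination being tried can be sketched as below. This is a numpy illustration with hypothetical weights, not the tuned setup; a real training loop would use the framework's differentiable FFT instead:

```python
import numpy as np

def combined_loss(pred, target, w_l2=1.0, w_l1=1.0, w_fft=0.1):
    """L2 + L1 + frequency-domain loss (weights are illustrative).

    The FFT term penalises differences between the 2-D spectra of the
    prediction and the target, which pushes the network to recover
    high-frequency detail that pixel-wise losses tend to smooth away.
    """
    l2 = np.mean((pred - target) ** 2)
    l1 = np.mean(np.abs(pred - target))
    fft_diff = np.fft.fft2(pred) - np.fft.fft2(target)
    l_fft = np.mean(np.abs(fft_diff))
    return w_l2 * l2 + w_l1 * l1 + w_fft * l_fft
```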
Open-source code from top competitors
17th place: https://github.com/Fivethousand5k/ZTE_RawDenoise