
Dual learning generative adversarial network for dynamic scene deblurring

Cite this article: JI Ye, DAI Ya-ping, Kaoru HIROTA, SHAO Shuai. Dual learning generative adversarial network for dynamic scene deblurring[J]. Control and Decision, 2024, 39(4): 1305-1314.
Authors: JI Ye  DAI Ya-ping  Kaoru HIROTA  SHAO Shuai
Affiliation: School of Automation, Beijing Institute of Technology, Beijing 100081, China; State Key Laboratory of Intelligent Control and Decision of Complex System, Beijing Institute of Technology, Beijing 100081, China
Fund program: Systematic Major Project of China State Railway Group (P2021T002); Beijing Natural Science Foundation (L191020).

Abstract: To address image deblurring in dynamic scenes, a dual learning generative adversarial network (DLGAN) is proposed. Under the dual-learning training paradigm, the network performs deblurring with unpaired blurry and sharp images, so the training set no longer needs to consist of blurry images paired with their corresponding sharp images. The DLGAN exploits the duality between the deblurring task and the reblurring task to establish a feedback signal, and uses this signal to constrain the two tasks to learn from and update each other from two different directions until convergence. Experimental results show that the DLGAN outperforms nine image deblurring methods trained on paired datasets in terms of structural similarity and visual evaluation.

Keywords: dynamic scene deblurring; dual learning; generative adversarial network; attention guidance; feature map loss function
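The dual-learning feedback described in the abstract can be illustrated with a toy sketch. This is NOT the DLGAN implementation: the paper's generators, discriminators, and attention guidance are replaced here by hypothetical scalar affine maps `g` (deblur) and `f` (reblur), and the adversarial terms by a crude statistics-matching surrogate, so that the cycle-style feedback signal built from the two tasks' duality is easy to see on unpaired 1-D samples.

```python
import numpy as np

rng = np.random.default_rng(0)
sharp = rng.normal(0.0, 1.0, 256)         # unpaired "sharp" samples
blurry = 0.5 * rng.normal(0.0, 1.0, 256)  # unpaired "blurry" samples (independent draw)

def apply_map(p, x):
    """Affine stand-in for a generator: p[0] * x + p[1]."""
    return p[0] * x + p[1]

def total_loss(params):
    g, f = params[:2], params[2:]  # g: deblur direction, f: reblur direction
    # Duality feedback: reblur(deblur(blurry)) should recover blurry, and
    # deblur(reblur(sharp)) should recover sharp -- no paired data needed.
    cycle = np.mean((apply_map(f, apply_map(g, blurry)) - blurry) ** 2) \
          + np.mean((apply_map(g, apply_map(f, sharp)) - sharp) ** 2)
    # Crude surrogate for the adversarial terms: each output's statistics
    # should match the target domain's statistics.
    dist = (np.std(apply_map(g, blurry)) - np.std(sharp)) ** 2 \
         + (np.std(apply_map(f, sharp)) - np.std(blurry)) ** 2
    return cycle + dist

params = np.array([1.0, 0.0, 1.0, 0.0])  # both maps start as the identity
lr, eps = 0.05, 1e-4
loss_start = total_loss(params)
for _ in range(300):  # finite-difference gradient descent on both tasks jointly
    grad = np.array([(total_loss(params + eps * np.eye(4)[i])
                      - total_loss(params - eps * np.eye(4)[i])) / (2 * eps)
                     for i in range(4)])
    params -= lr * grad
loss_end = total_loss(params)
print(f"feedback loss: {loss_start:.3f} -> {loss_end:.4f}")
```

The two maps update each other from opposite directions, as in the paper's scheme: the deblur map is pushed toward the sharp domain while the reblur map must still invert it, so both improve without any blurry/sharp pair ever being shown together.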