Fabric defect detection method based on deep learning
Citation: LI Yu, LIU Kongling, HUANG Nanxi. Fabric defect detection method based on deep learning[J]. Wool Textile Journal, 2021, 49(4): 98-103.
Authors: LI Yu  LIU Kongling  HUANG Nanxi
Affiliation: Hubei Key Laboratory of Digital Textile Equipment, Wuhan Textile University, Wuhan, Hubei 430200, China; Hubei Functional Fiber Processing and Testing Engineering Technology Research Center, Wuhan Textile University, Wuhan, Hubei 430200, China; College of Electronics and Electrical Engineering, Wuhan Textile University, Wuhan, Hubei 430200, China
Abstract: To detect fabric defects quickly and accurately, a fabric defect detection method based on the deep learning object detection framework YOLOv4 is proposed. Five kinds of common defect images (hanging warp, double picks, knots, holes, and stains) are first preprocessed, and the images are then fed into the YOLOv4 algorithm for classification. YOLOv4 uses CSPDarknet53 as the backbone network to extract defect features; an SPP module together with an FPN+PAN structure serves as the neck layer for deep defect feature extraction; and the prediction layer detects defects of different sizes at three scales. The results show that, verified on a test set of 600 samples, the method achieves a detection accuracy of 95% on defect images with a detection time of 33 ms per image. Compared with SSD, Faster R-CNN, and YOLOv3, the YOLOv4 method is more accurate and faster.

Keywords: deep learning  fabric defect detection  YOLOv4  CSPDarknet53  SPP

Fabric defect detection method based on deep learning
Affiliation:(Hubei Key Laboratory of Digital Textile Equipment,Wuhan Textile University,Wuhan,Hubei 430200,China;Hubei Functional Fiber Processing and Testing Engineering Technology Research Center,Wuhan Textile University,Wuhan,Hubei 430200,China;College of Electronics and Electrical Engineering,Wuhan Textile University,Wuhan,Hubei 430200,China)
Abstract: In order to detect fabric defects quickly and accurately, a fabric defect detection method based on the deep learning object detection framework YOLOv4 was proposed. Firstly, five kinds of common defect images (hanging warp, double picks, knot, hole and stain) were preprocessed, and then the images were input into the YOLOv4 algorithm for classification. YOLOv4 used CSPDarknet53 as the backbone network to extract defect features; the SPP module and an FPN+PAN structure were used as the neck layer to extract deep defect features. In the prediction layer, three prediction scales were used to detect defects of different sizes. The results show that, with the verification of 600 test samples, the detection accuracy of this method reaches 95% and the detection time for a single defect image is 33 ms. Compared with SSD, Faster R-CNN and YOLOv3, the YOLOv4 method achieves higher accuracy and faster speed.
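The SPP neck mentioned in the abstract pools the same feature map with several kernel sizes at stride 1 (YOLOv4 uses 5×5, 9×9 and 13×13 windows with zero padding) and concatenates the results with the original map, enlarging the receptive field without changing spatial resolution. A minimal pure-Python sketch of that idea on a single channel (function names and the toy input are illustrative, not from the paper):

```python
def max_pool_same(fm, k):
    """Stride-1 max pooling with zero padding, so output size equals input size."""
    h, w = len(fm), len(fm[0])
    p = k // 2  # half-window padding
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            best = 0.0  # zero padding outside the map
            for di in range(-p, p + 1):
                for dj in range(-p, p + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w and fm[ii][jj] > best:
                        best = fm[ii][jj]
            out[i][j] = best
    return out

def spp(fm, kernels=(5, 9, 13)):
    """SPP block: concatenate the identity branch with the max-pooled branches."""
    return [fm] + [max_pool_same(fm, k) for k in kernels]
```

In the real network each branch is applied channel-wise to the backbone output and the four branches are concatenated along the channel axis, quadrupling the channel count before a 1×1 convolution fuses them.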
Keywords: deep learning  fabric defect detection  YOLOv4  CSPDarknet53  SPP
This article has been indexed by CNKI, VIP (Weipu), Wanfang Data and other databases.