Research on the improved image classification method of MobileNet
Cite this article: GAO Shuping, ZHAO Qingyuan, QI Xiaogang, CHENG Mengfei. Research on the improved image classification method of MobileNet[J]. CAAI Transactions on Intelligent Systems, 2021, 16(1): 11-20.
Authors: GAO Shuping, ZHAO Qingyuan, QI Xiaogang, CHENG Mengfei
Affiliation: School of Mathematics and Statistics, Xidian University, Xi'an 710126, Shaanxi, China
Abstract: To address the limited feature extraction ability of the network structure and the low classification accuracy on datasets containing complex image features, this paper proposes an improvement strategy for the MobileNet neural network (L-MobileNet). The original standard convolution is replaced with depthwise separable convolution, the feature maps produced by the depthwise convolution layer are negated, and both are passed to the next layer through a depthwise convolution fusion layer. The Leaky ReLU activation function replaces the original ReLU to retain more of the positive and negative feature information in the image, and a residual-like structure is added to avoid vanishing gradients. Compared with six other methods, the experimental results show that L-MobileNet achieves the best average accuracy and maximum accuracy on the Cifar-10, Cifar-100 (coarse), Cifar-100 (fine), and Dogs vs Cats datasets.

Keywords: convolutional neural network  image classification  feature extraction  MobileNet  depthwise separable convolution  activation function  Leaky ReLU  residual structure

Research on the improved image classification method of MobileNet
GAO Shuping, ZHAO Qingyuan, QI Xiaogang, CHENG Mengfei. Research on the improved image classification method of MobileNet[J]. CAAI Transactions on Intelligent Systems, 2021, 16(1): 11-20.
Authors: GAO Shuping, ZHAO Qingyuan, QI Xiaogang, CHENG Mengfei
Affiliation:School of Mathematics and Statistics, Xidian University, Xi’an 710126, China
Abstract: This paper proposes an improvement strategy for the MobileNet neural network (L-MobileNet) to address the insufficient feature extraction ability of the network structure and the low classification accuracy on datasets containing complex image features. First, the original standard convolution is replaced with depthwise separable convolution, and the feature maps obtained from the depthwise convolution layer are negated and passed to the next layer through a depthwise convolution fusion layer. Second, the Leaky ReLU activation function replaces the original ReLU to retain more of the positive and negative feature information in the image, and a residual-like structure is added to avoid vanishing gradients. Finally, the experimental results show that, compared with six other methods, L-MobileNet achieves the best average and maximum accuracy on the Cifar-10, Cifar-100 (coarse), Cifar-100 (fine), and Dogs vs Cats datasets.
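The full paper is not reproduced here, so the following is only a minimal PyTorch sketch of the kind of block the abstract describes: a depthwise convolution whose feature maps are negated and fused by a pointwise layer, followed by Leaky ReLU and a residual-like shortcut. The class name LMobileNetBlock and all layer shapes are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the modified depthwise separable block outlined in the
# abstract; names and hyperparameters are assumptions, not taken from the paper.
import torch
import torch.nn as nn

class LMobileNetBlock(nn.Module):
    """Depthwise separable block with feature-map negation, a fusion layer,
    Leaky ReLU activation, and a residual-like shortcut."""

    def __init__(self, in_ch, out_ch, stride=1, negative_slope=0.01):
        super().__init__()
        # Depthwise convolution: one 3x3 filter per input channel.
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.bn_dw = nn.BatchNorm2d(in_ch)
        # Pointwise "fusion" convolution merging the original and negated
        # depthwise feature maps (channel dimension is doubled by concatenation).
        self.fuse = nn.Conv2d(2 * in_ch, out_ch, kernel_size=1, bias=False)
        self.bn_pw = nn.BatchNorm2d(out_ch)
        self.act = nn.LeakyReLU(negative_slope)
        # Residual-like shortcut; projected when the shape changes.
        self.shortcut = (nn.Identity() if stride == 1 and in_ch == out_ch
                         else nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False))

    def forward(self, x):
        dw = self.bn_dw(self.depthwise(x))
        # Negate the depthwise feature maps and fuse them with the originals,
        # so both positive and negative responses reach the next layer.
        fused = torch.cat([dw, -dw], dim=1)
        out = self.act(self.bn_pw(self.fuse(fused)))
        return out + self.shortcut(x)

# Example: one block applied to a 32-channel feature map.
y = LMobileNetBlock(32, 64, stride=2)(torch.randn(1, 32, 56, 56))
print(y.shape)  # torch.Size([1, 64, 28, 28])
```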
Keywords: convolutional neural network  image classification  feature extraction  MobileNet  depthwise separable convolution  activation function  Leaky ReLU  residual structure