A Hand Gesture Segmentation Method Based on Style Transfer
CHEN Ming-yao, XU Kun, LI Xiao-xuan. A Hand Gesture Segmentation Method Based on Style Transfer[J]. Computer and Modernization, 2021, 0(5): 20-25. DOI: 10.3969/j.issn.1006-2475.2021.05.004
Authors: CHEN Ming-yao, XU Kun, LI Xiao-xuan
Affiliation: School of Information Engineering, Chang'an University, Xi'an 710064, Shaanxi, China
Funding: National Natural Science Foundation of China (61703054); Key Research and Development Program of Shaanxi Province (2018ZDXM-GY-044); Aerospace Pre-research Project (030101)
Received: 2021-06-03

Abstract: Hand gesture segmentation methods based on fully convolutional networks depend excessively on large amounts of accurately annotated training samples, and because the extracted features lack sufficient context information, they often suffer from misclassification caused by intra-class inconsistency. To address these issues, a hand gesture segmentation method based on style transfer is proposed. First, the first five layers of the HGR-Net hand gesture segmentation network are selected as the backbone, and a context information enhancement layer is added to each layer of the backbone. In this layer, a global average pooling operation combined with a channel attention mechanism enhances the weights of salient feature channels and preserves the continuity of context information in the features, thereby resolving the intra-class inconsistency. Second, to improve the generalization ability of the proposed segmentation model and address the segmentation of cross-domain samples, a domain adaptation method based on style transfer is proposed: a pre-trained VGG network applies style-transfer preprocessing to the source-domain test images so that they retain their own content while adopting the style of the target-domain training images. On the OUHANDS dataset, the proposed method achieves mIoU and MPA values of 0.9143 and 0.9363, which are 3.2 and 1.8 percentage points higher than those of HGR-Net. On a self-collected dataset, applying the proposed style transfer method improves the mIoU and MPA values by 19 and 23 percentage points, respectively. The style-transfer-based domain adaptation method offers a new approach to cross-domain segmentation of unlabeled samples.
Keywords: hand gesture segmentation; HGR-Net; context; style transfer
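
The context-enhancement step described in the abstract combines global average pooling with a channel attention mechanism that re-weights feature channels. The following NumPy sketch illustrates that general pattern only; the tensor shapes, the two-layer SE-style gate, and the random weights are illustrative assumptions, not the paper's actual HGR-Net configuration.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(features, w1, w2):
    """Rescale `features` (C, H, W) by per-channel attention weights derived
    from global average pooling followed by a two-layer gate (SE-style)."""
    squeeze = features.mean(axis=(1, 2))      # global average pooling -> (C,)
    hidden = np.maximum(0.0, w1 @ squeeze)    # channel reduction + ReLU
    weights = sigmoid(w2 @ hidden)            # per-channel weights in (0, 1)
    return features * weights[:, None, None]  # emphasize salient channels

# Toy usage: a 64-channel feature map with an assumed reduction ratio of 4.
rng = np.random.default_rng(0)
feat = rng.standard_normal((64, 32, 32))
w1 = rng.standard_normal((16, 64)) * 0.1
w2 = rng.standard_normal((64, 16)) * 0.1
enhanced = channel_attention(feat, w1, w2)
print(enhanced.shape)  # (64, 32, 32)

In the actual network the gate weights would be learned jointly with the segmentation backbone; here they are random placeholders used purely to show the data flow of pooling, gating, and channel-wise rescaling.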