Human Action Recognition Based on Saliency Detection and Dense Trajectories
Cite this article: LU Tianran, YU Fengqin, YANG Huizhong, CHEN Ying. Human action recognition based on dense trajectories with saliency detection[J]. Computer Engineering and Applications, 2018, 54(14): 163-167.
Authors: LU Tianran, YU Fengqin, YANG Huizhong, CHEN Ying
Affiliation: School of Internet of Things Engineering, Jiangnan University, Wuxi, Jiangsu 214122, China
Abstract: Dense-trajectory human action recognition densely samples the entire image of every frame, which leads to high feature dimensionality, heavy computation, and the inclusion of irrelevant background information. A human action recognition method based on saliency detection and dense trajectories is proposed. First, multi-scale static saliency detection is applied to the video frames to locate the action subject, and its result is linearly fused with that of dynamic saliency detection on the video to obtain the subject's action region; the original algorithm is improved by extracting dense trajectories only within this region. Then, the Fisher Vector replaces the bag-of-words model for feature encoding, making the feature representation more expressive. Finally, a support vector machine performs the action recognition. Simulation experiments on the KTH and UCF Sports datasets show that the improved algorithm achieves higher recognition accuracy than the original.

Keywords: human action recognition; saliency detection; dense trajectories; Fisher Vector
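The static/dynamic saliency fusion step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, assuming both saliency maps are single-channel arrays of the same size; the fusion weight `alpha` and the threshold `thresh` are hypothetical parameters, and the paper's actual saliency detectors and fusion weights are not reproduced here:

```python
import numpy as np

def fuse_saliency(static_map, dynamic_map, alpha=0.5, thresh=0.5):
    """Linearly fuse min-max-normalized static and dynamic saliency maps,
    then threshold the fused map into a binary action-region mask."""
    def normalize(m):
        m = m.astype(np.float64)
        rng = m.max() - m.min()
        return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

    fused = alpha * normalize(static_map) + (1.0 - alpha) * normalize(dynamic_map)
    return fused >= thresh  # True marks the subject's action region
```

Dense trajectories would then be sampled only at pixels where the mask is `True`, which is what reduces the feature dimensionality and suppresses background trajectories.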

Human action recognition based on dense trajectories with saliency detection
Cite this article: LU Tianran, YU Fengqin, YANG Huizhong, CHEN Ying. Human action recognition based on dense trajectories with saliency detection[J]. Computer Engineering and Applications, 2018, 54(14): 163-167.
Authors:LU Tianran  YU Fengqin  YANG Huizhong  CHEN Ying
Affiliation:School of Internet of Things Engineering, Jiangnan University, Wuxi, Jiangsu 214122, China
Abstract: Human action recognition based on dense trajectories densely samples the whole image of every frame, which leads to high feature dimensionality, a large computational cost, and the inclusion of irrelevant background information. A human action recognition method based on dense trajectories with saliency detection is proposed. First, multi-scale static saliency detection is used to locate the action subject, and its result is linearly fused with that of dynamic saliency detection to obtain the subject's action regions; the original algorithm is improved by extracting dense trajectories only within these regions. Then, to make the feature representation more expressive, the Fisher Vector replaces the bag-of-words (BOW) model for feature encoding. Finally, an SVM produces the recognition results. Experiments on the KTH and UCF Sports datasets show that the proposed method improves recognition accuracy over the original algorithm.
Keywords: human action recognition; saliency detection; dense trajectories; Fisher Vector
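As a rough illustration of the Fisher Vector encoding step that replaces the BOW model, the sketch below computes the improved Fisher Vector of a set of local trajectory descriptors under a diagonal-covariance GMM. The GMM parameters `weights`, `means`, and `sigmas` are assumed inputs (in practice they would be learned from training features); this is a generic FV implementation, not the paper's exact pipeline:

```python
import numpy as np

def fisher_vector(X, weights, means, sigmas):
    """Improved Fisher Vector of descriptors X (N, D) w.r.t. a diagonal
    GMM with mixture weights (K,), means (K, D), and std devs sigmas (K, D).
    Returns a 2*K*D vector (gradients w.r.t. means and std devs),
    power-normalized and L2-normalized."""
    N, _ = X.shape
    diff = X[:, None, :] - means[None, :, :]                     # (N, K, D)
    # Log-likelihood of each descriptor under each diagonal Gaussian.
    log_prob = -0.5 * np.sum(diff**2 / sigmas**2
                             + np.log(2 * np.pi * sigmas**2), axis=2)
    log_post = np.log(weights) + log_prob                        # (N, K)
    log_post -= log_post.max(axis=1, keepdims=True)              # stabilize
    gamma = np.exp(log_post)
    gamma /= gamma.sum(axis=1, keepdims=True)                    # posteriors
    # Gradients w.r.t. means and standard deviations.
    g_mu = (gamma[..., None] * diff / sigmas).sum(0) \
           / (N * np.sqrt(weights)[:, None])                     # (K, D)
    g_sig = (gamma[..., None] * (diff**2 / sigmas**2 - 1)).sum(0) \
            / (N * np.sqrt(2 * weights)[:, None])                # (K, D)
    fv = np.concatenate([g_mu.ravel(), g_sig.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                       # power norm
    return fv / max(np.linalg.norm(fv), 1e-12)                   # L2 norm
```

The resulting 2KD-dimensional vector per video would then be fed to the SVM classifier; compared with a BOW histogram, it retains first- and second-order statistics of the descriptors rather than only occurrence counts.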