Thresholding binary coding for image forensics of weak sharpening
Affiliation: 1. Department of Computer and Software, Nanjing University of Information Science and Technology, Nanjing, China; 2. Nanjing University of Information Science and Technology, Nanjing, China
Abstract: Image forensics of sharpening has attracted great interest from researchers in recent decades. State-of-the-art techniques achieve high accuracy in detecting strong sharpening, while detecting weak sharpening remains a challenge. This paper proposes an algorithm based on thresholding binary coding (TBC) for image sharpening detection. The overshoot artifact introduced by sharpening enlarges the difference between the local maximum and minimum of both image pixels and unsharp mask elements; based on this observation, a threshold local binary pattern (TLBP) operator is applied to capture the traces of sharpening. The patterns are then coded according to rotation-symmetry invariance and texture type. Features are extracted from the statistical distribution of the coded patterns and fed to a classifier for sharpening detection. In practice, two classifiers are constructed, for lightweight and offline applications respectively: a single Fisher linear discriminant (FLD) with 182 features, and an ensemble classifier (EC) with 5460 features. Experimental results on the BOSS, NRCS, and RAISE datasets show that, for weak sharpening detection, the FLD outperforms a CNN and SVMs with EPTC, EPBC, and LBP features, and the EC with TBC features further improves performance, obtaining better results than ECs with TLBP and SRM features. In addition, the proposed algorithm is robust to post-JPEG compression and noise addition and can differentiate sharpening from other manipulations.
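As a rough illustration of the pipeline described in the abstract, the Python sketch below computes thresholded local binary pattern histograms on an image and on its unsharp-mask residual. The 3x3 neighbourhood, the threshold T, the Gaussian residual, and the plain 256-bin histogram are illustrative assumptions; the paper's actual TBC coding (rotation-symmetry merging, texture-type grouping, and the 182/5460-dimensional feature sets) is not reproduced here.

# Minimal sketch, under the assumptions stated above, of a thresholded-LBP
# style feature extractor for sharpening detection.
import numpy as np
from scipy.ndimage import gaussian_filter

def thresholded_lbp_hist(channel: np.ndarray, T: float = 2.0) -> np.ndarray:
    """Normalized histogram of 8-bit thresholded LBP codes over a 2-D array."""
    c = channel[1:-1, 1:-1].astype(np.float64)          # centre pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]        # 8 neighbours, clockwise
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        nb = channel[1 + dy: channel.shape[0] - 1 + dy,
                     1 + dx: channel.shape[1] - 1 + dx].astype(np.float64)
        # A neighbour sets its bit only when it exceeds the centre by more than
        # the threshold T, suppressing flat-region noise while keeping the
        # strong local extrema that sharpening overshoot amplifies.
        codes |= ((nb - c) > T).astype(np.int32) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()

def sharpening_features(image: np.ndarray, sigma: float = 1.0,
                        T: float = 2.0) -> np.ndarray:
    """Concatenate TLBP histograms of the image and its unsharp-mask residual."""
    img = image.astype(np.float64)
    residual = img - gaussian_filter(img, sigma=sigma)   # unsharp mask elements
    return np.concatenate([thresholded_lbp_hist(img, T),
                           thresholded_lbp_hist(residual, T)])

Feature vectors computed this way for sharpened and original training images could then be fed to a linear classifier such as scikit-learn's LinearDiscriminantAnalysis, loosely mirroring the lightweight FLD configuration described in the abstract.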
Keywords:Image forensics  Texture pattern mapping  Thresholding binary coding  Unsharp mask  Weak sharpening
This article is indexed in databases including ScienceDirect.