A CMA-ES-Based Adversarial Attack Against Black-Box Object Detectors
Authors: LYU Haoran, TAN Yu'an, XUE Yuan, WANG Yajie, XUE Jingfeng
Abstract: Object detection is one of the essential tasks of computer vision. Object detectors based on deep neural networks are used increasingly widely in safety-sensitive applications such as face recognition, video surveillance, and autonomous driving. It has been shown that object detectors are vulnerable to adversarial attacks. We propose a novel black-box attack method that can successfully attack both regression-based and region-based object detectors. Using the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as the primary method for generating adversarial examples, we introduce techniques that reduce the search dimension of the optimization problem and thereby reduce the number of queries. Our method adds adversarial perturbations only inside the object box, achieving a precise attack. The proposed attack can hide a specified object with an attack success rate of 86% and an average of 5,124 queries, and hide all objects with a success rate of 74% and an average of 6,154 queries. Our work illustrates the effectiveness of CMA-ES for generating adversarial examples and demonstrates the vulnerability of object detectors to adversarial attacks.
Keywords: Adversarial example; Black-box attack; Object detector; Deep neural network
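The abstract does not give the attack's details, but the core idea — query a black-box detector, and evolve a perturbation restricted to the object box with an evolution strategy — can be illustrated with a simplified (μ, λ) evolution-strategy sketch. Everything below is an assumption for illustration: `detector_confidence` is a toy stand-in for querying a real detector (the paper attacks actual regression- and region-based detectors), and the plain isotropic sampling omits CMA-ES's covariance and step-size adaptation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a black-box detector query: returns a "confidence" score
# for a hypothetical object. The attacker sees only this scalar, no gradients.
def detector_confidence(image):
    # Confidence here simply depends on pixel energy inside the object box.
    return float(np.mean(image[8:24, 8:24] ** 2))

image = rng.uniform(0.4, 0.6, size=(32, 32))  # toy grayscale image
box = (slice(8, 24), slice(8, 24))            # object box: perturb only here

dim = image[box].size   # search dimension = number of pixels inside the box
lam, mu = 20, 5         # offspring and parent counts of the (mu, lam)-ES
mean = np.zeros(dim)    # mean of the search distribution over perturbations
sigma = 0.05            # global step size (fixed decay, unlike real CMA-ES)
eps = 0.2               # L_inf budget for the perturbation

def apply(delta):
    """Add the perturbation inside the object box only."""
    perturbed = image.copy()
    perturbed[box] = np.clip(image[box] + delta.reshape(16, 16), 0.0, 1.0)
    return perturbed

queries = 0
base_score = detector_confidence(image)

for gen in range(60):
    # Sample lam candidate perturbations around the current mean.
    cands = mean + sigma * rng.standard_normal((lam, dim))
    cands = np.clip(cands, -eps, eps)
    # One black-box query per candidate; lower confidence = better attack.
    scores = np.array([detector_confidence(apply(c)) for c in cands])
    queries += lam
    # Recombine the best mu candidates into the new mean.
    elite = cands[np.argsort(scores)[:mu]]
    mean = elite.mean(axis=0)
    sigma *= 0.98  # crude step-size decay in place of CMA-ES adaptation

final_delta = np.clip(mean, -eps, eps)
final_score = detector_confidence(apply(final_delta))
print(f"confidence {base_score:.4f} -> {final_score:.4f} after {queries} queries")
```

Restricting the search to the pixels of the object box is what keeps the optimization dimension (here 256 instead of 1024) and hence the query count low, which is the trade-off the abstract's query numbers reflect.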
This article is indexed in databases including Wanfang Data.