CAMA: Class activation mapping disruptive attack for deep neural networks

Sainan Sun, Bin Song, Xiaohui Cai, Xiaojiang Du, Mohsen Guizani

Research output: Contribution to journal › Article › peer-review

Abstract

The emergence of adversarial examples has drawn widespread attention to the security of deep learning. Most recent research focuses on generating adversarial examples that cause a network to mispredict, and rarely examines the resulting changes in the feature embedding space from the perspective of interpretability. In addition, researchers have proposed various attack algorithms for individual tasks, but few general methods can attack multiple tasks at the same time, such as image classification, object detection, and face recognition. To resolve these issues, we propose CAMA, a new attack algorithm for deep neural networks (DNNs). CAMA perturbs each feature extraction layer through an adaptive feature measurement function, thereby disrupting the predicted class activation mapping of the DNN. Experiments show that CAMA excels at crafting white-box adversarial examples against classification networks and achieves the highest attack success rate. To counter the loss of attack effectiveness caused by image transformations, we propose spread-spectrum compression CAMA, which achieves a better attack success rate under various defensive measures. In addition, CAMA successfully attacks face recognition and object detection networks with excellent performance, verifying that it is a general attack algorithm across different tasks.
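The abstract describes CAMA only at a high level. As a rough illustration of the idea, the PyTorch sketch below implements a generic CAM-disruption attack: it perturbs an input so that the spatial activation maps of selected feature layers move away from their clean values while the classification loss rises. The channel-mean "activation map" proxy, the MSE distance, the PGD-style update, and the function name cama_like_attack are illustrative assumptions, not the paper's adaptive feature measurement function or its actual formulation.

```python
import torch
import torch.nn.functional as F
from torchvision import models

def spatial_map(feat):
    """Channel-mean spatial map, min-max normalized per sample.
    (Assumed proxy for a class activation map; not the paper's measure.)"""
    m = feat.mean(dim=1).flatten(1)                      # (B, H*W)
    m = m - m.min(dim=1, keepdim=True).values
    return m / (m.max(dim=1, keepdim=True).values + 1e-8)

def cama_like_attack(model, layers, x, y, eps=8/255, alpha=2/255, steps=10):
    feats = {}
    def make_hook(name):
        def hook(_module, _inputs, output):
            feats[name] = output
        return hook
    handles = [l.register_forward_hook(make_hook(i)) for i, l in enumerate(layers)]

    # Record each layer's clean activation map once.
    with torch.no_grad():
        model(x)
        clean_maps = {k: spatial_map(v) for k, v in feats.items()}

    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        logits = model(x_adv)
        # Jointly push every layer's map away from its clean counterpart
        # and push the prediction away from the true label.
        cam_loss = sum(F.mse_loss(spatial_map(feats[k]), clean_maps[k])
                       for k in clean_maps)
        loss = cam_loss + F.cross_entropy(logits, y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():                            # PGD-style ascent step
            x_adv = x_adv + alpha * grad.sign()
            x_adv = x + (x_adv - x).clamp(-eps, eps)     # L-inf projection
            x_adv = x_adv.clamp(0, 1)
    for h in handles:
        h.remove()
    return x_adv.detach()

# Example usage with a standard ResNet-50; the layer choice is arbitrary.
model = models.resnet50(weights="IMAGENET1K_V1").eval()
x = torch.rand(1, 3, 224, 224)        # placeholder image in [0, 1]
y = torch.tensor([0])                 # placeholder label
x_adv = cama_like_attack(model, [model.layer2, model.layer3, model.layer4], x, y)
```

Hooking several intermediate layers, rather than only the final one, mirrors the abstract's claim that CAMA perturbs each feature extraction layer; which layers and how their losses are weighted are free design choices in this sketch.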

Original language: English
Pages (from-to): 989-1002
Number of pages: 14
Journal: Neurocomputing
Volume: 500
State: Published - 21 Aug 2022

Keywords

  • Adversarial attack
  • Deep neural networks
  • Image classification
  • Multi-task attack
  • White-box attack
