MAP: Masked Adversarial Perturbation for Boosting Black-Box Attack Transferability

  • Kaige Li
  • Maoxian Wan
  • Qichuan Geng*
  • Weimin Shi
  • Xiaochun Cao
  • Zhong Zhou*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The transferability of adversarial examples is vital for black-box attacks, as it enables the adversary to deceive the target model without knowing its internals. Despite numerous methods focusing on transferability, they still struggle to transfer across models with distinct architectural components (e.g., CNNs and ViTs). In this work, we argue that limited adversarial perturbation diversity leads to overfitting to the surrogate model, which acts as a key factor in reducing transferability. To this end, we propose a Masked Adversarial Perturbation (MAP) method to boost adversarial transferability across diverse architectures from a novel perspective of perturbation diversification. Specifically, MAP randomly masks perturbation patches during iterations and compels the remaining ones to retain the attack effect, which diversifies perturbations to mitigate their overfitting to the surrogate model. Naturally, MAP spreads the perturbation over local patches to alleviate their co-adaptation and prevent perturbations from overly relying on specific patterns. Consequently, it can deceive convolution operations and self-attention mechanisms alike by attacking their basic input unit, i.e., a single patch, showing superior transferability over previous methods. Extensive experiments illustrate that MAP consistently and significantly boosts diverse black-box attacks to achieve state-of-the-art performance.
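The patch-masking step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the 16-pixel patch size, and the 50% mask ratio are assumptions chosen for clarity.

```python
import numpy as np

def mask_perturbation(perturbation, patch_size=16, mask_ratio=0.5, rng=None):
    """Randomly zero out square patches of an adversarial perturbation.

    At each attack iteration, dropping a random subset of patches forces the
    surviving patches to carry the attack on their own, which is the
    diversification idea the abstract describes.  All hyperparameters here
    are illustrative assumptions.

    perturbation: (C, H, W) array; H and W must be divisible by patch_size.
    mask_ratio:   fraction of patches to zero out this iteration.
    """
    if rng is None:
        rng = np.random.default_rng()
    _, h, w = perturbation.shape
    gh, gw = h // patch_size, w // patch_size
    n_patches = gh * gw
    n_drop = int(n_patches * mask_ratio)

    # Build a binary keep/drop mask on the patch grid.
    mask = np.ones(n_patches, dtype=perturbation.dtype)
    drop = rng.choice(n_patches, size=n_drop, replace=False)
    mask[drop] = 0.0

    # Upsample the patch-level mask to pixel resolution, then broadcast
    # the (H, W) mask over the channel dimension.
    mask = np.kron(mask.reshape(gh, gw),
                   np.ones((patch_size, patch_size), dtype=perturbation.dtype))
    return perturbation * mask
```

In an iterative attack loop, the masked perturbation would be applied to the input before the forward pass, so the gradient update only reinforces the patches that survived the mask.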

Original language: English
Pages (from-to): 4426-4439
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 34
DOIs
State: Published - 2025

Keywords

  • Adversarial examples
  • adversarial transferability
  • black-box attack
  • masked perturbation

