A Multimodal Adversarial Attack Framework Based on Local and Random Search Algorithms

Although neural networks have driven breakthrough progress on many problems in computer vision and natural language processing, adversarial attacks are a serious potential problem for many neural-network-based applications. Attackers can mislead classifiers with slightly perturbed inputs, known as adversarial examples. Because existing adversarial attacks are application-specific and hard to apply generally, we propose a multimodal adversarial attack framework that attacks both text and image classifiers. The framework first generates a candidate set to find substitution words or pixels and produce candidate adversarial examples. It then updates the candidate set and searches for adversarial examples with three local or random search methods: beam search, genetic algorithm (GA) search, and particle swarm optimization (PSO) search. Experiments demonstrate that the framework effectively generates both image and text adversarial examples. Compared with other image adversarial attacks on the MNIST dataset, the PSO search in the framework achieves a 98.4% attack success rate, outperforming the other methods, while beam search attains the best attack efficiency and imperceptibility to humans on both the MNIST and CIFAR-10 datasets. Compared with other text adversarial attacks, the beam search in the framework achieves a 91.5% attack success rate, outperforming both the existing attacks and the other proposed search methods. Beam search also outperforms the other methods in attack efficiency, meaning it can craft text adversarial examples with less perturbation.
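To illustrate the two-step procedure the abstract describes for text (build a candidate set of substitutions, then refine it with a search method), here is a minimal beam-search sketch. It is not the authors' implementation: the classifier interface (predict_proba), the substitution source (get_candidates), and the beam width are hypothetical placeholders chosen for illustration.

```python
from typing import Callable, List, Tuple


def beam_search_attack(
    tokens: List[str],
    true_label: int,
    predict_proba: Callable[[List[str]], List[float]],  # assumed classifier API
    get_candidates: Callable[[str], List[str]],          # assumed substitution source
    beam_width: int = 3,
) -> List[str]:
    """Substitute words position by position, keeping the `beam_width`
    examples that most reduce the classifier's confidence in the true label."""
    beam: List[Tuple[float, List[str]]] = [
        (predict_proba(tokens)[true_label], list(tokens))
    ]
    for i in range(len(tokens)):
        expanded: List[Tuple[float, List[str]]] = []
        for score, example in beam:
            expanded.append((score, example))  # option: leave position i unchanged
            for sub in get_candidates(tokens[i]):  # candidate set for position i
                candidate = example[:i] + [sub] + example[i + 1:]
                expanded.append((predict_proba(candidate)[true_label], candidate))
        # Keep only the partial examples that are most adversarial so far.
        expanded.sort(key=lambda pair: pair[0])
        beam = expanded[:beam_width]
        # Stop once the best example is already misclassified.
        best_probs = predict_proba(beam[0][1])
        if best_probs.index(max(best_probs)) != true_label:
            break
    return beam[0][1]
```

The GA and PSO variants mentioned in the abstract would replace the beam-expansion step with population-based updates of the same candidate set.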

Bibliographic Details
Main Authors: Zibo Yi, Jie Yu, Yusong Tan, Qingbo Wu
Format: Article
Language: English
Published: Atlantis Press 2021-07-01
Series: International Journal of Computational Intelligence Systems
ISSN: 1875-6883
Subjects: Adversarial attack; Multimodal applications; Adversarial image; Adversarial text; Local search; Random search
Online Access: https://www.atlantis-press.com/article/125958419/view