Availability Attacks against Neural Networks

Schneier on Security 2020-06-10

Summary:

New research on using specially crafted inputs to slow down neural-network systems: "Sponge Examples: Energy-Latency Attacks on Neural Networks" shows how to find adversarial examples that cause a DNN to burn more energy, take more time, or both. They affect a wide range of DNN applications, from image recognition to natural language processing (NLP). Adversaries might use these...
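
The summary doesn't include code. As a minimal sketch of the black-box flavor of the attack, assuming a toy PyTorch victim model and a simple genetic search that uses measured wall-clock inference latency as its fitness (the model, population size, mutation scale, and all names here are illustrative, not from the paper):

    # Hypothetical sketch: evolve "sponge" inputs that maximize the
    # measured inference latency of a victim model.
    import time
    import torch
    import torch.nn as nn

    # Stand-in victim model; in practice this would be the target DNN.
    model = nn.Sequential(
        nn.Linear(128, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 10),
    )
    model.eval()

    def latency(x: torch.Tensor, trials: int = 20) -> float:
        """Fitness: median wall-clock time of a forward pass on x."""
        times = []
        with torch.no_grad():
            for _ in range(trials):
                start = time.perf_counter()
                model(x)
                times.append(time.perf_counter() - start)
        times.sort()
        return times[len(times) // 2]

    def mutate(x: torch.Tensor, scale: float = 0.1) -> torch.Tensor:
        """Perturb a candidate input with Gaussian noise."""
        return x + scale * torch.randn_like(x)

    # Simple genetic loop: the slowest inputs survive and are mutated.
    pop = [torch.randn(1, 128) for _ in range(16)]
    for generation in range(10):
        scored = sorted(pop, key=latency, reverse=True)
        parents = scored[:4]
        pop = parents + [mutate(p) for p in parents for _ in range(3)]

    sponge = max(pop, key=latency)
    print(f"slowest input found: {latency(sponge):.6f} s per forward pass")

On a plain feed-forward toy model like this one, latency barely depends on the input, so the search has little to exploit; the effect is strongest where computation is data-dependent, such as NLP pipelines whose internal sequence lengths vary with the input, or hardware optimizations that skip zero-valued activations.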

Link:

https://www.schneier.com/blog/archives/2020/06/availability_at.html

From feeds:

Gudgeon and gist » Schneier on Security
Berkman Center Community - Test » Schneier on Security

Tags:

academicpapers

Authors:

Bruce Schneier

Date tagged:

06/10/2020, 14:36

Date published:

06/10/2020, 07:31