Availability Attacks against Neural Networks
Schneier on Security 2020-06-10
Summary:
New research on using specially crafted inputs to slow down machine-learning neural network systems: "Sponge Examples: Energy-Latency Attacks on Neural Networks" shows how to find adversarial examples that cause a DNN to burn more energy, take more time, or both. They affect a wide range of DNN applications, from image recognition to natural language processing (NLP). Adversaries might use these...
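As a rough illustration of the idea (not the paper's actual method, which uses genetic algorithms and hardware energy measurements), here is a minimal black-box sketch in Python: a hill-climbing loop that mutates an input and keeps whichever variant makes inference slowest, using wall-clock latency as a crude proxy for energy. The toy model and all parameters are assumptions for the sketch.

import time
import torch
import torch.nn as nn

# Toy stand-in for a deployed DNN; a real attack targets a production model.
model = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()

def latency(x: torch.Tensor, trials: int = 20) -> float:
    """Median inference time for input x, in seconds."""
    times = []
    with torch.no_grad():
        for _ in range(trials):
            start = time.perf_counter()
            model(x)
            times.append(time.perf_counter() - start)
    times.sort()
    return times[len(times) // 2]

# Hill-climb: perturb the current best input and keep it if inference slows.
best = torch.randn(1, 64)
best_t = latency(best)
for _ in range(100):
    candidate = best + 0.1 * torch.randn_like(best)
    t = latency(candidate)
    if t > best_t:
        best, best_t = candidate, t

print(f"slowest input found: {best_t * 1e6:.1f} us per inference")

On this tiny dense model the latency spread will be small; the paper's point is that on real systems (especially NLP pipelines with input-dependent computation) such a search can find inputs that are dramatically more expensive to process.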
Link:
https://www.schneier.com/blog/archives/2020/06/availability_at.html