Futures

New Tool Nightshade Allows Artists to Fight Back Against Generative AI (2023-10-29)

Summary

A new data-poisoning tool called Nightshade lets artists fight back against generative AI. Nightshade adds invisible, pixel-level changes to artwork before it is uploaded online; if the images are later scraped into a training set, the poisoned data can break image-generating models in unpredictable ways. The tool targets the practice of AI companies using artists' work without permission, and it can damage future iterations of models such as DALL-E, Midjourney, and Stable Diffusion. Developed by a team led by Ben Zhao, Nightshade is intended to tip the power balance back toward artists. It is designed to be integrated into Glaze, an earlier tool from the same team that masks an artist's personal style, and it is open source, so others can use and modify it. Nightshade exploits a security vulnerability in generative AI models, manipulating images so that trained models malfunction. While there is a risk of malicious use, an attacker would need thousands of poisoned samples to affect a large model trained on billions of images. The research underscores the need for defenses against poisoning attacks on modern machine learning models, and it has been praised both for exposing these vulnerabilities and for defending artists' rights. By giving artists a credible way to protect their work, Nightshade could push AI companies to respect artists' rights, consider paying royalties, and change the status quo.
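The article includes no code, but the core mechanism it describes, publishing an image with an imperceptible, bounded perturbation, can be sketched. The snippet below is a toy illustration only: Nightshade's actual perturbations are optimized against a model's feature space so that poisoned images teach wrong concept associations, whereas this sketch just applies bounded random noise. The `perturb_image` helper and the file names are hypothetical.

```python
# Toy sketch of the "imperceptible change before upload" idea.
# NOT Nightshade's algorithm: real poisoning tools optimize the
# perturbation against a surrogate model; this uses seeded random noise.

import numpy as np
from PIL import Image

def perturb_image(path: str, out_path: str, epsilon: int = 4) -> None:
    """Add a small, visually negligible perturbation to an image.

    epsilon bounds the per-pixel change in 0-255 units, so the edit
    stays invisible to a human viewer.
    """
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed=0)
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

perturb_image("artwork.png", "artwork_protected.png")  # hypothetical files
```

Random noise like this would not actually poison a model; the sketch only shows the shape of the pipeline (a bounded, invisible change applied before publishing). Nightshade's per-image optimization is what makes a few thousand such samples enough to corrupt specific concepts in a model trained on billions of images.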

Keywords

Themes

Signals

| Signal | Change | 10y horizon | Driving force |
| --- | --- | --- | --- |
| New data-poisoning tool for artists | Artists fighting back against AI | AI models produce unpredictable outputs | Protecting artists' copyright and IP |
| Nightshade tool sabotages AI training | Damage to image-generating AI models | AI companies respecting artists' rights | Deterrence against unauthorized use of artwork |
| Glaze tool masks artists' personal style | Prevents AI companies from scraping artists' work | Artists have more control over their artwork | Protecting artists' copyright and IP |
| Nightshade integrated into Glaze | Artists can choose to use the data-poisoning tool | More powerful and widespread use of the tool | Empowerment of artists |
| Vulnerability in generative AI models | Risk of poisoning attacks on models | Need for robust defenses against attacks | Urgency to develop defenses for AI models |
| Nightshade as a powerful deterrent | AI companies respect artists' rights | More willingness to pay royalties | Shifting power dynamics in the creative industry |