Futures

Nightshade Tool Receives Surprising Response (2024-02-18)


Summary

Nightshade, a new tool developed by computer science researchers at the University of Chicago, has gained significant traction, with 250,000 downloads in the five days since its release. The tool is designed to disrupt AI models that scrape and train on artists’ works without consent: Nightshade “poisons” generative AI image models by altering artworks at the pixel level, so that models perceive them as containing different content than a human viewer sees. The same team previously developed Glaze, a defensive tool that prevents AI models from learning an artist’s signature style, and plans to release a combined version of Glaze and Nightshade in the future.
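The core idea of pixel-level alteration can be illustrated with a toy sketch: changes small enough to be nearly invisible to a human can still shift what a model extracts from an image. This is only an illustration of bounded pixel perturbation, not Nightshade's actual algorithm, which optimizes perturbations against specific model representations; the function name and parameters here are invented for the example.

```python
import numpy as np

def perturb(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a small, bounded random perturbation to each pixel.

    Toy illustration only: Nightshade computes targeted perturbations
    against model features, not random noise. Here `epsilon` caps the
    per-pixel change so the edit stays visually negligible.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Clip back into the valid 8-bit range before casting.
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

# A flat grey 4x4 RGB array stands in for an artwork.
image = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = perturb(image)

# Every pixel moved by at most epsilon, so the image looks unchanged.
max_change = int(np.abs(poisoned.astype(int) - image.astype(int)).max())
print(max_change)
```

The point of the sketch is the asymmetry it hints at: a human comparing `image` and `poisoned` sees the same grey square, while a model ingesting millions of such subtly altered works can have its training distribution skewed.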

Keywords

Themes

Signals

Signal: Nightshade receives 250,000 downloads in 5 days
Change: From low adoption to high adoption of the Nightshade tool
10y horizon: More artists using Nightshade to protect their work
Driving force: Artists’ desire to protect their work from unauthorized use

Signal: Nightshade alters artworks to disrupt AI models
Change: From AI models training on unlicensed data to increased cost of training
10y horizon: Increased licensing of images from creators
Driving force: Making licensing images from creators a viable alternative

Signal: Glaze/Nightshade team plans to release a combined tool
Change: From separate defensive and offensive tools to a combined tool
10y horizon: Comprehensive tests done to ensure no surprises
Driving force: Need to carefully test the combined tool to avoid surprises

Signal: Artists using both Glaze and Nightshade
Change: From using one tool to using two tools
10y horizon: Protecting style while disrupting AI model training
Driving force: Desire to protect style while disrupting AI model training

Signal: Possibility of an open-source version of Nightshade
Change: From closed-source to open-source Nightshade
10y horizon: More flexibility and customization for users
Driving force: More time required to develop different versions
