Artists can now use this data 'poisoning' tool to fight back against AI scrapers

The University of Chicago’s Glaze Project has released Nightshade v1.0, which lets artists sabotage generative AI models that ingest their work for training. Nightshade makes invisible pixel-level changes to images that trick AI models into reading them as something else, corrupting the models’ image output; for example, a cubism-style image might be read as a cartoon. It’s out now for Windows PCs and Apple Silicon Macs.
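The article only gestures at the mechanism, but the broad class of technique it describes, small targeted pixel changes that shift how a model reads an image, can be sketched as follows. This is a minimal, hypothetical illustration, not Nightshade's published method: the ResNet-18 classifier, the ImageNet target label, and the step and epsilon values below are placeholder assumptions chosen purely for demonstration.

```python
# Hypothetical sketch of an "invisible" pixel-level perturbation.
# NOT Nightshade's actual algorithm; model, labels, and hyperparameters
# are placeholder assumptions for illustration only.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image


def perturb_image(path: str, target_class: int, steps: int = 50,
                  epsilon: float = 4 / 255, step_size: float = 1 / 255) -> torch.Tensor:
    """Nudge an image toward a different label under a small L-infinity budget."""
    model = models.resnet18(weights="DEFAULT").eval()   # stand-in classifier (assumption)
    for p in model.parameters():
        p.requires_grad_(False)

    x = T.Compose([T.Resize((224, 224)), T.ToTensor()])(
        Image.open(path).convert("RGB")).unsqueeze(0)   # (1, 3, 224, 224), values in [0, 1]
    delta = torch.zeros_like(x, requires_grad=True)     # the "invisible" change
    target = torch.tensor([target_class])

    for _ in range(steps):
        loss = torch.nn.functional.cross_entropy(model(x + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= step_size * delta.grad.sign()          # step toward the target label
            delta.clamp_(-epsilon, epsilon)                 # keep the change imperceptible
            delta.copy_(torch.clamp(x + delta, 0, 1) - x)   # keep pixel values valid
        delta.grad.zero_()

    return (x + delta).squeeze(0)  # perturbed image, visually near-identical to the original
```

Calling perturb_image on a painting with some target class index would return a version that looks unchanged to a human but that this particular classifier reads as the target label. Nightshade itself targets text-to-image training pipelines rather than a single classifier, and its actual optimization is not shown here.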
