The University of Chicago’s Glaze Project has released Nightshade v1.0, a tool that lets artists sabotage generative AI models that ingest their work for training. Nightshade makes invisible, pixel-level changes to images that trick AI models into reading them as something else, corrupting the models’ output; for example, a poisoned image can cause a model to identify a cubist style as cartoon. It’s out now for Windows PCs and Apple Silicon Macs.
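Nightshade’s actual optimization is not reproduced here; as a rough sketch of the underlying idea, a small, norm-bounded perturbation pushed toward a wrong label, the following PyTorch snippet perturbs an image so that a stand-in classifier reads it as a different class. The tiny model, the “cartoon” class id, the step count, and the epsilon budget are all illustrative assumptions, not Nightshade’s parameters.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in classifier: a tiny, randomly initialized CNN. Nightshade
# targets real text-to-image training pipelines; this is only a prop.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 10),  # 10 hypothetical style classes
)
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the perturbation is optimized

image = torch.rand(1, 3, 64, 64)  # placeholder for an artwork
target = torch.tensor([3])        # hypothetical "cartoon" class id
epsilon = 4 / 255                 # per-pixel budget, visually invisible

delta = torch.zeros_like(image, requires_grad=True)
loss_fn = nn.CrossEntropyLoss()

for _ in range(40):
    # Drive the prediction toward the *wrong* label so a model
    # trained on the poisoned image mislearns the style.
    loss = loss_fn(model(image + delta), target)
    loss.backward()
    with torch.no_grad():
        delta -= (epsilon / 10) * delta.grad.sign()  # gradient step
        delta.clamp_(-epsilon, epsilon)              # stay invisible
        delta.grad.zero_()

poisoned = (image + delta).clamp(0, 1).detach()
print("max per-pixel change:", delta.abs().max().item())
print("predicted class:", model(poisoned).argmax(dim=1).item())
```

The key design point the sketch shares with the real tool is the tight per-pixel clamp: the perturbation is large enough to mislead training, but too small for a human viewer to notice.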
Related coverage:
- New Data-Poisoning Tool Aims to Help Artists Fight Back Against AI (ign.com)
- Worried about AI hijacking your voice for a deepfake? This tool could help (wcsufm.org)