A new data "poisoning" tool allows artists to fight back against generative artificial intelligence systems trained on unlicensed works by corrupting images. The tool, Nightshade, was created to help artists protect their work from AI companies seeking to train their models on unlicensed images.
Nightshade could be used to damage future iterations of image-generating AI models, leveling the playing field between AI companies and artists.