that her art had been scraped into an AI image data set. Especially my fine art work, and that to me felt really invasive, because I had never given anyone my permission to do that. On Midjourney, another popular generator, it's incredibly easy to find posts using Karla's name to generate work that looks incredibly similar to hers, and the same is true for dozens of other artists online. So earlier this year, Karla and a group of other artists filed a class action lawsuit against Stability AI and a group of other AI image generators. In the meantime, Karla made the decision to take her work off the internet wherever she could. She figured it was the only way to keep a computer from scraping her work into an image data set without her consent. But what if she could still show her work online and keep it from being used to help generate new AI art? Honestly, we just never had any idea it was such an impactful problem. This is Professor Ben Zhao, from the University of Chicago. He and his lab say they have developed a solution. They call it Glaze. At its core, Glaze uses the fact that there is this ginormous gap, a difference, between the way humans see visual images and how learning