University of Chicago program Nightshade 'poisons' digital art to prevent generative AI replication

By Jason Knowles
Friday, January 26, 2024
A program called Nightshade developed at the University of Chicago poisons digital art so that generative AI cannot replicate it.

CHICAGO (WLS) -- You may use AI to help you write a letter or an email, but did you know it can also be used to recreate someone's art?

Leonardo da Vinci's Mona Lisa is one of the world's most famous paintings and can be found all over the internet, making it easy for artificial intelligence to copy and mimic his style and work.

But what if the image is poisoned? AI could still try to copy it, but the result would come out completely wrong.

"We need a defensive tool like this to protect our work, to protect our look, and to maintain our place in the market," said artist Steven Zapata.

This defensive tool is called Nightshade. Developed by University of Chicago Professor Ben Zhao and his team of PhD students, Nightshade poisons image data so that AI models misinterpret the image and are unable to reproduce it.

Artists and creatives have been concerned about generative AI replicating and stealing their work and artistic style, and potentially threatening their careers.

Professor Zhao and his team hope Nightshade will protect artists and their creative industries.

"AI models observe or extract features or patterns from images in ways that are very different from our own visual cortex," Zhao said. "And so what we can do is we can design mathematical functions that leverage that. So we can, um, compute ways to make AI models see something completely differently while not changing much, if at all, of what humans actually see of the art."

"I think we're trying to alter the landscape of how AI models get training data. Uh, and of course, the goal is to make it so expensive by scraping and getting potentially poisoned data that AI companies will actually go out and pay artists and real people to license their actual content, just like you would in any other industry. So in that respect, it actually goes beyond individual artists," he added.

Some companies building generative AI, or AI that produces content, have been replicating or using artists' work and style without their permission.

Zapata is concerned his art may have been copied by AI.

"The status quo with these models is that we all woke up one day in 2022 to find out, like, oh, I'm probably in this. And now I have to compete against this thing and it's devaluing my whole career," he said.

Nightshade will be released to the public this month, and the University of Chicago will make it available to individual artists. The university said many companies have reached out about the tool, but it has not yet decided how it will be released to businesses.

Copyright © 2024 WLS-TV. All Rights Reserved.