New tool ‘Nightshade’ helps artists protect their work and corrupt AI data sets: Report

The tool, known as Nightshade, alters images so that they look normal to human viewers but confuse AI models and lead to incorrect results

Published - October 31, 2023 02:06 pm IST

Nightshade comes from the laboratory of Ben Zhao, a professor at the University of Chicago [File] | Photo Credit: AP

A new tool known as Nightshade could help artists protect their copyrighted work from being used for AI model training, by corrupting the systems which scrape media data without creator consent, reported the MIT Technology Review earlier this month.

Nightshade comes from the laboratory of Ben Zhao, a professor at the University of Chicago. The digital tool would allow artists to modify the pixels of their pieces so that the art still looks the same to human viewers, but would confuse AI systems that use it for training. MIT Technology Review shared photos showing "poisoned" image generation models which resulted in warped pictures of everyday objects. These would likely frustrate future end users of the model, especially those who have paid for enterprise versions of it.
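Nightshade's actual perturbations are the product of a targeted optimization against image-generation models, which the report does not detail. Purely as an illustration of the underlying idea, that a change bounded tightly enough is invisible to a human but still alters the data a model trains on, here is a minimal sketch; the `epsilon` bound and random noise are assumptions for demonstration, not Nightshade's method:

```python
import numpy as np

def poison_image(image: np.ndarray, epsilon: int = 2, seed: int = 0) -> np.ndarray:
    """Add a small, bounded perturbation to an 8-bit image array.

    Illustrative only: Nightshade computes targeted perturbations that
    mislead a model about what an image depicts; this sketch just adds
    random noise bounded by `epsilon` so the change stays imperceptible.
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Clip so pixel values stay in the valid 0-255 range after the shift.
    return np.clip(image.astype(np.int16) + noise, 0, 255).astype(np.uint8)

# A uniform gray 64x64 RGB image as a stand-in for an artwork.
original = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = poison_image(original)
# No pixel moves by more than epsilon, so the images look identical.
assert np.abs(poisoned.astype(int) - original.astype(int)).max() <= 2
```

At an `epsilon` of 2 out of 255 levels, the two images are indistinguishable to the eye; the point of a real poisoning tool is that such invisible changes, scraped at scale into a training set, can systematically skew what a model learns.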

Multiple pending lawsuits against companies such as ChatGPT-maker OpenAI, Meta, Stability AI, and Google allege that these firms illegally harvested copyrighted works such as novels and art from creators to build generative AI systems like chatbots or text-to-image generators without seeking the creators’ permission or paying them for use.

OpenAI has defended the use of copyrighted media for innovative purposes and scientific progress, claiming this is protected under the fair use doctrine.

Still, in the face of backlash, more tech firms are allowing people to opt out of such training regimes. However, Nightshade would go a step further by enabling artists to damage the data sets of AI companies which use their copyrighted pieces without permission.

Zhao’s team is also behind a similar tool called Glaze, which likewise protects artists’ work by confusing AI models trained on it.
