Now you can "poison" your images so they wreak havoc on AI generators

(Image: poison bottles. Credit: Getty Images)

AI image generators often exploit artists' work without consent or compensation. It's been an ongoing battle ever since the likes of DALL·E and Midjourney took off, but creators might have found an unlikely ally in Nightshade.

This new tool gives creators the power to add invisible alterations to their artwork that cause chaotic and unreliable results in any AI model trained on it. Developed by a team led by Ben Zhao at the University of Chicago, Nightshade is poised to tip the balance of power back in favor of artists and creators.

AI image generators have caused nothing but controversy since they burst onto the scene. Once upon a time, a photo was undeniably a photo – taken with a camera and composed by a human – but AI can now output such realistic-looking images that it's hard to differentiate between computer-generated and organically created content.

One of the big problems with AI image generation is that AI models are trained using images found on the internet – many of which have been created by artists who have not given their permission for their work to be used. Nightshade's main purpose, as reported by VentureBeat, is to tackle that very issue.

By intentionally "poisoning" the training data, Nightshade can disrupt future iterations of image-generating AI models, producing peculiar outcomes: dogs could become cats, cats could become mice, and mice could appear as men. OpenAI, Meta and Stability AI are just some of the companies that will be affected by this new tool, and some have already faced lawsuits from artists alleging copyright infringement for using intellectual property without permission.

Nightshade capitalizes on a security vulnerability in generative AI models, which often rely on vast amounts of data collected from the internet. The tool subtly manipulates the pixels of an image so that machine-learning models see something very different from the original, while the changes remain imperceptible to the human eye. This technique disrupts the models' understanding, causing them to interpret the images in erratic ways.
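
The article doesn't detail Nightshade's exact optimization, but the general idea of a small, bounded perturbation that shifts an image in a model's feature space can be sketched. The Python snippet below is a hypothetical illustration only: a pretrained ResNet-18 stands in for the generator's image encoder (an assumption; the real tool targets text-to-image models), and the hypothetical poison() helper nudges an image's features toward those of an unrelated target while keeping each pixel change within a tiny budget.

```python
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# A generic pretrained image encoder stands in for the generator's
# feature extractor; this is an assumption, not Nightshade's actual setup.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # keep penultimate-layer features
encoder.eval().to(device)

def load_image(path: str) -> torch.Tensor:
    img = Image.open(path).convert("RGB").resize((224, 224))
    return TF.to_tensor(img).unsqueeze(0).to(device)

def poison(src_path: str, target_path: str,
           eps: float = 8 / 255, steps: int = 200, lr: float = 0.01) -> torch.Tensor:
    """Nudge src toward target in feature space within a ±eps pixel budget."""
    x = load_image(src_path)
    with torch.no_grad():
        target_feat = encoder(load_image(target_path))
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Make the encoder "see" the target image inside the source image.
        loss = F.mse_loss(encoder(x + delta), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)                    # keep the change imperceptibly small
            delta.data = (x + delta).clamp(0, 1) - x   # keep pixels in the valid range
    return (x + delta).detach()

# Hypothetical usage: a dog photo that an encoder reads as closer to a cat.
# poisoned = poison("dog.png", "cat.png")
# TF.to_pil_image(poisoned.squeeze(0).cpu()).save("dog_poisoned.png")
```

The key design point this sketch shows is the perturbation budget: clamping every pixel change to ±eps is what keeps the edit invisible to humans even as the model's interpretation of the image drifts.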

The creators of Nightshade have committed to making their tool open source, which means users will be able to customize it and develop their own versions, helping to strengthen the tool. Because large AI models are trained on vast data sets, the more poisoned images that infiltrate the training data, the greater the disruption the technique can cause.

Once poisoned samples infiltrate an AI model's data set, they can cause lasting damage. Removing these corrupted samples is a laborious process, and it becomes even more complex as the poison influences not only the targeted word but also similar concepts and tangentially related images (for example, "dog", "woof" and "puppy" would all be affected). 

While Nightshade is a powerful tool for creators, there is concern that data poisoning techniques could be used maliciously. In practice, such attacks would require thousands of poisoned samples to inflict serious damage on powerful AI models, which are trained on billions of data samples. Researchers emphasize the importance of working on defenses against such attacks, but robust solutions are yet to emerge.

There are plans to integrate Nightshade with Glaze – another tool created by Zhao and his team, which masks an artist's personal style so that AI models learn an inaccurate representation of it. Together, these tools mark a notable advance in the protection of artists' rights, giving creators the means to safeguard their work from unauthorized use by AI companies.

The open-source nature of Nightshade promises to grow its impact and empower more creators to protect their artistic creations. But as this type of innovative technology evolves, so do the challenges of ensuring it is used responsibly and ethically.


Hannah Rooke
Freelance contributor

Having studied Journalism and Public Relations at the University of the West of England, Hannah developed a love for photography through a module on photojournalism. She specializes in portrait, fashion and lifestyle photography, but has more recently branched out into the world of stylized product photography. Hannah spent three years working at Wex Photo Video as a Senior Sales Assistant, using her experience and knowledge of cameras to help people buy the equipment that is right for them. With eight years' experience working with studio lighting, Hannah has run many successful workshops teaching people how to use different lighting setups.