Now you can "poison" your images so they wreak havoc on AI generators

Poison bottles
(Image credit: Getty Images)

AI-generated images often exploit artists' work without consent or compensation. It's been an ongoing battle ever since generators like DALL·E and Midjourney took off back in 2021, but creators might have found an unlikely ally in Nightshade.

This new tool gives creators the power to add invisible alterations to their artwork that will corrupt the training data of any AI model scraping it, producing chaotic and unreliable results. Developed by a team led by Ben Zhao at the University of Chicago, Nightshade is poised to tip the balance of power back in favor of artists and creators.
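To give a rough sense of the underlying idea, here's a toy sketch (this is not Nightshade's actual algorithm, which computes carefully targeted perturbations): a tiny shift in pixel values can be invisible to a human viewer yet still present in the data an AI model trains on.

```python
import numpy as np

def perturb(image: np.ndarray, epsilon: float = 2.0) -> np.ndarray:
    """Return a copy of `image` with each channel shifted by at most
    `epsilon` intensity levels -- far below what the eye can see.
    (Illustrative only: real poisoning uses targeted, not random, noise.)"""
    noise = np.random.uniform(-epsilon, epsilon, size=image.shape)
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

# Flat gray test image standing in for an artwork.
image = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = perturb(image)

# No pixel moved by more than a couple of levels out of 255,
# so the two images look identical to a person.
assert np.max(np.abs(poisoned.astype(int) - image.astype(int))) <= 2
```

The point of tools like Nightshade is that these imperceptible changes are not random noise but are engineered so that models trained on many such images learn the wrong associations.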


Hannah Rooke
Staff Writer

Having studied Journalism and Public Relations at the University of the West of England, Hannah developed a love for photography through a module on photojournalism. She specializes in portrait, fashion and lifestyle photography, and has more recently branched out into stylized product photography. For the last three years Hannah has worked at Wex Photo Video as a Senior Sales Assistant, using her experience and knowledge of cameras to help people buy the equipment that is right for them. With five years' experience working with studio lighting, Hannah has run many successful workshops teaching people how to use different lighting setups.