AI image generators have been causing a stir in the creative industry for some time now, and the central complaint is that these tools commit mass copyright infringement without consequence – at least up until now.
Jake Watson, an attorney from California, has weighed in on the legality of training AI image generators on copyrighted works in their data sets, and sheds some light on the recent Stability AI lawsuits.
• These are the best AI image generators in 2023
A lawsuit was filed by Getty Images earlier this month against Stability AI, the creator of the popular AI image generator Stable Diffusion / DreamStudio, alleging that the company infringed its intellectual property rights and generated copied works without consent.
A separate class action lawsuit has also been filed by a small group of artists against Stability AI, DeviantArt, and Midjourney, in relation to their use and incorporation of Stable Diffusion's AI.
Watson shared his insight into the courtroom complexities of the situation in a YouTube video, where he describes the "mindbogglingly massive" legal scale of the case. You can watch the video below:
Watson goes into a lot of detail during this 16-minute video. To break things down for those who don't have time to watch it all: the suit essentially claims that Stability AI web-scraped around 6 billion training images (240 terabytes' worth) without obtaining consent from either the copyright owners or the host website operators.
The interesting part is that Stability AI does not deny using these copyrighted images to train its AI, but argues that the output is completely new content and that the training images are not part of the distributed model. In simple terms, then, the company did commit copyright infringement according to Watson – but it has a strong argument to fall back on.
The plaintiffs argue that Stability AI created latent images from the 6 billion copyrighted training assets, and that through machine learning the AI can then use these latents to construct its own visual reconstruction – effectively a copy – of the original material.
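To make the idea of latent representations concrete, here is a toy Python sketch. This is emphatically not Stable Diffusion's actual architecture (which uses a diffusion model and a learned autoencoder); it is just a simple SVD compression, chosen to illustrate the plaintiffs' framing: the latent codes, and anything decoded from them, are derived entirely from the training images.

```python
# Toy illustration (NOT Stable Diffusion's real architecture): compress a set
# of "training images" into a low-dimensional latent space via truncated SVD,
# then decode them back to pixel space. Every reconstruction is a function of
# the training data alone.
import numpy as np

rng = np.random.default_rng(0)
images = rng.random((6, 64))         # 6 tiny "images", 64 pixels each

# Learn a latent basis from the training images themselves
U, S, Vt = np.linalg.svd(images, full_matrices=False)
k = 3                                # latent dimensionality
latents = U[:, :k] * S[:k]           # each image's latent code (6 x 3)

# Decode latents back to pixel space using the learned basis
reconstructions = latents @ Vt[:k]   # (6 x 64) approximate copies

error = np.linalg.norm(images - reconstructions) / np.linalg.norm(images)
print(f"latent dims per image: {k}, relative reconstruction error: {error:.3f}")
```

The point of the sketch is that the 64-pixel "images" survive, in compressed form, inside the 3-number latents – which is roughly the relationship the plaintiffs allege between the training set and the model.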
Watson suggests that the plaintiffs' strongest argument is that works generated by Stable Diffusion are actually derivative works of (based on or derived from) the original copyrighted training images, manipulated through conditioning and interpolation. The final AI image remains entirely reliant on the latent images, which are derived exclusively from the copyrighted training data.
There's something in law called the adaptation right, which gives the copyright owner the exclusive right to make, or to authorize the making of, derivative works from their original content. There is also, however, the concept of transformative work: if a use changes the nature of the original to such a degree that it serves an entirely new purpose of its own, it may no longer qualify as infringing. Parody is a good example of such 'fair use'.
The important distinction is that transformative works do not require permission or consent from the copyright owner, whereas derivative works do. Watson believes that Stability AI may have a good basis to argue fair use of the training images in court, based on similar cases in which courts ruled that highly transformative uses did not infringe copyright.
There's a strong chance that Getty and the group of artists may lose the fight against Stability AI but, as Watson explains, these lawsuits can only be a good thing: they bring the discussion into the open and should help improve the coexistence of technology, photography, and digital art.
• You may also be interested in the best noise reduction software, as well as our reviews of the Topaz Labs DeNoise AI and Topaz Labs Sharpen AI software, and the top 10 AI tools in Photoshop.