The Legal Battle Over AI-Generated Art
The AI-generated art movement has become one of the most debated battlegrounds in AI development. Most artists share RJ Palmer’s concern that AI is anti-artist and rips off their creativity. I believe in the power of AI as a creative collaborator and that we need to embrace its potential. However, I see that this is not a cut-and-dried problem. A lot of pain has come from generative art.
Qualms with AI Art
The root of the problem is that human-made art is the training data for computer-generated art. In effect, the same art used to train generative models risks being displaced by the models’ output. In other words, AI artistry is an apprentice trained to outshine its master.
Many artists have qualms with being raw data for machine learning, especially when their art is scraped from Google Images, Behance, Instagram, and other sites without any licenses or permissions. It’s disturbing to see the style they’ve worked so hard to define be mimicked by an AI. One of the most copied artists on the web is Greg Rutkowski.
According to the website Lexica, which tracks over 10 million images and prompts generated by Stable Diffusion, Rutkowski’s name has been used as a prompt around 93,000 times. Some of the world’s most famous artists, such as Michelangelo, Pablo Picasso, and Leonardo da Vinci, appeared in around 2,000 prompts each or fewer.
“It’s been just a month. What about in a year? I probably won’t be able to find my work out there because [the internet] will be flooded with AI art,” Rutkowski says. “That’s concerning.” – MIT Technology Review
Rutkowski’s fear of being drowned out by AI replicas is justified. One solution he proposed was that AI should exclude living artists from its database. Not a bad compromise.
But this doesn’t account for recently deceased artists, like Kim Jung Gi, who passed away in October of 2022. Jung Gi had a singularly mesmerizing creative process, drawing complex and elaborate murals from memory. Before the wounds had even begun to heal from his passing, AI was already replicating him.
Just days afterward, a former French game developer, known online as 5you, fed Jung Gi’s work into an AI model. He shared the model on Twitter as an homage to the artist, allowing any user to create Jung Gi-style art with a simple text prompt.
Far from a tribute, many saw the AI generator as a theft of Jung Gi’s body of work.
The response was pure disdain. “Kim Jung Gi left us less than [a week ago] and AI bros are already ‘replicating’ his style and demanding credit,” read one viral post from the comic-book writer Dave Scheidt on Twitter. – Rest of World
Anime, as a whole, has been a particular target for AI models. Radius5, a Japanese AI startup, launched an art-generation beta called Mimic aimed at anime-style creators. Artists could upload their own work and customize the AI to produce images in their own illustration style. Pretty quickly, they found that people were uploading other artists’ work instead.
So what’s the legal ground for AI-generated art that clearly mimics an artist’s work?
The Legal Situation
Artists feel at a loss. Copyright infringement takedown requests are too manual and laborious to keep up with the rate at which AI can generate art. And artists can’t go after the platforms to remove their work from the datasets, because many of these AI models are open source and their tech has already been replicated and copied far and wide.
These are murky waters, and no one really has the best course of legal action:
- Kris Kashtanova became the first person to receive a copyright for an AI-generated work.
- Getty Images has banned all AI-generated content for fear of copyright implications.
- Shutterstock has partnered with OpenAI to bring generative models onto their site, hoping to create a fair revenue stream for artists whose work is used in AI generation.
The challenge is that this is still an evolving legal landscape with very little precedent set, and each country may come to a separate conclusion. Regarding anime imitations such as Radius5’s Mimic in Japan:
Japanese law is ordinarily harsh on copyright violations. Even a user who simply retweets or reposts an image that violates copyright can be subject to legal prosecution. But with art generated by AI, legal issues only arise if the output is exactly the same, or very close to, the images on which the model is trained.
“If the images generated are identical … then publishing [those images] may infringe on copyright,” said Taichi Kakinuma, an AI-focused partner at the law firm Storia. That’s a risk with Mimic and similar generators built to imitate one artist. “Such [a result] could be generated if it is trained only with images of a particular author,” Kakinuma said.
But successful legal cases against AI firms are unlikely, said Kazuyasu Shiraishi, a partner at the Tokyo-headquartered law firm TMI Associates. – Rest of World
Regarding general generative art platforms:
In the UK, where Stability.AI is based, scraping images from the internet without the artist’s consent to train an AI tool could be a copyright infringement, says Gill Dennis, a lawyer at the firm Pinsent Masons. Copyrighted works can be used to train an AI under “fair use,” but only for noncommercial purposes.
The UK, which hopes to boost domestic AI development, wants to change laws to give AI developers greater access to copyrighted data. Under these changes, developers would be able to scrape works protected by copyright to train their AI systems for both commercial and noncommercial purposes.
While artists and other rights holders would not be able to opt out of this regime, they will be able to choose where they make their works available. The art community could end up moving into a pay-per-play or subscription model like the one used in the film and music industries.
“The risk, of course, is that rights holders simply refuse to make their works available, which would undermine the very reason for extending fair use in the AI development space in the first place,” says Dennis. – MIT Technology Review
Frankly, I don’t know that artists will get the legal reprieve or defenses they’re hoping for. And even if they do, it will likely come too late.