Imagine a world where the giants of the music industry, once fierce protectors of creative rights, are now cozying up to the very AI technologies that threaten to displace artists—it's a plot twist that's both thrilling and terrifying.
The entertainment landscape is witnessing a seismic shift as major players dive headfirst into the AI arena, and it's not the underdog story they've spun for us. Take Universal Music Group (UMG). Just last year, UMG joined fellow majors Warner Records and Sony Music Entertainment to file lawsuits against emerging AI music startups, accusing them of using copyrighted recordings without authorization to train text-to-music AI models. You can read the details in this Guardian article from June 25, 2024.
Fast-forward to last month, and UMG has flipped the script, inking a partnership with Udio, one of the very defendants in that lawsuit, to launch a licensed AI music creation platform. Their official press release paints a rosy picture, assuring us that they'll "do what's right" for UMG's artists. Yet skepticism runs deep. The Music Artists Coalition, a vocal advocacy group, fired back with a pointed statement: "We've seen this movie before—talk is cheap when it comes to 'partnerships,' and artists often end up with crumbs." This echoes a broader narrative of power imbalances in the industry.
This isn't an isolated incident; it's part of a flurry of legal battles raging through US courts. Creators, publishers, and studios are contending that feeding their works into AI training constitutes blatant copyright infringement. Judges are grappling with reconciling outdated laws with a tech revolution that challenges the very essence of who gets credit for creation. For instance, in the landmark Andersen v. Stability AI case—one of the earliest class-action suits against an AI image generator—artists argue that using their art without permission, credit, or payment tramples on the rights of countless creators, as outlined in this IPWatchdog document.
There's no denying that the AI surge is hitting creative workers hardest: generative AI is already eroding jobs and income in creative fields. A January 2024 survey by the Society of Authors revealed that over a third of illustrators reported income losses due to AI encroachment, while projections from a CISAC study estimate a 21% revenue drop for audiovisual creators by 2028. Those numbers underscore the real-world toll on livelihoods.
In response, a fresh wave of activism is uniting executives and artists against the tech titans through social media campaigns, crowdfunded lobbying efforts, and courtroom showdowns. The Human Artistry Campaign, a coalition built on the belief that "AI can never supplant human expression and artistry," brings together creatives and industry leaders to push for laws safeguarding artists from AI and corporate giants. But not everyone sees this as a straightforward fight between good and evil. Some artists, creators, and civil liberties advocates warn of a lurking threat: the dominance of big content itself.
What happens when well-meaning artists ally with massive media conglomerates that have a history of exploiting labor, as seen in Hollywood's wage battles detailed in this Guardian piece from September 2021? These same giants have pushed for aggressive copyright extensions, often at the expense of the public interest, as discussed in this Duke Law and ECIPE report. Some creators rationalize the alliance as a pragmatic "enemy of my enemy" strategy, but it could backfire if big content and big tech morph from rivals into collaborators.
Dave Hansen, a copyright expert and head of the Authors Alliance, warns in this YouTube video that copyright battles won't shield artists from AI's impact. Instead, they'll pave the way for exclusive deals between media behemoths and tech firms, sidelining everyone else. History backs this up: during the streaming boom, labels and studios reaped massive profits while musicians, writers, and actors got shortchanged, as reported in PBS NewsHour and Deadline. Could AI deals follow suit? Consider the partnership between Runway and Lionsgate. United Talent Agency's CEO, Jeremy Zimmer, voiced concerns: "If I'm an actor in a Lionsgate film, and that film trains an AI model, will I see any compensation?" In multimillion-dollar agreements between publishers and AI firms, authors have been left high and dry, without pay or opt-out options, as noted in articles from The Bookseller.
Even if courts mandate payments for AI training data, everyday artists might not see a dime. Under the current power dynamics, a licensing framework could pressure creators to forfeit their rights just to keep their jobs—voice actors are already encountering this, as covered in Vice. And it won't curb big tech; giants like Google and OpenAI can foot the bill, while smaller open-source innovators get priced out. Ironically, the fight to curb big tech via copyright might just strengthen their monopoly.
Many proposed "protections" for artists could actually harm them and society. Take the NO FAKES Act, backed by major entertainment groups. It aims to create a federal right against deepfakes, those AI clones of a person's voice or face made without consent. But civil liberties watchdogs like the Center for Democracy and Technology and the ACLU criticize its murky wording, feeble free speech safeguards, and potential for abuse, as in this CDT letter. The bill lets people, even children, license their digital replicas for up to 10 years (or five for minors), potentially letting studios coerce young talent into surrendering control over their own identities.
Why do these fixes flop? Because copyright suits, licensing schemes, and replica rights often serve as decoys for big content's agenda. The Copyright Alliance, a prominent group fighting for the "copyright community," pushes for strict AI copyright rules while claiming to champion individual creators. Yet its board is packed with execs from Paramount, NBC Universal, Disney, and Warner Bros.—hardly a grassroots outfit.
But why the grand spectacle of alliances if the industry could just quietly cash in on tech deals? Because big content relies on artists: their empires need creative labor for profits, their lobbying gains credibility from artist buy-in, and their AI partners crave access to that art.
This reveals the tactic that terrifies entertainment moguls more than AI itself: empowered artists challenging the system through organized labor. Unions like the Writers Guild and SAG-AFTRA have won real victories against AI via strikes and negotiations, securing protections as reported in The Guardian and SAG-AFTRA announcements. Copyright law is an outdated relic, ill-suited to the fragility of creative careers. If big content genuinely wanted to defend artists from AI, they'd stop peddling their work as training fodder and start amplifying their voices instead.
So is allying with big content a necessary evil in the fight against AI, or a recipe for even greater exploitation? Can copyright law evolve to protect creators, or is organized labor the real game-changer? The answer will shape who profits from the next era of creative work.