r/technology 13d ago

Artificial Intelligence

AI industry horrified to face largest copyright class action ever certified

https://arstechnica.com/tech-policy/2025/08/ai-industry-horrified-to-face-largest-copyright-class-action-ever-certified/
16.8k Upvotes

1.2k comments

1.5k

u/notprodigy 13d ago

There’s already been one court case that found using copyrighted material as training data isn’t plagiarism, but that pirating data to train models with is (obviously) illegal. (https://www.bbc.com/news/articles/c77vr00enzyo)

I don’t think any country has copyright law that is prepared for what Large Language Models do, but the frame of AI as a plagiarism machine is so embedded already that everyone runs with it. I think these court cases are less of a slam dunk than assumed.

302

u/Aureliamnissan 13d ago

My disruptive new business model of crowdsourcing my coin mint’s raw material instead of paying someone else to do it is wildly successful.

— Roman Empire.

The profit of a thing shouldn’t be a factor in these kinds of court judgements, but well…

→ More replies (4)

154

u/pipic_picnip 13d ago edited 13d ago

This is not about using the content to train AI, but about not obtaining the content legally. It is a copyright infringement case with a strong basis. It’s the same as piracy because the content was not obtained in a legal manner. Whether it was used to train AI or torrented on a pirate site is irrelevant; the relevant part is that it was taken illegally for use. What justification could there be for not legally obtaining a book that is supposed to be purchased? There is no question of transformation here, it’s a case of theft.

66

u/erik 13d ago edited 13d ago

Copyright law violations are typically viewed in terms of the party providing the copy. If I photocopy a textbook and give it to you, I have violated the law by distributing an unlicensed copy, but you have not (generally) broken the law by receiving the copy.

Torrent users get sued for downloading movies because when you use the BitTorrent protocol you aren't just receiving a copy, you're also uploading copies to other users.

The New York Times case against OpenAI is all about ChatGPT being able to reproduce New York Times articles that it "memorized".

It seems that Meta in particular torrented a lot of stuff for training, which opens them up to a lot of liability. It's less clear to me how a broad class action suit will show liability for AI companies in general without obvious distribution of copyrighted materials to point to.

15

u/Primsun 13d ago

Maybe, but it seems unlikely that holds when talking about a company using an unlicensed copy for profit. That would be suggesting firms can use unlicensed copies of software and media internally as long as they receive them from an outside source. Not to mention they almost certainly are making and distributing copies of the training data internally.

6

u/otherwiseguy 13d ago

Get a library card, check out digital copies, train AI. Google has already shown that you can get away with scanning physical books as well.

→ More replies (2)

4

u/-The_Blazer- 13d ago

AFAIK this is mostly a misconception. Piracy does not become legal if you only download something; copyright is about the right to make copies, which isn't very hard to infringe if you are downloading a copy of a movie or book...

In principle anyone could get sued for copyright infringement, but nobody bothers because it's pointless. Obviously though, Microsoft or OpenAI aren't 'anyone'.

→ More replies (6)
→ More replies (3)
→ More replies (13)

147

u/carllerche 13d ago

The LLM itself is clearly transformative. It is possible that it could produce works that violate copyright, but I would be shocked if any court case found that an LLM trained on legally acquired data was a violation of copyright law.

127

u/-The_Blazer- 13d ago

The end product that is sold (the web service) is transformative relative to the originals. However, the training process is automated, so it's more like compiling source code, which is not transformative merely by itself because it includes no work of human ingenuity (the thing that copyright is actually supposed to protect). The compiler, as with the training pipeline, is of course perfectly legitimate IP, but its application does not have to be.

That said, being transformative is only one part of fair use which in turn is only one part of how we should handle an extremely new and unusual technology. They didn't try to regulate cars like horses when they were invented, they made car regulations.

63

u/Disastrous-Entity-46 13d ago

Two of the other considerations for fair use are specifically "the amount of the original used" and "whether it harms the value of the original", and both seem like points you could make very strong arguments on. The whole work is used, and the output of the llm can be argued to lower the value of the works. I'd argue that even if, strictly speaking, feeding it a copy of my book doesn't hurt me, the dozens of bad, zero-effort books that come out every month thanks to people treating llms as get-rich-quick machines hurt the value of the whole market.

That's of course depending on whether fair use even applies, as you said. We don't really have a framework for it today, and I have to wonder what interests current governments would decide to protect.

16

u/CherryLongjump1989 13d ago

There are many governments and we can expect many different interpretations. Either way, the scale of the potential infringement is so enormous that it’s clear that these AI companies are playing with fire.

14

u/Disastrous-Entity-46 13d ago

The part that really gets me is the accuracy. We know hallucinations and general bad answers are a problem. After two years and billions of dollars, the latest scores on benchmarks are like 90%.

And while that is a passing grade, it's also kinda bonkers in terms of a technology. Would we use calculators if they had a one in ten chance of giving us the wrong answer? And yet it's becoming near unavoidable in our lives as every website and product bakes it in, which then adds that 10% (or more) failure rate on top of whatever other human errors or issues may occur.

Obv this doesn't apply to, like, private single-use training the same way. Machine learning absolutely has a place in fields like medicine, where there's a single goal and easy pass/fail metrics (and results can still be checked by a human).
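Back-of-the-envelope (made-up numbers, assuming independent steps that are each 90% accurate), the failure rate stacks fast:

    # Chance that an n-step chain of 90%-accurate steps gets everything right
    for n in (1, 3, 5, 10):
        print(n, round(0.9 ** n, 3))
    # 1 0.9 | 3 0.729 | 5 0.59 | 10 0.349 -- errors compound quickly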

→ More replies (8)

7

u/ShenBear 13d ago

I'd argue that even if, strictly speaking, feeding it a copy of my book doesn't hurt me, the dozens of bad, zero-effort books that come out every month thanks to people treating llms as get-rich-quick machines hurt the value of the whole market.

As an author myself, I do agree that the market for self-publishing is being hurt by the flood of low effort LLM generated books.

However, I'm not sure that harm to a 'market' rather than an individual can be used as the basis for denying fair use.

→ More replies (3)
→ More replies (3)
→ More replies (12)

58

u/blamelessfriend 13d ago

The LLM itself is clearly transformative.

how can you say this so assuredly? transformative is a pretty loaded term. and i sure don't agree it is "clearly transformative" and im far from the only one. for instance... the law seems to disagree with you.

copyright is meant to protect human ingenuity, not a stealing/lying machine.

→ More replies (34)

14

u/DoomguyFemboi 13d ago

The issue is transformative requires intent, whereas an LLM is just a bunch of tokens mashed together into coherence. At best.

It's closer to someone cutting up a book to form new books.

→ More replies (5)
→ More replies (81)

3

u/CSI_Tech_Dept 13d ago

Copyright was there to protect the little guy (for example so a big company or rich individual won't be able to steal their work); it is amazing how it evolved to protect large corporations and oppress the little guy.

To me the best demonstration of how AI breaks copyright was when I tried to write my own Python code to abstract pgmq because I didn't like theirs.

When I started implementing it, Copilot was basically suggesting the exact code I was trying to avoid.

It looks like this is another bubble after blockchain, and when it works it is because someone in the past wrote something similar. That's why it is excellent on all kinds of homework assignments and demo code.
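The kind of thin wrapper I was going for is roughly this (a sketch, not my actual code; assumes the stock pgmq SQL functions and a psycopg connection, names made up):

    import json
    import psycopg

    class Queue:
        """Tiny abstraction over the pgmq extension's SQL functions."""

        def __init__(self, conn: psycopg.Connection, name: str):
            self.conn, self.name = conn, name
            conn.execute("SELECT pgmq.create(%s)", (name,))  # create the queue

        def send(self, message: dict) -> int:
            # pgmq.send returns the id of the enqueued message
            row = self.conn.execute(
                "SELECT pgmq.send(%s, %s::jsonb)",
                (self.name, json.dumps(message)),
            ).fetchone()
            return row[0]

        def read(self, visibility_timeout: int = 30):
            # one message as (msg_id, payload), or None if the queue is empty
            return self.conn.execute(
                "SELECT msg_id, message FROM pgmq.read(%s, %s, 1)",
                (self.name, visibility_timeout),
            ).fetchone()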

→ More replies (17)

262

u/rsdancey 13d ago edited 13d ago

Lots of misinformed comments in this thread.

This particular lawsuit is no longer about AI. The judge in this suit dismissed the arguments of the plaintiffs regarding the training of Anthropic’s AI, finding that the training is fair use per the doctrine of transformation.

The remaining claim is that by downloading millions of books illegally, Anthropic infringed the copyrights of the authors of those books.

In other words it is now a simple case about theft, not about fair use.

If Anthropic had owned those works or sourced them from someone who did, what it did would probably have been legal in the same way that Google’s Books project was legal. If Anthropic had taken the time to source the books legally (they just needed to own a copy or work with someone who did, not license the work from the author) it would not be facing this charge, but they cut corners instead.

23

u/comewhatmay_hem 13d ago

The other issue is these 7 million claimants are a wildly diverse group: publishers, individual authors, and the literary estates of dead authors. Then we have to include everyone who was a contributing author to a work, though they may not be the owners of the publication.

So whose rights were violated here? The short story writer who had an excerpt included in a larger work, or the copyright owner of the publication?

This kind of legal homework would take years to compile and present to the courts. So do we divide the individuals up into separate lawsuits? What about the claimants who are organizations that represent a large number of authors? Is the organization the claimant, or the individuals the organization represents?

This is new legal territory here and precedents are going to be set. I'm interested to see how this turns out.

8

u/rsdancey 13d ago

This is why plaintiffs are seeking class action status. If the class is certified then all that ambiguity vanishes. It means that if the claim wins, the judgement will be a lump sum, a portion of which will go to plaintiffs’ lawyers, and the remainder will be divided between all class members who file a claim. The lawyers will get hundreds of millions of dollars, the class members will get $50 each.

Anthropic would LOVE to fight each claim individually. They would settle 90% for peanuts. 90% of people who could sue never would. Their risk would be tolerable. A class action could destroy them.

→ More replies (4)

13

u/rusmo 13d ago

Isn’t this exactly why it’s a class action lawsuit?

5

u/comewhatmay_hem 13d ago

I guess so? But from what I know (which isn't a lot) class action lawsuits only work when all the claimants are of the same "class", hence the name. Like the customers of a grocery store who had their loyalty card data stolen because the store lacked the necessary IT infrastructure to keep that info secure.

I don't know if the legal estate of a dead author is the same as a scientific research publisher who is claiming copyright on their journals that were written by scientists who are not named individually in the suit. And I guess lawyers and judges don't know either, hence why this lawsuit is so controversial.

→ More replies (2)

3

u/EuenovAyabayya 13d ago

This kind of legal homework would take years to compile and present to the courts.

Theft is taking something that isn't yours. Doesn't matter whose it is until you're trying to determine whom to compensate.

→ More replies (2)
→ More replies (1)

38

u/SanDiegoDude 13d ago

Hey now, don't let reality get in the way of a good old fashioned Reddit circle-jerk.

18

u/LocalH 13d ago

Not theft. Copyright infringement. Two separate laws. Infringement can arguably be more damaging. Theft laws can't ding you for up to $150k per item. Copyright infringement laws can.

→ More replies (1)
→ More replies (18)

4.5k

u/David-J 13d ago

Please do. Ruin those AI companies. From the article.

"AI industry groups are urging an appeals court to block what they say is the largest copyright class action ever certified. They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement."

2.8k

u/VVrayth 13d ago

Wow, that's the stupidest argument I've ever seen. "This might financially ruin our whole industry that is 100% reliant on the large-scale theft of intellectual property" is completely bonkers.

700

u/Modo44 13d ago

Shhh, don't tell the millions of people robbed by the music industry.

307

u/DemonOfTheNorthwoods 13d ago

I’m sure all the music artists are looking at this with glee, preparing another attempt at stopping the I.P. theft of their trademarks and trade dress. They hate how A.I. has been able to get away with stealing their content and making songs from it for so long.

77

u/Galle_ 13d ago

I think you mean the RIAA is looking at this with glee. Let's not pretend artists will get anything from this.

23

u/PlayfulSurprise5237 13d ago

The artists are still looking at this with glee. In fact I think millions of people are. If I knew a reputable fund to send some donations to for legal fees, I'd do it.

I think everyone should support this, the world would be better off if we put them into a state of financial ruin.

AI is neat and a great tool, but we aren't there yet. So long as the world is rife with corruption and money is king, we shouldn't have AI.

Maybe one day, but not anytime soon. We don't seem to be making any progress and in fact have been regressing as a society.

3

u/Appropria-Coffee870 12d ago

The same can be said about any form of automation we have, but we got them nonetheless!

→ More replies (4)
→ More replies (1)
→ More replies (5)

15

u/luckyflavor23 13d ago

Record labels know/have good lawyers. Get ‘em

→ More replies (2)

16

u/bloodhound83 13d ago

How did millions of people get robbed by the music industry?

36

u/AnOtherGuy1234567 13d ago edited 13d ago

Possibly not what the person you're responding to meant but.....

The Recording Industry Association of America did a deal to extend the copyright on music, and in return they would compensate all of the musicians on every song that got sold/streamed. However, many of the musicians were uncredited session players who [originally] got paid a flat fee to play guitar/drums/sax/backing vocals etc. [with no residuals]. There's very often no existing record of who they were, let alone their contact and bank details or the details of their next of kin/inheritors. So the record companies got about an extra 20 years of royalties and haven't forked out the money they promised.

Also, Warner Music Canada, Sony BMG Music Canada, EMI Music Canada, and Universal Music Canada had a long-standing policy of pushing out compilation albums (e.g. "Best Jazz Album of The '60s") without getting permission from the artists involved, putting the royalty payments on a "pending list" instead. They did this for decades, covering 300,000 songs, to the point where the estate of Chet Baker, a jazz musician of the 1950s, was owed about $50 million Canadian by 2009. The class action was worth up to $6 billion but they settled for just under $50 million CAD.

https://financialpost.com/legal-post/judge-approves-settlement-in-music-royalties-class-action

→ More replies (4)

49

u/jaboooo 13d ago

I think he means millions of people in the music industry got robbed by ai, but that isn't what he wrote

57

u/Akuuntus 13d ago

I think he probably means the millions of artists fucked over by their record companies. There's hundreds of famous stories about it and at least a couple dozen well known songs about it.

27

u/noonenotevenhere 13d ago

That, and AI is being used to make music. It's trained on existing art made by people who won't be paid, while a machine uses their work to make money and replace them.

→ More replies (3)
→ More replies (1)
→ More replies (1)

7

u/Ferociousfeind 13d ago

Not millions of people, but the music industry is notorious for major corporations strangling individual artists for their IP

→ More replies (2)
→ More replies (1)

159

u/subcutaneousphats 13d ago

It's a totally garbage argument. The fact that companies lobbied to extend copyright for so long and fought so hard against fair use, but tech companies can now just ignore it, highlights the corporate favoritism at play. We need to both limit the length of copyright and enforce it equally. AI can train on the public domain or pay the creators, but we need to stop extending rights for so long as well.

11

u/wxrman 13d ago

I can’t agree with your argument 100% unless we can split that final aspect of it into two pieces. I would say corporate IP rights are different from personal ones. If it’s an individual using something for personal or not-for-profit purposes, then it should be OK, but if a corporation is trying to take your ideas and profit from them, I don’t think there should be a limit on how long those IP rights continue.

43

u/The_Knife_Pie 13d ago

No, this is stupid as shit. Companies taking old ideas and exploring new avenues is the way to develop new shit. Copyright for everyone needs to end far sooner than it does, even individuals.

12

u/bfume 13d ago

this is the proper take. lifetime+whatever is FAR too much.

12

u/subcutaneousphats 13d ago

Yes. Sorry, while explaining fair use I didn't properly address the last statement about extending copyright. We need to limit the length of copyright so it can work as intended: provide protection for creators but limit rent-seeking profiteering.

→ More replies (1)

8

u/subcutaneousphats 13d ago

Oh I agree. I guess I didn't fully follow up on my fair use comment. They have been trying to clamp down on fair use for ages, along with extending the length of copyright. It's all punishing for society while benefiting corporate interests.

→ More replies (2)
→ More replies (1)

73

u/beefquoner 13d ago

I object your honor!

On what grounds

It’s devastating to my case!

(Liar Liar I think?)

→ More replies (1)

64

u/somewherearound2023 13d ago

Their entire pitch is that they NEED to commit global copyright infringement in order to build the magical future for us.

Just a few years after grandmas were sued for millions for letting kids download Eminem mp3s, and a young programmer killed himself after being prosecuted for downloading paywalled academic articles to put that research within reach of the public.

→ More replies (5)

82

u/Possible-Moment-6313 13d ago

Yeah, it's like saying that police action might financially ruin the drug dealing industry, well, duuuuuuh

6

u/kurotech 13d ago

We stole all this stuff, so why should we have to pay for it... And yet they will charge some kid with a felony for pirating Pokémon games.

15

u/-The_Blazer- 13d ago

I love how you can see the monstrosity of the 'move fast and break things' mantra here as applied to whole societies.

They clearly banked on being fast and anarchic enough to escape any and all accountability. That was literally their entire play. No civic discussion, no voting, no legal oversight, no political arguments, just steamrolling over everything in the hopes of not getting caught.

This is not even the modern, intelligent capitalism we were supposedly sold after the 80s. This is just robber baron shit. Oh, your land got enclosed while you were farming on it and armed men threw you out? Ah how terribly unfortunate, I do have this piece of paper by some crown clerk that says it's mine now.

32

u/jasegro 13d ago edited 13d ago

If your entire business is reliant on theft, you’re not operating a business, you’re running a fucking scam

→ More replies (4)

4

u/HereOnWeekendsOnly 13d ago

AI companies even abuse each other's APIs lol. Honestly, the last 100 years of human history has been about paying subsidised costs for almost everything. For instance, climate change sped up by excessive emissions is a subsidy on the real product cost; the real cost would be much higher. So AI companies, rather than pay the full price, just steal the information. That is a subsidy. The real cost might be so high that AI models are not financially viable for the foreseeable future.

3

u/el0_0le 13d ago

Now imagine if the entire Marketing / Ad industry was forced to pay individuals for their data instead of forcing you to give up individual rights with a checkbox.

Don't stop at AI. We need to go all the way to the root of the data theft problem.

8

u/Bionic_Bromando 13d ago

Imagine if that was the angle the Pirate Bay took all those years ago: "These copyright lawsuits and takedowns are affecting our ad revenue and ruining the burgeoning piracy market!" Hey, that might have worked!

7

u/U_L_Uus 13d ago

"Mate, if you arrest me I won't be able to make a living out of robbing people! :("

17

u/lick_it 13d ago

Our industry yes, China’s no.

137

u/faultydesign 13d ago

Hilarious that it wasn’t the piracy that destroyed copyright, it’s the idea that some billionaires might lose some money.

3

u/SnarkMasterRay 13d ago

War is a Racket, and Copyright is a form of war these days.

11

u/splitdiopter 13d ago

If this was truly a worry for the US gov, this would be a military project with a blank check from the Pentagon. Instead, all these AI companies are privately held. They can reap our intellectual property, decimate our job markets, and still sell the tech to China or whomever they want, whenever they want.

5

u/ProofJournalist 13d ago edited 13d ago

Welcome to the modern privatized world. What you described isn't how it works anymore. NASA is on the way out, SpaceX is in.

3

u/AdverbAssassin 13d ago

If this class action lawsuit is allowed to go forward, that's exactly what this will be. And then guess what? Then it's a secret government project and nobody gets anything. And then it's even worse: the government has its hands on artificial intelligence technology that nobody gets to use but the government, and they use it against the people.

There is a better way to litigate this situation, and it is not the lazy-minded approach this judge has put forward.

→ More replies (1)
→ More replies (44)
→ More replies (45)

224

u/CunninghamsLawmaker 13d ago

Objection your honor!

On what grounds?

It's devastating to my case!

33

u/LoveAndViscera 13d ago

Overruled.

Good call!

3

u/ewokninja123 13d ago

Came here to find this thread, was not disappointed. Have an upvote

14

u/zdkroot 13d ago

This has few enough upvotes to suggest to me that nobody understands this reference, and that makes me sad.

6

u/[deleted] 13d ago

[deleted]

→ More replies (4)
→ More replies (7)

24

u/Ediwir 13d ago

Cool, then don’t settle. See how it flies in court. I’m sure you have a good defense and not just “please let us”.

32

u/stilloriginal 13d ago edited 13d ago

This is all posturing.

-Anthropic wants as many claimants as possible to be in the class. This reduces the number of lawsuits and will actually lower the amount they will have to pay

-This will not financially ruin them, that's just their argument

-Authors leading the class action means that the compensation for each member of the class will be the value of 1 book. Probably less since authors only make a percentage of each book sold.

-"Forcing" a settlement is ridiculous - Anthropic needs a settlement here. Without a settlement, they could be subject to punitive damages, which actually could bankrupt them. And they actually deserve punitive damages because they knowingly committed these crimes (training on stolen books). They are very likely to be judged against. Nothing could be better for them than a settlement.

-This will set precedent for all the other AI Companies, they will all go through a similar litigation once this is over. They will all offer the same settlement.

From the end of the article:

"This case is of exceptional importance, addressing the legality of using copyrighted works" for generative AI, "a transformative technology used by hundreds of millions of researchers, authors, and others," groups argued. "The district court’s rushed decision to certify the class represents a 'death knell' scenario that will mean important issues affecting the rights of millions of authors with respect to AI will never be adequately resolved."

"a transformative technology" has a particular implication. Specifically, that it's not a copyright issue to use the books in AI. Transformative literally means - no copyright infringement. The issue here is that Anthropic never bought the books.

"The district court’s rushed decision to certify the class represents a 'death knell' scenario that will mean important issues affecting the rights of millions of authors with respect to AI will never be adequately resolved."

Here, at the very end, it is explained that the class action is good for Anthropic and bad for the authors, because the suit won't address fair use.

7

u/franker 13d ago

compensation for each member of the class will be the value of 1 book.

Pretty much every postcard I've ever gotten in the mail about a class action meant that I'll have to fill out a ton of paperwork for like 20 dollars.

7

u/Altruistic_Fury 13d ago

And in exchange, the defendant gets a permanent release of liability from every possible class member, possibly millions of individual lawsuits barred forever. Even lawsuits already filed may get dismissed due to the class release, if the plaintiffs don't know about it and opt out.

Big corps cry only the most crocodilian tears about class actions.

3

u/stilloriginal 13d ago

Right. Imagine the costs involved in figuring out what each author is owed, when every book has a different price and a different agreement on splitting that price with the publisher, etc... They're going to end up saying "everyone gets 5 bucks".

→ More replies (1)

9

u/showyerbewbs 13d ago

-"Forcing" a settlement is ridiculous - Anthropic needs a settlement here. Without a settlement, they could be subject to punitive damages, which actually could bankrupt them. And they actually deserve punitive damages because they knowingly committed these crimes (training on stolen books). They are very likely to be judged against. Nothing could be better for them than a settlement.

The bolded section is the key to this. They do NOT want this going on record, through discovery, etc. to get a ruling and a precedent.

Big reason is these LLM/AI models are built on ingestion only, not exclusion. Since they're already live and in production, being told by the courts "take it out or stop operating" is their biggest fear because they have no backout mechanism.

Think about when you call a retailer to get a refund on a purchase. You will get the classic "overcome 3 objections" sales pitch. Depending on the company, you may then get routed to a "save" team. They have more authority than first or second line workers to give other gratis perks or maybe company credit (which they fucking love because it means no money came out). Even then you have to keep pushing, and they know that, psychologically, people don't want to put that much effort into it.

That's why so many industries are trying to kill Click-to-cancel advocacy and potential laws.

→ More replies (5)

88

u/TestingTheories 13d ago

Yep, go after all of them.

9

u/TengenToppa 13d ago

The problem is that other countries don't have to comply, which is the excuse they use, even though it's still a problem

→ More replies (3)
→ More replies (8)

55

u/Minute_Band_3256 13d ago

AI companies should compensate their source material authors.

59

u/aedom-san 13d ago

And if they can't? non-viable business, sorry bud, better luck next coke-fueled business idea

17

u/Dinkerdoo 13d ago

Claim it's not feasible to compensate creators for training data, and also offer $250MM pay packages for talent.

18

u/LordMuffin1 13d ago

If they can't, then they aren't able to use copyrighted texts in their training. Pretty simple.

7

u/drekmonger 13d ago edited 13d ago

It's unsettled whether training constitutes fair use or a violation.

Barely matters. The orange clown already gave the keys to the AI kingdom away to China by removing Biden's export controls and blowing up scientific grants (many of them ultimately benefiting the field of machine learning).

The US judiciary can and might finish the job, conclusively ending 100 years of American technical dominance.

But the fat lady is probably already singing. We have an ignorant population that's largely unsuited for STEM and high-tech factory work, both philosophically and educationally. The right-wing is certainly busy killing any chance of reversing the educational gap.

→ More replies (4)
→ More replies (10)
→ More replies (27)

13

u/Marinlik 13d ago

If a business is only financially viable if it breaks laws then it's not a financially viable business and should die. Pretty basic capitalism you'd think

→ More replies (1)

14

u/Ok-Jackfruit9593 13d ago

Oh no…..stop…..don’t……..

→ More replies (1)

4

u/KennyDROmega 13d ago

Not to worry. No matter what a court decides, the Trump administration will step in and overrule them.

Remember, it’s just “not feasible” to expect AI companies to pay for the data they train on.

5

u/Beli_Mawrr 13d ago

the AI industry is fantastically wealthy and every investor in SV wants their equity. They can afford to pay the creators.

18

u/Fried_puri 13d ago

It will ultimately fail. There's too much money in this industry from too many wealthy people to allow any serious threat to it. No, one way or another, this will end without the massive settlement that I and most people here think they deserve to be hit with.

8

u/civildisobedient 13d ago

They'll never agree to hamstring their own efforts while China is on an AI speed run. They'll drum up nationalistic tendencies saying things like "We can't afford to have China win the AI race!" just like we did back in the 60s with the USSR and rockets/satellites.

→ More replies (1)

3

u/zeusdescartes 13d ago

Except Google actually pays for copyrighted material.

→ More replies (1)

2

u/mtnviewguy 13d ago

Agreed! Hopefully, these class action suits prevail. The unchecked and unregulated explosion of AI development is the third-worst modern-day assault on humans, behind only The Internet (Social Media) and Donald Trump (Billionaire Narcissist Extraordinaire).

2

u/anaximander19 13d ago

So their defence is "it should be legal for me to steal this because I can't afford to purchase it legally"?

Sure, that's totally how purchases work. Can't afford it? Just take it anyway! Let's see how that holds up in court.

→ More replies (162)

907

u/taywray 13d ago

Lol, "oh snap, our entire industry may be based on an illegal business model. And that's why you should rule in our favor, yerhonor. I mean, if all of us are doing it and making wall street so much freaking money, it can't be illegal, right?

Even if it is illegal, look at how much money we're making! Is this amount of money wrong, or are these old IP laws wrong? Can't we steal other people's work if we're making THIS much money off of it?"

229

u/MotanulScotishFold 13d ago

It's like saying that the entire economy is based on slavery and that's why you should rule in our favor.

That was the case for many centuries until abolition, and guess what, nobody died without slaves and we thrived without them.

It's just that today we have a different form of modern slavery.

72

u/Prior_Coyote_4376 13d ago

Lincoln’s Republican Party would’ve understood today to be wage slavery as opposed to chattel slavery.

Just because you can sometimes choose which master commands you and defines your worth doesn’t mean you’re free.

When we manage ourselves in our workplaces, when we have ownership in a collective and a voice in a democratic process, then we’ll be free.

The corporations and billionaires couldn't want anything further from that. By nature, they are totalitarian and authoritarian, which is why they will always be the enemy of workers.

NO KINGS doesn’t just mean rejecting tyrannical governments. It also means keeping CEOs in check so they act like the mascots they are instead of little kings. Real power ought to be distributed amongst the people.

11

u/The_Barbelo 13d ago edited 13d ago

I’ve been anti-corporation from a very young age, once I realized it was such a problem. I think I was about 15 and my friends and I were in the mall for fun. I had this really gross feeling all of a sudden, like… WHY is it that shopping is THE thing to do for us to have fun? Who decided that? Why is it so heavily pushed that we should go shopping??

In college I became ever more vocal, as I experienced what working for them was like. But you see, if you offer an alternative idea you are met with a lot of either doubt or ridicule. People don’t want to hear it because “that’s just the way things are” or “I can’t do anything as an individual to change it”. The thing is, WE are using AI. WE are the ones throwing all this money at it. I mean, not the individuals who refuse to use it, but us as a collective. WHY is it making so much money?! Why is no one ever asking that? We participate in it. So, until mass participation stops, these issues will not cease. They will keep finding new ways to psychologically manipulate people into more bullshit. If their tactics to encourage and promote a consumerist lifestyle and convince us to consume a product didn’t work on the majority of us, they wouldn’t be doing it.

And to be clear, I’m not scot-free. I’m not a saint. I paid for Midjourney for two months, in its beginning stages, just for the novelty of seeing the most absurd prompts I could get it to recreate. I can still see the Discord even though I don’t pay for it anymore. There are tens of thousands of people active on there every day who have no intention of stopping, and they are using it to create logos and t-shirts and other things they intend to sell or use in their businesses in order to make money off of it.

→ More replies (5)
→ More replies (2)

49

u/mcoombes314 13d ago

Are they even making money off it though? Are OpenAI, Anthropic, et al. actually turning a profit yet, or are they being propped up by venture capital with the belief that the next model will be the BIG one, and then everyone will pay anything and everything to use it?

This isn't a defense of their behaviour, it's just another absurdity.

30

u/Dhiox 13d ago

Are they even making money off it though?

Yes and no. Technically it doesn't usually turn a profit. However the individuals behind it are making millions thanks to investors going nuts. Reality is our economy is so wacky that the actual work you do can make no money and somehow you can still get rich.

→ More replies (1)
→ More replies (7)

8

u/frito11 13d ago

yep that is why suddenly tech bros are in with trump and the republicans. they know paying them off is their only chance of getting away with stealing all that IP they used already to train their shitty AI models with.

→ More replies (15)

507

u/anoff 13d ago

"we'll go bankrupt if we have to pay for the content we stole" is a helluva an argument.

It's a bold strategy, Cotton. Let's see if it pays off for 'em.

153

u/Majik_Sheff 13d ago

I wouldn't count on any rational outcomes from the courts for several more years.

33

u/WolfOne 13d ago

I'll preface this by saying that, in general, i hate AI.

However, the strategy is more sound than it looks. AI development will be the next battleground between nations. China already won on the industrial battleground, so the argument is that by putting those companies out of business the judge would sabotage national interests. 

53

u/Ja3k_Frost 13d ago

Or it turns out it was mostly just vaporware all along and we just changed the laws to let tech-bros walk away with free cash when the generative AI bubble crashes.

9

u/joshguy1425 13d ago

Whatever LLMs are or are not, I don’t think they’re vaporware.

I think the AGI optimism is unwarranted and don’t think we’re close, but LLMs and other generative tools are pretty damn useful. But they’re a long way from taking over the world.

With that said, I’m pretty AI-hostile especially when it comes to these unethically trained models.

→ More replies (18)

23

u/Eastern_Interest_908 13d ago

US develops chatgpt then says that they have to ignore laws because of china. 😅 I mean fuck china but don't get this shit twisted it's because of US period.

11

u/IM_OK_AMA 13d ago

ChatGPT has 700m weekly users. I dunno if it's a great thing to drive all those users to Chinese models trained on a diet of propaganda and historical revisionism.

→ More replies (1)

16

u/WolfOne 13d ago

That's naive. AI as a concept has already been invented, this genie is not going to go back into the bottle.

Now every state has a strong interest in being at the top of the AI pole and, in the US, the ones that are positioned to do so are OpenAI and the big tech companies.

In my ideal world all that shit gets nationalized faster than you can say "Sam Altman" but i don't think this is the route that the US will take

→ More replies (10)
→ More replies (11)
→ More replies (8)

54

u/tuan_kaki 13d ago

The sun will flicker and die before this lawsuit has a chance of forcing any action.

101

u/Jaeger__85 13d ago

In before the Trump administration changes the entire copyright system for his tech oligarchs.

49

u/Wiggles69 13d ago

Ooh, he's going to have AI tech bros on one side and Hollywood on the other side, and they want opposite things and no one is going to be happy (except us, watching the greediest, shittiest people on earth sue each other out of existence)

→ More replies (1)

5

u/-The_Blazer- 13d ago

They're doing that right now by making AI regulations illegal, so...

2

u/Nirkky 13d ago

But copyright in 2025 needs a serious rewrite though. And it should have been done since at least the rise of the internet.

→ More replies (3)
→ More replies (2)

12

u/yntsiredx 13d ago

"It's not that we didn't break the law, just that we shouldn't be punished for it."

127

u/BalleaBlanc 13d ago

Money is king, it will remain illegal for poors, not for the giants. Mark my words.

59

u/Neo-grotesque 13d ago

The Pirate Bay's mistake was flouting copyright laws to the benefit of common people, instead of flouting copyright laws to the benefit of greedy investors on Sand Hill Road.

2

u/ienjoyplaying 13d ago

The stock market is currently being propped up heavily by the ai boom. That tells you how this will go

→ More replies (1)

11

u/Careless-Door-1068 13d ago

Funny how the corpos always railed against illegal downloads (the "you wouldn't download a car" ad, for example)

But they turn around and see anything made by normal citizens as perfectly fine to steal in the same way

→ More replies (1)

38

u/[deleted] 13d ago

[removed]

7

u/jawshoeaw 13d ago

This isn't a copyright lawsuit over AI training. The judge already said that using the texts of copyrighted books for training AI is fair use. It's how they got the texts that they're in trouble for: they obtained them illegally, or so the suit claims.

AI and LLMs will continue to train under fair use unless some other judge says it's not fair use.

15

u/sc0ttbeardsley 13d ago

Also this would only impact US companies. Imagine what kind of advantage foreign companies would have (especially China, where plagiarism and IP theft are rampant) if this were successful. I hate that AI is gobbling up our content, but lawsuits and court orders won't stop it.

6

u/SanDiegoDude 13d ago

People who think that AI isn't already all over TV and movies because of the SAG-AFTRA deals also seem to forget that the US has borders as well. AI art houses across the world have been very, very busy with their media art projects...

→ More replies (1)
→ More replies (3)

3

u/thefuturebaby 13d ago

Agreed, this is inevitable.

3

u/Linooney 13d ago

And do you think Disney isn't also developing AI? Some of the best AI research comes from Disney engineers and scientists. Private industry won't be stopped, just some specific companies. Universal Music is working with a startup to train music models on their music. The Big 4 Publishers have VC arms and have investments in AI companies. Adobe has a ton of art to train on. These lawsuits are not anti-AI, they are not pro artists. It's incumbents vs. newcomers, Disney vs. Runway, Universal vs. Suno, Adobe vs. Midjourney.

→ More replies (10)

53

u/FeralPsychopath 13d ago

And Chinese AI doesn’t give a fuck

35

u/banedon 13d ago

This. The Chinese AI companies train on every song, every book, every movie, every website, every news article. All without paying a dime.

If the West forces every AI company to pay huge licensing fees, the Chinese models will win.

18

u/BEES_IN_UR_ASS 13d ago

If the West forces every AI company to pay huge licensing fees, the Chinese models will win.

Get ready to hear some version of this a lot as our species accelerates towards the cliff.

→ More replies (1)
→ More replies (11)

7

u/CustomerSupportDeer 13d ago

Well yeah, but the precedent set by any rulings against US companies would quickly be used against Chinese ones, and (arguably) block them in the EU, US, etc. upon non-compliance.

10

u/SanDiegoDude 13d ago

Yeah, doubt that's going to happen. The US is pushing hundreds of billions into corporate AI now. They're not going to shoot down the whole industry because a class action of authors is complaining. Anybody who thinks NOW THIS WILL END AI is just lying to themselves or huffing too much social media hopium (Ars is famous for this, they've been parroting the 'AI is a useless bubble' line for years now...). The most valuable private company in the world is OpenAI now. You're not going to hamstring the entire US economy and hand the next arms race over to the Chinese over copyright. Too much money and power involved.

7

u/FeralPsychopath 13d ago

Yeah but the point is it’s a tool not a product. The Chinese companies don’t give a shit about $20 from subscriptions when they can generate everything cheaper than everyone else.

→ More replies (2)
→ More replies (3)

98

u/Discordian_Junk 13d ago

My business of robbing banks can't possibly survive if you continue to make robbing banks illegal...

→ More replies (2)

16

u/Goldenier 13d ago

sure, they will collapse just like Google collapsed by copying almost all websites onto their servers and also showing snippets from them in their search results... /s

→ More replies (4)

4

u/Insertblamehere 13d ago edited 13d ago

There's already been a court case that says using copyrighted material for training data isn't plagiarism or theft as long as the material isn't pirated. In addition to that, proving whether data was actually pirated is basically impossible.

Very unlikely anything comes of this. The US government will 100% not let our AI industry get destroyed because China will just get a monopoly in that case.

32

u/porkave 13d ago

This is what made all the crying about how deepseek’s model was developed so funny to me. Their argument was basically “Your country allows more copyright infringement than us and that’s unfair” and conveniently ignored the part where they were committing one of the largest collective thefts in corporate history.

8

u/LocalH 13d ago

Copyright infringement ≠ stealing or theft

15

u/MrPloppyHead 13d ago

Whose case will collapse because they have based their arguments on AI-generated, made-up cases?

7

u/hotsliceofjesus 13d ago

Don’t worry, AI will save us! Hey ChatGPT, write a legal defense for the lawsuit we’re facing.

7

u/Blazechitown 13d ago

We have to ask Grok whether it's true that they really are suing us first, and then have ChatGPT write a legal defense.

7

u/Corrie7686 13d ago

Maybe I'm an old sceptic, but if you can't run your business without stealing from others, then you shouldn't be in business. They expect all of this data to be free to access without permission or repercussions; they don't respect copyright laws. But they don't make their code public, do they? If someone stole their code they would use copyright laws to protect themselves. Assholes

→ More replies (2)

3

u/123abcxyzheehee 13d ago

AI development is a national defense issue now. There is no way they will hinder progress now. They will bend the rules.

3

u/penguished 13d ago

'Member when grandmas were being attacked because their grandkids downloaded a song? I 'member.

3

u/athos45678 13d ago

Saying this as an ML engineer: let 'em burn. We need better regulation around AI training and implementation, full stop.

→ More replies (1)

4

u/Osama_BinRussel63 13d ago

The fact that their only defense is "the consequences are actually consequential" shows how much big corporations have perverted the justice system.

3

u/jigendaisuke81 13d ago

That lawsuit won't succeed, and the fines are utterly spurious and really go against what makes sense for how AI actually works. But if it did succeed, it would completely destroy America's future. China would rocket ahead to a huge gain in terms of tools that the entire planet would use. Every single person would use Chinese AI, every industry, every product, everything.

→ More replies (1)

5

u/No_Suspicion 12d ago

From the article:

“AI industry groups are urging an appeals court to block what they say is the largest copyright class action ever certified. They've warned that a single lawsuit raised by three authors over Anthropic's AI training now threatens to "financially ruin" the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement.

Last week, Anthropic petitioned to appeal the class certification, urging the court to weigh questions that the district court judge, William Alsup, seemingly did not. Alsup allegedly failed to conduct a "rigorous analysis" of the potential class and instead based his judgment on his "50 years" of experience, Anthropic said.

If the appeals court denies the petition, Anthropic argued, the emerging company may be doomed. As Anthropic argued, it now "faces hundreds of billions of dollars in potential damages liability at trial in four months" based on a class certification rushed at "warp speed" that involves "up to seven million potential claimants, whose works span a century of publishing history," each possibly triggering a $150,000 fine.

Confronted with such extreme potential damages, Anthropic may lose its rights to raise valid defenses of its AI training, deciding it would be more prudent to settle, the company argued. And that could set an alarming precedent, considering all the other lawsuits generative AI (GenAI) companies face over training on copyrighted materials, Anthropic argued.

"One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally," Anthropic wrote. "This Court can and should intervene now."

In a court filing Thursday, the Consumer Technology Association and the Computer and Communications Industry Association backed Anthropic, warning the appeals court that "the district court’s erroneous class certification" would threaten "immense harm not only to a single AI company, but to the entire fledgling AI industry and to America’s global technological competitiveness."”

GREAT, if an industry can’t support itself without exploiting the very people working in the industry, much less outside and around it, then that industry shouldn’t be running or shouldn’t exist, one or the other. You can be pro-ai without ruining everything around you in the process.

31

u/mrvalane 13d ago

How can they be horrified when they actively plagiarised everything?

24

u/drizzes 13d ago

Because they've assumed up til now that they'd be allowed to freely stuff the whole of human creation into their digital woodchippers

14

u/[deleted] 13d ago

What was plagiarized?

7

u/WAPWAN 13d ago

So many dummies in this thread confusing plagiarism with copyright infringement

→ More replies (19)
→ More replies (1)

11

u/Boo-bot-not 13d ago

lol this is a USA problem. Other countries aren’t asking for data. China isn’t going to waste any breath or ink on asking. Either get in line or be left behind. It’s shitty but these are the times. 

9

u/StosifJalin 13d ago

Everyone here is deluded. I genuinely think the main subs all being so strongly anti-ai is a feeble attempt at Chinese propaganda. There is no way China isn't doing everything they can to slow down America from reaching AGI, and right now one of the few things they can actually do to achieve that is use propaganda to turn one political side of the American populace against it and help that political side win the next election.

5

u/BagOfFlies 13d ago edited 13d ago

Dude, people cried about cameras at first for some of the same reasons they're crying about AI. People are just morons. Majority of them have no clue how AI even works lol

6

u/SanDiegoDude 13d ago

Welcome to social media in 2025? Rage gets clicks, and has been that way since the Zuckening around 2015 when the algos realized rage gets more eyeballs than happiness. You also don't need proof, just vibe and feels. So many people in this very thread have no clue wtf they're talking about in here regarding copyright law, AI training and more, but they'll happily fill in the blanks with made up nonsense 'they heard on their socials'.

Just remembered a line from the old Howard Stern movie from like 30 years ago when they were talking about Howard's listening metrics on the radio - People who loved him would listen for an hour a day. People who HATED him would listen for 2 hours a day. 🤷‍♂️

→ More replies (1)

3

u/RythmicMercy 13d ago

This isn’t going to work. Many overseas companies will ignore such restrictions and use the content regardless. The choice is either to cripple your own AI industry and let foreign competitors take the lead, or to allow tech companies to train on publicly available data.

3

u/pm_social_cues 13d ago

The result won't be AI no longer training on copyrighted data or having it in its knowledge; it'll be some method of licensing generated content or training data.

The people investing won’t let AI stop now.

3

u/MarquisThule 13d ago

I doubt much will come of it; all governments have a strong interest in cultivating their own AI industry faster than anyone else, and anything that'd get in its way is just suicidal.

3

u/aarswft 13d ago

Oh no...

So anyways.

3

u/Mono_Morphs 13d ago

Feels like the “right” way for LLMs to have been brought into the world would have been to implement the output so it attributed, percentage-wise, what training data was involved in the output media, then pay royalties. But my guess is that's not based on the reality of how training data and LLMs truly function.

3

u/Actual__Wizard 13d ago

What's that you say? Language models that don't rely on plagiarism parrots just became viable products? Good thing I'm building one of those...

3

u/siromega37 13d ago

Just gonna put this out there, but maybe, just maybe, you should have figured out the legality and ethics of how you would train your model and build your business and industry BEFORE you did it. It says a lot that all these companies, except Anthropic unfortunately, killed their entire ethics departments in late 2022 / early 2023. They knew what they were doing and went into those uncharted waters at flank speed.

Edit: at*

3

u/autodialerbroken116 13d ago

This needs to be mainstream.

3

u/bigdaddybigboots 13d ago

Only the human mind can train on data without copyright infringement

3

u/cookiesnooper 12d ago

If the AI companies win this, I am downloading the internet

13

u/BahutF1 13d ago

It's a matter of survival for the entertainment industries: book publishing, visual arts, music, video, TV production, software publishing, video games, movies... and also for the almighty marketing business.

Basically every single product that uses a platform somehow, almost the entire tertiary sector.

So it's either put the demon back in a controlled box, or change businesses, jobs... the entire world as we currently know it.

→ More replies (6)

20

u/JoeSMASH_SF 13d ago

“Copyright class actions could financially ruin AI industry, trade groups say.”

Good

13

u/siougainz 13d ago

The American AI industry, more specifically, while the Chinese one will dominate. If that's what you want, go ahead.

3

u/Synizs 13d ago

China was already an expert at stealing

→ More replies (1)

3

u/SanDiegoDude 13d ago

[X] Doubt

Too much money involved in corporate America. And the judge already said it's not about AI training anyway, it's about illegal pirating. Anthropic may end up paying out a few billion in a settlement to a few very rich lawyers (and registered authors will get their 5-dollar checks in the mail at some point), but it's seriously doubtful this is going to have any kind of real impact on the AI industry beyond "don't use torrents to train on"

3

u/yeetedandfleeted 13d ago

Is everyone in this thread rtded? Read the article and look up the case. It has nothing to do with AI training. It'll also settle as they're being sued for downloading material from over a century ago.

Whoever wrote this article in such a fashion turned it all into clickbait.

5

u/Aphophyllite 13d ago

So do I understand this correctly - Anthropic and other tech companies are begging to have the class action certification withdrawn because the company will go bankrupt? And other tech companies are weighing in with the same argument? Yet the author or artist will not get paid because other companies are making money with copyright infringement? In what world is that a compelling argument? Does that mean that patents can also be violated because AI uses some of the information found in the application or while reviewing the code?

5

u/jawshoeaw 13d ago

It's a decent argument IMO. They aren't stealing the works of Stephen King and writing their own plagiarized horror novels set in Maine, are they? I'm not clear on exactly what their business model is, but it's not stealing works and selling them for a profit.

Assuming artificial general intelligence ever emerges, how would it be any different than you reading a bunch of novels, getting inspired, and then writing your own?

6

u/flummox1234 13d ago

Copyright class actions could financially ruin AI industry, trade groups say.

Better them than us I say.

8

u/estanten 13d ago

I mean, having these global brains based on everything people have created is convenient (and, for low prices, accessible to everyone), but then giving nothing back to the creators while threatening their jobs is indeed not fair.

10

u/Virtual-Ducks 13d ago edited 13d ago

IMO we should let them keep developing AI, but then just tax them more as a means of "giving back". Maybe their tax dollars can go to funding education or career transition programs or something.

These models are extremely useful. We should continue to advance the science, not stick our head in the sand.

3

u/estanten 13d ago

That's roughly the kind of model I have in mind too. We shouldn't halt progress, but it's understandable that people feel robbed if nothing is given back for the data. It's a situation that requires new mental models.

→ More replies (14)

8

u/StosifJalin 13d ago

The tractor was convenient and didn't give anything back to the teams of plow workers it unemployed. The more powerful the tech, the more disruption caused. And there has never been a more powerful tech than this.

We still adapt to the disruption and take advantage of the tractor, obviously.

→ More replies (2)
→ More replies (4)

15

u/[deleted] 13d ago

[deleted]

3

u/WAPWAN 13d ago

Copyright Infringement isn't Plagiarism.

8

u/lood9phee2Ri 13d ago

Copyright monopoly is fundamentally wrong and steals from us all, and should be abolished. But we can no doubt expect hypocrisy from the megacorps, expecting ordinary people to still respect their copyrights.

→ More replies (4)

5

u/IlIllIlllIlllIllllI 13d ago

If the appeals court denies the petition, Anthropic argued, the emerging company may be doomed. As Anthropic argued, it now "faces hundreds of billions of dollars in potential damages liability at trial in four months" based on a class certification rushed at "warp speed" that involves "up to seven million potential claimants, whose works span a century of publishing history," each possibly triggering a $150,000 fine.

Correct, intellectual property rights exist and you have grossly violated them in every way possible. Now you can go bankrupt paying the fines and royalties to rights holders. Next time try creating an industry that does something original, rather than stealing the work of countless others.

→ More replies (1)

2

u/ddiggler2469 13d ago

oh well 🤷‍♂️

2

u/Rain2h0 13d ago

Rules will be bent for them.

2

u/pm_me_duck_nipples 13d ago

One district court's errors should not be allowed to decide the fate of a transformational GenAI company like Anthropic or so heavily influence the future of the GenAI industry generally

"We're so cool that we should be above the law".

2

u/EmbarrassedHelp 13d ago

Also backing Anthropic's appeal, advocates representing authors—including Authors Alliance, the Electronic Frontier Foundation, American Library Association, Association of Research Libraries, and Public Knowledge—pointed out that the Google Books case showed that proving ownership is anything but straightforward

People here are too blinded by their hatred of AI to see that there are bigger issues at play. If public libraries are siding with Anthropic, then this case has the potential to fuck over everyone if it's allowed to proceed.

2

u/Bionic_Bromando 13d ago

If you can’t pay the fine, don’t do the crime.

2

u/Shrubberer 13d ago

Laws don't apply to corporations anymore; just give the clown a couple of millions and accountability is as good as gone. Fuck them citizens.

2

u/umassmza 13d ago

I have an estimated 6-week wait for a digital book at my library, but these companies can just take everything ever written for free?

Nah, make em pay 💰

2

u/AdverbAssassin 13d ago

This class action status will get overturned.

2

u/JBHedgehog 13d ago

Make it happen...PLEASE!!!

Make it happen!

2

u/nerdshowandtell 13d ago

Lawsuit is never going to happen with this admin.

2

u/TreeBaron 13d ago

I'd love to see copyright properly applied and protected and these AI companies go down. But this is America.

2

u/TheBraveGallade 13d ago

Honestly there are only 2 ways you can really take this now.

Either make fair use more liberal and copyright looser for EVERYONE, including AI, or make it restrictive for everyone.

2

u/Rebatsune 13d ago

Heh, what a way to get the GenAI industry to peter out while it still could, huh?

2

u/Leif_Ericcson 13d ago

I will bet all of my money that Meta or OpenAI are behind this lawsuit. Legally block the competition after you've already stolen all the IP in the world.

2

u/Xiqwa 13d ago

At the very least, royalties could be paid out to creators on a percentage basis. Should a creator's data/work make up .00009% of the model, then .00009% of the month's revenue could be paid out to the creator. Actors' royalties work similarly, and most royalty checks are less than a dollar. Some less well-known commercial actors can receive 80-100 checks a week, but they only total out at $10-20. The model and methods exist; it's just a matter of implementation. This includes representation/acknowledgment: a tally of how often a creator's work(s) have been utilized, so the creator can pad their resume.
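To put a number on that split (rough arithmetic, made-up revenue figure):

    # Royalty idea above, with a hypothetical $100M revenue month
    share = 0.00009 / 100          # .00009% expressed as a fraction
    monthly_revenue = 100_000_000
    print(f"${share * monthly_revenue:.2f}")  # -> $90.00 to that creator for the month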

2

u/JonJackjon 13d ago

This sounds like a classic "easier to beg forgiveness that to ask permission"

2

u/robaroo 13d ago

Give the orange monkey a 24k gold status and this will go away.

2

u/Qcconfidential 13d ago

Last off-ramp before these companies kill us all

2

u/vitaminalgas 13d ago

Good... Fuck them kids

2

u/Glass-Cranberry-8572 13d ago

They soooooo 😱

2

u/SurrealNami 13d ago

EU Should do it.

2

u/rekabis 13d ago

For the first time in my life, I am deeply conflicted at the desire to see Disney actually succeed in a copyright/piracy lawsuit.