So they're admitting that their entire business model requires them to break the law. Sounds like they shouldn't exist.
It likely doesn't break the law. You should check out this article by Kit Walsh, a senior staff attorney at the EFF, and this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.
Headlines like these let people assume that it's illegal, rather than educate people on their rights.
Reproduction of copyrighted material would be breaking the law. Studying it and using it as reference when creating original content is not.
Humans studying it is fair use.
So if a tool is involved, it's no longer ok? So, people with glasses cannot consume copyrighted material?
Copyright can only be granted to works created by a human, but I don’t know of any such restriction for fair use. Care to share a source explaining why you think only humans are able to use fair use as a defense for copyright infringement?
What's the difference? Humans are just the intent suppliers, the rest of the art is mostly made possible by software, whether photoshop or stable diffusion.
It doesn't break the law at all. The courts have already ruled that copyrighted material can be fed into AI/ML models for training:
This ruling only applies in the 2nd Circuit, and SCOTUS has yet to take up a case. As soon as there's a good fact pattern for the Supreme Court, or a circuit split, you'll get nationwide precedent. You'll also note that the decision is deliberately written to be extremely narrow, and is likely restricted to Google Books and near-identical sources of information.
Have there been any US rulings stating something along the lines of "The training of general-purpose LLMs and/or image generation AIs does not qualify as fair use," even in a lower court?
You might want to read this post from one of the EFF's senior lawyers on the topic who has previously litigated IP cases:
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
You know what? I like this argument. Software/streaming services are "too complex and costly to work in practice," therefore my viewership/participation "could not exist" if I were forced to pay for them.
Hey if they want to set that precedent, so be it.
Oh, no no no.
Rules for thee, not for me.
That's the other edge of this sword.
I do love how AI has gotten corporate giants to start attacking the copyright system they've used to beat down the little man for generations.
Maybe because it's not the same corporations? We might be seeing a giant power shift from IP hoarders to makers.
Makers use the copyright system to their advantage as well, though. If I write code and place it on GitHub, the only thing stopping a mega corp from stealing it is the copyright I hold.
Abolishing copyright is not a win.
Let's not kid ourselves that copyright is stopping mega corporations from stealing your GitHub code.
What's stopping them from hiring an engineer that basically rewrites your code? No one would ever know.
Copyleft enforcement is laughable at best, and that's with legitimate non-profits (like the FSF) working on it, and only for direct library use without modifications. There's basically no history of prosecution or penalties for partial code copying (nor should there be, imo), even when 1:1 code has been found!
I feel like copyright has been doing very little in the modern age, and I have yet to see any science that contradicts my opinion here. Most copyright holders (like high 90%) are mega corporations like Getty Images that hardly contribute back to society.
Weakening copyright is a win
Another reason why copyright should be shortened... Society has changed massively in the last 100 years, but every expression of our modern society is locked behind copyright.
I keep thinking how great it would be if the federal government made a central server system to access digital content for free via taxes.
All public domain and publicly funded research and content, all in one place. Could also host owned content for people/entities and pay out royalties automatically based on consumption.
There are ways to make this fairly affordable to everyone via taxes, but maybe the big opportunity is it could also allow companies to train AI on all the data for a fat, but fair subscription. The value of that could easily pay for enough to shrink any tax costs for the public.
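The "pay out royalties automatically based on consumption" idea above boils down to a pro-rata split of a revenue pool. A minimal sketch of that mechanism, with all names and numbers invented for illustration:

```python
# Hypothetical sketch: split a subscription/tax pool among rights holders
# pro rata by how often their content was consumed. Every owner name and
# figure here is made up.
def split_royalties(pool: float, consumption: dict[str, int]) -> dict[str, float]:
    """Return each owner's share of `pool`, proportional to their consumption count."""
    total = sum(consumption.values())
    return {owner: pool * count / total for owner, count in consumption.items()}

# Example: a $1000 pool, with consumption counts tracked by the platform.
payouts = split_royalties(1000.0, {"alice": 30, "bob": 10, "public_archive": 60})
```

A real system would need fraud-resistant consumption metering and rules for orphaned or public-domain works, but the accounting itself is this simple.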
In general, if the US government were smart (and not currently tearing itself apart) it would be creating a generative AI public service like the postal service, potentially even relying on public government documents and the library system for training.
Offer it at effectively cost for the public to use. Would drive innovation and development, nothing produced by it would be copyrightable, and it would put pressure on private options to compete against it.
We can still have the FedEx or DHL of gen AI out there, but they would need to contend with the public option being cheaper and more widely available for use.
Not that I'm a fan of the current implementation of copyright in the US, but if I were planning on building my business around something that couldn't exist without violating copyright, I would surely have thought of that fairly early on.
"My profits from fencing your wallet could not exist if stealing your wallet were punished."
"Ah, you're right, how silly of me, carry on."
Then they shouldn't exist.
Too late
Sounds like a win to me
I'd be fine with this argument if these generative tools were only being used by non-profits. But they aren't.
So I think there has to be some compromise here. Some type of licensing fee should be paid by these generative AI tools.
You're basically arguing for making any free use of them illegal, thereby giving a monopoly to the richest and most powerful capitalists.
Humans won't be able to compete, and you won't be able to use the means of generation either.
I'm arguing for free commercial use being illegal, absolutely.
And that fee should scale based on who is using it for commercial purposes. Microsoft and Google should be paying far, far out the ass for their data.
I’m just trying to think about how refined AI would be if it could only use public domain data.
ChatGPT channels Jane Austen and Shakespeare.
That's not really how it would work.
If you want that outcome, it's better to train on as massive a data set as possible initially (which does regress towards the mean but also manages to pick up remarkable capabilities and relationships around abstract concepts), and then use fine tuning to bias it back towards an exceptional result.
If you only trained it on those works, it would suck at pretty much everything except specifically completing those specific works with those specific characters. It wouldn't model what the concerns of a prince in general were, but instead model that a prince either wants to murder his mother (Hamlet) or fuck her (Oedipus).
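The pretrain-then-fine-tune idea above can be shown with a toy stand-in for a real model: a bigram counter "pretrained" on a broad corpus, then biased toward a target style by upweighting counts from a small stylistic corpus. All corpora and the weight here are invented examples, not a real training recipe:

```python
from collections import Counter, defaultdict

# Toy illustration (not a real LLM): count word bigrams as a crude language model.
def count_bigrams(text: str) -> dict[str, Counter]:
    words = text.split()
    counts: dict[str, Counter] = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

# Invented corpora for the sketch.
broad_corpus = "the prince walked to the market and the prince spoke to the crowd"
style_corpus = "the prince pondered the prince pondered"

pretrained = count_bigrams(broad_corpus)

# "Fine-tuning": blend in the small stylistic corpus with extra weight,
# biasing the model toward the target style without discarding broad coverage.
FINETUNE_WEIGHT = 5
finetuned = count_bigrams(broad_corpus)
for a, followers in count_bigrams(style_corpus).items():
    for b, n in followers.items():
        finetuned[a][b] += n * FINETUNE_WEIGHT

def most_likely_next(model: dict[str, Counter], word: str) -> str:
    """Greedy next-word prediction from the bigram counts."""
    return model[word].most_common(1)[0][0]
```

After fine-tuning, `most_likely_next(finetuned, "prince")` flips to the style corpus's "pondered", while the broad knowledge (all the other bigrams) is still there; that's the "train broad, then bias back" point, in miniature.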
That's how it should be, but public domain has been crippled by Disney and co.
So... this may be an unpopular question. Almost every time AI is discussed, a staggering number of posts support very right-wing positions, e.g. on topics like this one: unearned money for capital owners. It's all Ayn Rand and not Karl Marx. Posters seem to be unaware of that, though.
Is that the "neoliberal Zeitgeist" or what you may call it?
I'm worried about what this may mean for the future.
ETA: 7 downvotes after 1 hour with 0 explanation. About what I expected.
I think it's a conflation of the ideas of what copyright should be and actually is. I don't tend to see many people who believe copyright should be abolished in its entirety, and if people write a book or a song they should have some kind of control over that work. But there's a lot of contention over the fact that copyright as it exists now is a bit of a farce, constantly traded and sold and lasting an aeon after the person who created the original work dies.
It seems fairly morally consistent to think that something old and part of the zeitgeist should not be under copyright, but that the system needs an overhaul when companies are using your LiveJournal to make a robot call center.
I'd say the main reason is companies are profiting off the work of others. It's not some grand positive motive for society, but taking the work of others, from other companies, sure, but also from small time artists, writers, etc.
Then selling access to the information they took from others.
I wouldn't call it a right wing position.
It's interesting as it's many of the MPAA/RIAA attitudes towards Napster/BitTorrent but now towards gen AI.
I think it reflects the generational shift in who considers themselves content creators. Tech allowed for the long tail to become profitable content producers, so now there's a large public audience that sees this from what's historically been a corporate perspective.
Of course, they are making the same mistakes because they don't know their own history and thus are doomed to repeat it.
They are largely unaware that the MPAA/RIAA fighting against online sharing of media meant they ceded the inevitable tech to other companies like Apple and Netflix that developed platforms that navigated the legality alongside the tech.
So for example right now voice actors are largely opposing gen AI rather than realizing they should probably have their union develop or partner for their own owned offering which maximizes member revenues off of usage and can dictate fair terms.
In fact, the only way many of today's mass content creators have platforms to create content is because the corporate fights to hold onto IP status quo failed with platforms like YouTube, etc.
Gen AI should exist in a social construct such that it is limited in being able to produce copyrighted content. But policing training/education of anything (human or otherwise) doesn't serve us and will hold back developments that are going to have much more public good than most people seem to realize.
Also, it's unfortunate that we've effectively self-propagandized for nearly a century around 'AI' being the bad guy: at odds with humanity, misaligned with our interests, an existential threat, etc. There's such an incredible priming bias right now that it's effectively become the Boogeyman, rather than correctly being identified as a tool that, like every other tool in human history, is going to be used for good or bad depending on the wielder (though unlike past tools this one may actually have a slight inherent and unavoidable bias towards good, as Musk and Gab recently found out when their AI efforts denounced their own personally held beliefs on release).
Of course they will exist. China will own them all.
If they can't afford a thing they want, that's too bad.
If their dream AI 'can't exist' without stealing from everyone, there is only one message for the rest of us to bounce back:
'good'
Huh. You'd think in a situation where copyright is threatened by a lack of AI regulation, Disney would be all over this. Oh wait. They're trying to use generative AI to make movies cheaper. Nevermind.
Sounds good.
Stop threatening me with a good time!
We are about to witness an incredible power grab. They will be claiming practically the entirety of human intellectual works, including, for example, the contributions made by billions of users on social media. Basically, they will monopolize the entire power of GenAI for themselves.
This will practically make free use of that power illegal. Generative AI will eliminate more and more jobs in the coming decades, while we won't be allowed to use it at all.