this post was submitted on 01 Aug 2023
613 points (91.6% liked)
Technology
you are viewing a single comment's thread
Immorally then.
Illegally, maybe. Immorally, probably not. It’s fine for a human to read something and learn from it, so why not an algorithm? All of the original content is diluted into statistics so thoroughly that the source material doesn’t exist in the model. They didn’t hack any databases; they merely used information that’s already available for anyone to read on the internet.
Honestly, the real problem is not that OpenAI learned from publicly available material, but that something trained on public material is privately owned.
Is that really a problem? If I create something new based on public knowledge, should I not be able to profit from it?
If I learn to paint from YouTube, should I paint for free now?
I'll admit that the scope of ChatGPT is MUCH bigger than one person painting.
I mean, that’s what I meant when I said it was a more controversial opinion. From a purist perspective, I tend to believe that intellectual property in general is unethical and stifles innovation.