this post was submitted on 11 Oct 2023
477 points (92.5% liked)

Technology
[–] Smacks@lemmy.world 37 points 1 year ago (1 children)

AI is a tool to assist creators, not a full-on replacement. It won't be long until they start shoving ads into Bard and ChatGPT.

[–] BeautifulMind@lemmy.world 5 points 1 year ago (1 children)

AI is a tool to ~~assist~~ plagiarize the work of creators

Fixed it

LOL OK, it's a super-powerful technology that may one day generate tons of labor very quickly, but none of that changes the fact that in order to train it to do that, you have to feed it the work of actual creators, and for any of that to be cost-feasible, the creators can't be paid for their inputs.

The whole thing is predicated on unpaid labor, stolen property.

[–] 2ncs@lemmy.world 1 points 1 year ago (1 children)

At what point does it become stolen property? There are plenty of tools artists use today that incorporate AI, and those tools were more than likely trained on someone's creations without payment. Apparently that data isn't deemed important enough for it to be an issue. Google has likely scraped billions of images from the Internet to train Google Lens, and there wasn't nearly as much of an uproar.

Honestly, I'm just curious if there is an ethical line and where people think it should be.