Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

[–] bouncing@partizle.com 1 points 1 year ago (1 children)

It wouldn't matter, because derivative works require permission. But I don't think anyone has really made a compelling case that OpenAI is actually producing directly derivative works.

The stronger argument is that LLMs are making transformative works, which is normally fair use, but which should still require some form of compensation given the scale involved.

[–] jecxjo@midwest.social 1 points 1 year ago (1 children)

But no one is complaining about publishing derivative works. The complaint is that "the robot brain has full copies of my text, so anything it creates 'cannot be transformative'." That doesn't make sense to me, because my brain made a copy of your book too; it's just a really lossy one.
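Purely as a toy illustration of that "lossy" point (not how any actual LLM is built, and the source_text here is made up): a tiny word-level Markov model "reads" a text, keeps only next-word statistics, and afterwards can only sample plausible-sounding output, not reproduce the original verbatim.

```python
# Minimal sketch: store next-word statistics instead of the text itself.
import random
from collections import defaultdict

source_text = (
    "the quick brown fox jumps over the lazy dog "
    "the quick red fox runs past the sleepy dog"
)

# Build the "memory": for each word, the words that were seen following it.
transitions = defaultdict(list)
words = source_text.split()
for current_word, next_word in zip(words, words[1:]):
    transitions[current_word].append(next_word)

# Generate from the statistics alone; the original ordering is not stored.
random.seed(0)
word = "the"
output = [word]
for _ in range(12):
    followers = transitions.get(word)
    if not followers:
        break
    word = random.choice(followers)
    output.append(word)

print("original :", source_text)
print("generated:", " ".join(output))
```

The generated line shares vocabulary and local patterns with the original but isn't a copy of it, which is roughly the distinction being argued about at a vastly larger scale.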

I think the definitions we have right now for these types of works only loosely fit human actions, mostly because we make poor assumptions about how the human brain works. We often use intent as a guide, and that doesn't always work in an AI scenario.

[–] bouncing@partizle.com 1 points 1 year ago (1 children)

Yeah, that's basically it.

But I think what's getting overlooked in this conversation is that it probably doesn't matter whether it's AI or not. Either new content is derivative or it isn't. That's true whether you wrote it or an AI wrote it.

[–] jecxjo@midwest.social 1 points 1 year ago

I agree with that, but do politicians and judges who know absolutely nothing about the subject?

I had a professor in college who taught cybersecurity. He was renowned in his field and was asked by the RIAA to testify in some cases related to file sharing. I lost respect for him when he intentionally refrained from stating that it wasn't possible for anyone outside the home network to know what, or who, was actually downloading anything. The technology was ignored and an invalid view was presented to a judge who couldn't ELI5 how the internet worked, let alone actual networking protocols.