this post was submitted on 03 Jan 2024
607 points (93.5% liked)

Technology

Found it first here - https://mastodon.social/@BonehouseWasps/111692479718694120

Not sure if this is the right community on Lemmy to discuss this?

[–] cm0002@lemmy.world 1 points 10 months ago (1 children)

I fail to see what he or your comment has to do with Generative AI models, which is what we are talking about.

I don't think you fully understand how generative AI models work. The training data is used to learn in a way that is loosely analogous to, though far more rudimentary than, how humans learn. The model itself contains no recognizable original data: just numbers, math, and weights that attempt to simulate the neurons and synaptic pathways our brains form when we learn things.
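To make the "just numbers and weights" point concrete, here is a toy, deliberately oversimplified sketch (nothing like a real generative model): a single "neuron" fit by gradient descent. After training, the whole model is two floating-point numbers; none of the training pairs are stored anywhere in it.

```python
# Toy illustration only: fit one weight and one bias to a few points.
# The trained "model" is just (w, b) -- the data itself is not retained.

def train(pairs, lr=0.01, steps=5000):
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in pairs:
            pred = w * x + b      # model's guess
            err = pred - y        # how wrong it was
            w -= lr * err * x     # nudge the weight toward less error
            b -= lr * err         # nudge the bias too
    return w, b

# Training data: points on the line y = 2x + 1
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(data)
print(round(w, 2), round(b, 2))  # learned parameters, approximately 2.0 and 1.0
```

The learned parameters reproduce the *pattern* in the data, not the data itself, which is the (much contested) analogy people draw to human learning.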

Yes, a carefully crafted prompt can get it to spit out a near-identical copy of something it was trained on (assuming it was trained on enough of the target artist's work to begin with), but so can humans. Humans have gotten in trouble when attempting to profit off such copies, and in those cases justice should be served regardless of whether it was an AI or a human that reproduced the work.

But using something that was publicly available on the Internet as input is fair game, just as any human might study a sampling of images to nail down a certain style. Humans are simply far more efficient at it, needing far, far less data.

[–] LWD@lemm.ee -1 points 10 months ago (1 children)

So does something become less plagiaristic if the plagiarizer can't provide attribution?

And no, AI does not learn, so you cannot compare it to a human. Passing the Chinese room test does not mean you are a native Chinese speaker.

[–] cm0002@lemmy.world 2 points 10 months ago (1 children)

Not all AIs do; the more "traditional" ones you're probably thinking of don't. The ones generating text, images, and video, however, are based on Generative Adversarial Networks, a type of deep-learning neural network, and those do learn, albeit in a rudimentary fashion compared to humans, but learning nonetheless.
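The adversarial setup the comment refers to can be sketched in one dimension. This is a hypothetical toy, not a real GAN: the "generator" maps noise to a number, the "discriminator" is a logistic scorer, and the two are updated in alternation so the generator's outputs drift toward the real data.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy 1-D adversarial loop. Real data clusters around 4.0; the
# generator starts far away and learns to imitate it. Every detail
# here is a simplification for illustration.
g_w = 0.1            # generator parameter: output = g_w * z
d_w, d_b = 0.0, 0.0  # discriminator parameters: score = sigmoid(d_w*x + d_b)

lr = 0.05
for step in range(2000):
    z = random.uniform(0.5, 1.5)   # noise input
    fake = g_w * z
    real = random.gauss(4.0, 0.1)

    # Discriminator step: push score(real) toward 1, score(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        grad = p - label           # cross-entropy gradient w.r.t. the logit
        d_w -= lr * grad * x
        d_b -= lr * grad

    # Generator step: push score(fake) toward 1 by adjusting g_w.
    p = sigmoid(d_w * fake + d_b)
    grad = (p - 1.0) * d_w * z     # chain rule through the discriminator
    g_w -= lr * grad

print(round(g_w, 1))  # generator now produces numbers near the real data
```

The point of the structure is that neither network ever stores the real samples; the generator only keeps the parameter it was nudged toward.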

[–] LWD@lemm.ee 1 points 10 months ago* (last edited 10 months ago)

Whether attempting to oversimplify or anthropomorphize, no, computers do not learn. We are not, in fact, on the way to creating creativity.