this post was submitted on 03 Jan 2024
Technology
I fail to see what he or your comment has to do with Generative AI models, which is what we are talking about.
I don't think you fully understand how generative AI models work. The input data is used to learn in a way that is similar to, but far more rudimentary than, how humans learn. The model itself contains no recognizable original data: just numbers, math, and weights that attempt to simulate the neurons and synaptic pathways our brains form when we learn things.
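To make the "just numbers, math and weights" point concrete, here is a minimal sketch of a single artificial neuron. The weight values are made up for illustration; a real model has billions of such numbers, but the principle is the same: nothing in the trained model stores a training example, only weights remain.

```python
import math

# Hypothetical weights, standing in for what training would produce.
weights = [0.8, -0.3, 0.5]
bias = 0.1

def neuron(inputs):
    """One artificial neuron: a weighted sum passed through a sigmoid.
    No training data lives here -- only the numbers above."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))  # squash into (0, 1)

print(neuron([1.0, 0.0, 1.0]))
```

A full network is just many of these stacked together, which is why inspecting the file on disk shows only arrays of numbers, not images or text.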
Yes, a carefully crafted prompt can get it to spit out a near-identical copy of something it was trained on (assuming it was trained on enough of the target artist's work to begin with), but so can humans. Humans have gotten in trouble when attempting to profit off such copies, and in those cases justice must be served regardless of whether it was an AI or a human that reproduced the work.
But using something that was publicly available on the Internet as input is fair game, just as any human might study a sampling of images to nail down a certain style. Humans are simply far more efficient at it, needing far less data.
So does something become less plagiaristic if the plagiarizer can't provide attribution?
And no, AI does not learn, so you cannot compare it to a human. Passing the Chinese Room test does not make you a native Chinese speaker.
Not all AIs do; the more "traditional" ones you're probably thinking of don't. The ones generating text, images, and video, however, are deep neural networks (transformers for text, and typically diffusion models for images and video, with Generative Adversarial Networks as an earlier approach), and those do learn, albeit in a rudimentary fashion compared to humans, but learning nonetheless.
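What "learning" means here, stripped to its core, is gradient descent: nudge the weights to reduce prediction error. This toy example (all values invented, one weight, one training example) shows that mechanism in miniature; real networks do the same thing across billions of weights.

```python
# "Learning" as weight updates: one weight nudged to reduce error.
w = 0.0                 # start knowing nothing
x, target = 2.0, 1.0    # a single made-up training example
lr = 0.1                # learning rate

for _ in range(50):
    prediction = w * x
    error = prediction - target
    w -= lr * error * x  # gradient descent on squared error

print(round(w * x, 3))   # prediction has converged toward the target
```

Whether that counts as learning in the human sense is exactly what the two commenters disagree about, but this is the mechanism being debated.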
Whether attempting to oversimplify or anthropomorphize, no, computers do not learn. We are not, in fact, on the way to creating creativity.