this post was submitted on 24 Feb 2025
79 points (100.0% liked)
PC Gaming
you are viewing a single comment's thread
Is it? Or is it so companies don't have to pay out salaries and can increase profits from AI-generated work, regardless of whether the AI is sentient or not?
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
Cells within cells.
Interlinked.
This post is unsettling. While LLMs definitely aren't reasoning entities, the point is absolutely bang on...
But at the same time, it feels like a comment from a bot.
Is this a bot?
Clearly. Sentience would imply some sense of internal thought or self-awareness, an ability to feel something... so LLMs are better since they're just machines. Though I'm sure they'd have no qualms with driving slaves.
I'm not talking about sentience per se, but about how any "AI" would think: lookups (LLMs) vs. synthesized on-the-fly thinking (mimicking the human brain's processing).
~This~ ~comment~ ~is~ ~licensed~ ~under~ ~CC~ ~BY-NC-SA~ ~4.0~
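To make the lookup-vs-synthesis contrast above concrete, here is a toy Python sketch. The names and example data are made up for illustration, and it only captures the analogy the comment draws; real LLMs don't literally work as either a memorized lookup table or a hand-written calculator.

```python
# Toy contrast between a "lookup" responder and one that computes answers
# on the fly. Illustrative only; not how actual LLMs are implemented.

# "Lookup" style: answers come from a fixed table of memorized pairs.
MEMORIZED = {
    "2 + 2": "4",
    "capital of France": "Paris",
}

def lookup_answer(prompt: str) -> str:
    # Can only return what was stored in advance; anything unseen fails.
    return MEMORIZED.get(prompt, "I don't know.")

# "On-the-fly" style: the answer is synthesized from the input at runtime.
def computed_answer(a: int, b: int) -> str:
    # No stored answer; the result is derived at the moment it is asked for.
    return str(a + b)

if __name__ == "__main__":
    print(lookup_answer("2 + 2"))   # "4" (memorized)
    print(lookup_answer("3 + 5"))   # "I don't know." (never stored)
    print(computed_answer(3, 5))    # "8" (synthesized on demand)
```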