Humans can recognize and account for their own hallucinations. LLMs can't and never will.
It's pretty ironic that you say they "never will" in this context.
They can't... Most people strongly believe they know many things when in fact they have no idea what they're talking about. The best-known cases are flat-earthers, QAnon believers, and anti-vaxxers.
But all of us are absolutely convinced we know something until we find out we don't.
That's why double-blind tests exist, why eyewitness memories are not always trusted in trials, and why Twitter is such an awful place.