this post was submitted on 18 Aug 2023
-6 points (41.2% liked)
Technology
It actually is biased, though. UpperEchelon did a video exposing this. I lean left myself, but I would prefer if the LLMs were objective.
Political objectivity is impossible.
I would argue that asking a machine to list known information is not impossible.
Here's a very clear example where ChatGPT refused to answer a question regarding Biden but happily answered the exact same question about Trump.
https://youtu.be/_Klkr6PtYzI?t=520
And before anyone starts, NO! I'm not a supporter of the oompaloompa king.
Mhm, but with the way LLMs work, it's not possible to actually remove bias, since it's baked into the training data. Any adjustment toward "neutral" would itself be biased by what the adjuster considers neutral.
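The point about bias being baked into the training data can be made concrete with a toy sketch. Assume an entirely made-up labeled corpus (the names and labels below are hypothetical, not real data): a model that simply learns label frequencies from its data will reproduce whatever skew the data carries, regardless of how the subjects "really" are.

```python
from collections import Counter

# Hypothetical labeled headlines. The label skew IS the "bias" --
# it comes from the data, not from the subjects themselves.
corpus = [
    ("candidate_a economy speech", "negative"),
    ("candidate_a rally crowd", "negative"),
    ("candidate_a policy plan", "positive"),
    ("candidate_b economy speech", "positive"),
    ("candidate_b rally crowd", "positive"),
    ("candidate_b policy plan", "negative"),
]

def label_odds(token):
    """Estimate P(label | token) by counting -- a crude stand-in for
    the associations a language model absorbs from its training data."""
    counts = Counter(label for text, label in corpus if token in text.split())
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

print(label_odds("candidate_a"))  # skews negative, mirroring the corpus
print(label_odds("candidate_b"))  # skews positive, mirroring the corpus
```

Any attempt to "debias" this amounts to choosing a target distribution (50/50? something else?), and that choice is itself a judgment call by whoever does the adjusting.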
Only if emotions are involved. Of course it's not possible as long as we train our AI on flawed, human-generated data, though.