this post was submitted on 21 Oct 2024
527 points (98.0% liked)

Facepalm

[–] Muffi@programming.dev 51 points 3 weeks ago (3 children)

I was having lunch at a restaurant a couple of months back, and overheard two women (~55 y/o) sitting behind me. One of them talked about how she used ChatGPT to decide if her partner was being unreasonable. I think this is only gonna get more normal.

[–] Wolf314159@startrek.website 43 points 3 weeks ago (1 children)

A decade ago she would have been seeking that validation from her friends. ChatGPT is just a validation machine, like an emotional vibrator.

[–] Trainguyrom@reddthat.com 14 points 3 weeks ago

The difference between asking a trusted friend for advice vs asking ChatGPT or even just Reddit is that a trusted friend will have more historical context. They've probably met or at least interacted with the person in question, and they can bring in the context of how this person previously made you feel. They can help you figure out if you're just at a low point or if it's truly a bad situation to get out of.

Asking ChatGPT or Reddit is really like asking a Magic 8 Ball. Framing the question and simply asking it helps you interrogate your feelings and form new opinions about the situation, but the answers are pretty useless since there's no historical context to base them on, and they're only as good as the question asked.

[–] GreenKnight23@lemmy.world 6 points 3 weeks ago

I would rather get it from an LLM than from some dumb shit magazine quiz, and I fucking hate LLMs.

[–] orcrist@lemm.ee 3 points 3 weeks ago

I don't think people who think it through would bother to ask ChatGPT, unless they didn't have any friends, because it's quite obvious that relationship advice is delicate and you certainly want the advice-giver to know something about your situation. You know, like your friends do and computers don't.

We don't even need to judge the quality of the advice, because there's no way it could be informed advice in the first place.