this post was submitted on 12 Nov 2023
88 points (83.8% liked)
Asklemmy
It's a pretty good teacher because you can ask the same question over and over until you understand.
There are some limitations. I’ve asked some questions relating to my courses and it doesn’t always get it right.
The biggest issue, IMO, is that due to the way it works, it never just says "I don't know, I don't have information on that topic." Instead it just makes something up.
Agreed, it’s always confident, but not always correct.
GPT-4 or 3.5? What are you studying?
4; microecon. Often, when asked to solve cost-minimization problems with the supplied info, it states that the problem can't be solved and that there might be a problem with the question.
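For concreteness, a typical problem of this kind is choosing input quantities to minimize cost subject to an output constraint. This is a minimal sketch, not the commenter's actual coursework; the Cobb-Douglas production function and all parameter values are illustrative assumptions:

```python
def cobb_douglas_cost_min(w1, w2, a, b, q):
    """Solve min w1*x1 + w2*x2 subject to x1**a * x2**b = q.

    The first-order condition equates the input price ratio to the
    ratio of marginal products: w1/w2 = (a*x2)/(b*x1), which gives
    x2 = (b*w1)/(a*w2) * x1. Substituting that into the output
    constraint yields x1 in closed form.
    """
    k = (b * w1) / (a * w2)               # optimal ratio x2/x1
    x1 = (q / k**b) ** (1.0 / (a + b))    # from x1**(a+b) * k**b = q
    x2 = k * x1
    return x1, x2, w1 * x1 + w2 * x2

# Illustrative numbers: input prices w1=4, w2=1,
# output elasticities a=b=0.5, required output q=10.
x1, x2, c = cobb_douglas_cost_min(4, 1, 0.5, 0.5, 10)
# x1=5, x2=20, minimized cost c=40; check: 5**0.5 * 20**0.5 = 10.
```

A problem with a clean closed-form answer like this is easy to verify by hand, which is exactly why a model refusing to solve it (or claiming the question is flawed) stands out.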
Don't use GPT-3.5 for that. It'll hallucinate pretty quickly. GPT-4 is much better, but it's still wise to double-check what it says before you make life decisions.
Except it refuses to acknowledge wrong information or impossible methods, until you tell it so, at which point it will agree with you even if you are objectively wrong to tell it that.
I've coerced some systems into doing things they were absolutely not initially meant to do. ChatGPT would straight up tell me wrong and misleading ways to achieve them, and was too ignorant to realize that, with enough stubbornness, existing obscure info, and the right mindset, a correct way existed. And at no point, not even with prodding and hand-holding, was it able to have a useful conversation about the whys of the current state of the art.
It's not an expert system and it's stupid as fuck to treat it as one.
I'm not asking what the answer is. I'm asking why things are done, and it gives me a few different reasons why. It's very helpful. The answers all seem reasonable, and I've not experienced any dumb answers so far.
So long as you’re happy potentially getting the wrong answer over and over again.