this post was submitted on 23 Nov 2023
18 points (60.5% liked)


I know it's not even close there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?

You might think this is a stupid and irrational question. "There is no way AI will do psychology well, ever." But I think in today's day and age it's pretty fair to ask when you are deciding about your future.

top 50 comments
[–] MagneticFusion@lemm.ee 28 points 1 year ago (1 children)

A computer will never have emotions the same way a human has emotions. It is not a living creature. True and genuine human connection is something that will only become more valuable with the rise of AI.

[–] Havald@lemmy.world 22 points 1 year ago

I won't trust a tech company with my most intimate secrets. Human therapists won't get fully replaced by AI.

[–] TimewornTraveler@lemm.ee 22 points 1 year ago (5 children)

homie lemme let you in on a secret that shouldn't be secret

in therapy, 40% of positive client outcomes come from external factors changing

10% come from my efforts

10% come from their efforts

and the last 40% comes from the therapeutic alliance itself

people heal through the relationship they have with their counselor

not a fucking machine

this field ain't going anywhere, not any time soon. not until we have fully sentient general ai with human rights and shit

[–] cheese_greater@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

I don't think there's harm in allowing people who would never be able to afford life-saving medicine to have life-saving medicine cat-puzzle-feeder style

Edit: this was me and access hasn't changed the fact that I do not generally derive value from it.

[–] Macaroni_ninja@lemmy.world 19 points 1 year ago (3 children)

I don't think the AI everyone is so buzzed about today is really a true AI. As someone summed it up: it's more like a great autocomplete feature but it's not great at understanding things.

It will be great to replace Siri and the Google assistant but not at giving people professional advice by a long shot.
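To make the "great autocomplete" analogy concrete, here is a minimal toy sketch in Python (the corpus and the helper names `followers` and `complete` are made up for illustration): a bigram model that picks the next word purely from counted word pairs. Real LLMs are vastly larger and trained very differently, but the core idea of "predict a plausible next token" without any model of meaning is the point of the analogy.

```python
import random
from collections import defaultdict

# Toy "autocomplete": learn which word tends to follow which, nothing more.
# The corpus below is invented purely for illustration.
corpus = "the patient feels anxious . the patient feels better after talking . the therapist listens ."
words = corpus.split()

followers = defaultdict(list)
for prev, nxt in zip(words, words[1:]):
    followers[prev].append(nxt)

def complete(start: str, length: int = 6) -> str:
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))  # pick a previously seen continuation; no understanding involved
    return " ".join(out)

print(complete("the"))  # e.g. "the patient feels better after talking ."
```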

[–] Bonifratz@feddit.de 16 points 1 year ago

Even if AI did make psychology redundant in a couple of years (which I'd bet my favourite blanket it won't), what are the alternatives? If AI can take over a field that is focused more than most others on human interaction, personal privacy, thoughts, feelings, and individual perceptions, then it can take over almost any other field before that. So you might as well go for it while you can.

[–] magnetosphere@kbin.social 16 points 1 year ago (2 children)

You are putting WAY too much faith in the ability of programmers. Real AI that can do the job of a therapist is decades away, at least - and then there’s the approval process, which will take years all by itself. Don’t underestimate that. AI therapy is uncharted territory, and the approval process will be lengthy, detailed, and incredibly strict.

Lastly, there’s public acceptance. Even if AI turns out to have measurably better outcomes, if people aren’t comfortable with it, statistics won’t matter. People aren’t rational. I don’t care how “good” Alexa is, or how much evidence you show me - I will never accept that a piece of software can understand what it’s like to grow up as a person. I want to talk about my issues with a flawed, fallible human, not a box plugged into the wall.

You ask a valid question, just much earlier than necessary. I’d be surprised if AI was a viable alternative by the time you retire.

[–] intensely_human@lemm.ee 2 points 1 year ago

Dr Sbaitso was proven to be clinically effective in the 1980s.

[–] Encode1307@lemm.ee 1 points 1 year ago

There are already digital therapeutic platforms approved for mental health. Orexo deprexis is one such program. The fact is that the vast majority of people who need therapy aren't getting it now. These ai therapy models will provide services to those people. I'm willing to bet that in a decade, the majority of therapy will be done by AI, with human therapists focused on the most severe behavioral health conditions.

[–] nottheengineer@feddit.de 15 points 1 year ago (2 children)

It's just like with programming: The people who are scared of AI taking their jobs are usually bad at them.

AI is incredibly good at regurgitating information and at translation, but not at understanding. Programming can be viewed as translation, which is why it's good at that. LLMs on their own won't become much better in terms of understanding; we're at a point where they are already trained on all the good data from the internet. Now we're starting to let AIs collect data directly from the world (ChatGPT being public is just a play to collect more data), but that's much slower.

[–] Cossty@lemmy.world 3 points 1 year ago

I am not a psychologist yet. I only have a basic understanding of the job description but it is a field that I would like to get into.

I guess you are right. If you are good at your job, people will find you just like with most professions.

[–] Nibodhika@lemmy.world 2 points 1 year ago

I slightly disagree. In general I think you're on point, but artists especially are actually being fired and replaced by AI, and that trend will continue until there's a major lawsuit because someone used a trademarked thing from another company.

[–] snek@lemmy.world 13 points 1 year ago

No, it won't. I don't think I would have made it here today alive without my therapist. There may be companies that have AI agents doing therapy sessions but your qualifications will still be priceless and more effective in comparison.

[–] scorpionix@feddit.de 10 points 1 year ago (2 children)

Given how little we know about the inner workings of the brain (I'm a materialist, so to me the mind is the result of processes in the brain), I think there is still ample room for human intuition in therapy. Also, I believe there will always be people who prefer talking to a human over a machine.

Think about it this way: Yes, most of our furniture is mass-produced by IKEA and others like it, but there are still very successful carpenters out there making beautiful furniture for people.

[–] Cossty@lemmy.world 2 points 1 year ago

That's a fair point.

[–] halcyondays@midwest.social 8 points 1 year ago (1 children)

20 years ago the line was “there are no careers in psychology/philosophy”. So I got a comp sci degree, and I do well enough coding, but I could probably be happier with how I spend my days. I still read philosophy in my free time. Less tangible paths have always been demonized, largely because society needs a lot of laborers and engineers, and fewer thinkers and theorists. The potential of AI is just the latest buzzword applied to a century-old coercion tactic.

That said, if we entertain the possibility, I think you’re taking too narrow of a view of the possibilities. Who will advise the training of those therapy AI models? Doctorate psychologists.

I work for an education tech company, obviously our product is built by an engineering team of comp sci majors that know how to code - but we employ a large number of former teachers and folks with pedagogical degrees to guide how the product actually works in the real world.

The same will continue to be true for future products, a model to perform a task well doesn’t exist without those that deeply understand the task at hand.

Another example that comes to mind is data science - has any economist ever recommended a theoretical math degree as a career choice? And yet every company racing to implement the latest machine learning models now needs someone that understands Bayesian probability networks and Markov chains. Suddenly a “useless” degree is in high demand.

If that’s what you want to do, I think you’ll find your way. Minor in comp sci and think about how to implement your psychology learnings in code, if you want to have a contingency plan.

[–] Cossty@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

That's a great answer. Thank you.

[–] Nonameuser678@aussie.zone 8 points 1 year ago

Psychotherapy is about building a working relationship. Transference is a big part of this relationship. I don't feel like I'd be able to build the same kind of therapeutic relationship with an AI that I would with another human. That doesn't mean AI can't be a therapeutic tool. I can see how it could be beneficial with things like positive affirmations and disrupting negative thinking patterns. But this wouldn't be a substitute for psychotherapy, just a tool for enhancing it.

[–] 4am@lemm.ee 8 points 1 year ago (1 children)

AI cannot think, it does not logic or reason. It outputs a result from an input prompt. That will not solve psychological problems.

[–] baked_tea@lemmy.world 1 points 1 year ago

That's what AI does at the moment, which may not necessarily be true in a few years, and that's what OP is asking about.

[–] hugz@kbin.social 7 points 1 year ago

The caring professions are often considered to be among the safest professions. "Human touch" is very important in therapy

[–] NMS@startrek.website 6 points 1 year ago

Hey, maybe your background in psychology will help with unfucking an errant LLM or actual AI someday :P

[–] DABDA@lemmy.world 6 points 1 year ago (2 children)

All my points have already been (better) covered by others in the time it took me to type them, but instead of deleting will post anyway :)


If your concerns are about AI replacing therapists & psychologists why wouldn't that same worry apply to literally anything else you might want to pursue? Ostensibly anything physical can already be automated so that would remove "blue-collar" trades and now that there's significant progress into creative/"white-collar" sectors that would mean the end of everything else.

Why carve wood sculptures when a CNC machine can do it faster & better? Why learn to write poetry when there's LLMs?

Even if there was a perfect recreation of their appearance and mannerisms, voice, smell, and all the rest -- would a synthetic version of someone you love be equally as important to you? I suspect there will always be a place and need for authentic human experience/output even as technology constantly improves.

With therapy specifically there are probably going to be elements that an AI can [semi-]uniquely deal with just because a person might not feel comfortable being completely candid with another human; I believe that's what using puppets or animals or whatever to act as an intermediary is for. Supposedly even a really basic thing like ELIZA was able to convince some people it was intelligent and they opened up to it and possibly found some relief from it, and there's nothing in it close to what is currently possible with AI. I can envision a scenario in the future where a person just needs to vent and having a floating head just compassionately listen and offer suggestions will be enough; but I think most(?) people would prefer/need an actual human when the stakes are higher than that -- otherwise the suicide hotlines would already just be pre-recorded positive affirmation messages.

[–] Cossty@lemmy.world 3 points 1 year ago

You still had some good/new points in last paragraph. Thx

[–] user224@lemmy.sdf.org 1 points 1 year ago

By the way, if you want to try Eliza, you can telnet into telehack.com and run the command eliza to launch it.
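For a sense of how simple that era of chatbot really is, here is a minimal, hypothetical ELIZA-style sketch in Python. The rules and pronoun reflections below are made-up illustrations of the pattern-matching technique, not the original 1966 script or the telehack version: it matches a regex, swaps pronouns, and echoes the user's own words back.

```python
import random
import re

# Toy ELIZA-style responder: regex patterns mapped to canned replies.
# These rules are illustrative; the real ELIZA used a much larger hand-written script.
RULES = [
    (r"i need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"because (.*)", ["Is that the real reason?", "What else could explain it?"]),
    (r"(.*)", ["Please tell me more.", "How does that make you feel?"]),
]

# Swap first- and second-person words so "my future" echoes back as "your future".
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(user_input: str) -> str:
    for pattern, replies in RULES:
        match = re.match(pattern, user_input.lower())
        if match:
            reflected = [reflect(group) for group in match.groups()]
            return random.choice(replies).format(*reflected)
    return "Please go on."

if __name__ == "__main__":
    print(respond("I feel anxious about my future"))
    # e.g. "Why do you feel anxious about your future?"
```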

[–] cooopsspace@infosec.pub 5 points 1 year ago* (last edited 1 year ago) (1 children)

Given the vast array of existing pitfalls in AI, not to mention the outright biases and absence of facts - AI psychology would be deeply flawed and would be more likely to kill people than to help them.

Person: I'm having unaliving thoughts, I feel like it's the only thing I can do

AI: Ok do it then

That alone is why it'll never happen.

Also we need to sort out how to house, heal and feed our people before we start going and replacing masses of workforce.

[–] conciselyverbose@kbin.social 4 points 1 year ago

The level of liability you'd expose yourself to by actively advertising it as some sort of mental health product is insane.

I do believe someone will be dumb enough, but it's a truly terrible, insanely unsafe idea with anything resembling current tech in any way.

[–] Evilschnuff@feddit.de 5 points 1 year ago (4 children)

There is the theory that most therapy methods work by building a healthy relationship with the therapist and using that for growth since it’s more reliable than the ones that caused the issues in the first place. As others have said, I don’t believe that a machine has this capability simply by being too different. It’s an embodiment problem.

[–] livus@kbin.social 5 points 1 year ago (1 children)

If you have a talk with the AI called Pi, it talks like a therapist. It's impressive at first but you can't escape the knowledge that it dgaf about you.

And that's a trait people really don't want in a therapist.

[–] ThankYouVeryMuch@kbin.social 1 points 1 year ago (1 children)

Yeah for $100 an hour many people would give a fuck about you

[–] rynzcycle@kbin.social 4 points 1 year ago

You jest, but honestly this is what helped me. I felt very alone, deeply depressed, and held a long-rooted belief that I wasn't important enough to deserve better.

Knowing that this person was listening because they were being paid/it was their job helped me get past the guilt and open up. Likely saved my life. AI would not have given me that.

[–] Addition@sh.itjust.works 5 points 1 year ago

Here's a case study for you: An eating disorder hotline got rid of the humans in favor of an AI chatbot. Lasted less than a week before it was giving horrible advice.

https://www.theguardian.com/technology/2023/may/31/eating-disorder-hotline-union-ai-chatbot-harm

Psychology will be controlled by humans, probably forever.

I think it is one of those things that AI can never make redundant.

[–] Zeth0s@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

AI won't make psychology redundant. It might allow easier and broader access to low-level psychological first support.

What is more likely to make psychological consultants a risky investment is the economic crisis. People are already prioritizing food over psychological therapy. Psychological therapy unfortunately is nowadays a "luxury item".

[–] jabathekek@sopuli.xyz 4 points 1 year ago

I don't think many people would want to seek psychiatric care from what they might see as a computer. A large part of clinical psychology is creating and maintaining a relationship with patients, and I highly doubt language models will become sophisticated enough to achieve that in seven years, if at all. Remember, these aren't true AIs, they are language models. They have a long way to go before they can be seen as true intelligences.

[–] theherk@lemmy.world 3 points 1 year ago

Many valid points here, but here is a slightly different perspective. Let’s say for the sake of discussion AI is somehow disruptive here. So?

You cannot predict what will happen in this very fast space. You should not attempt to do so in a way that compromises your path toward your interests.

If you like accounting or art or anything else that AI may disrupt… so what? Do it because you are interested. It may be hyper important to have people that did so in any given field, no matter how unexpected. And most importantly, doing what interests you is always at least part of a good plan.

[–] realharo@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

It's definitely possible, but such an AI would probably be good enough to take over every other field too. So it's not like you can avoid it by choosing something else anyway.

And the disruption would be large enough that governments will have to react in some way.

[–] lvxferre@lemmy.ml 2 points 1 year ago (1 children)

If you're going to avoid psychology, do it because of the replication crisis. What is being called "AI" should play no role in that. Here's why.

Let us suppose for a moment that some AI 7y from now is able to accurately diagnose and treat psychological issues that someone might have. Even then the AI in question is not a moral agent that can be held responsible for its actions, and that is essential when you're dealing with human lives. In other words you'll still need psychologists picking the output of said AI and making informed decisions on what the patient should [not] do.

Furthermore, I do not think that those "AI systems" will be remotely as proficient at human tasks in, say, a decade, as some people are claiming that they will be. AI is a misnomer, those systems are not intelligent. Model-based text generators are a great example of that (and relevant in your case): play a bit with ChatGPT or Bard, and look at their output in a somewhat consistent way (without cherry picking the hits and ignoring the misses). Then you'll notice that they don't really understand anything - they're reproducing grammatical patterns regardless of their content. (Just like they were programmed to.)

[–] Cossty@lemmy.world 5 points 1 year ago* (last edited 1 year ago) (1 children)

I hadn't heard of the replication crisis. Thanks for pointing that out.

[–] lvxferre@lemmy.ml 2 points 1 year ago

It boils down to scientists not knowing if they're actually reaching some conclusion or just making shit up. It's actually a big concern across multiple sciences; it's just that psychology is being hit really hard, and for clinical psychologists this means they simply can't trust the theoretical frameworks guiding their decisions as much as they were supposed to.

[–] FaceDeer@kbin.social 2 points 1 year ago

Well, I won't say I think there's no risk at all. AI is advancing rapidly and in very surprising ways. But I expect that most of the jobs that AI is currently "replacing" will actually still survive in some related form. When sewing machines were invented it didn't poof tailors out of existence, they started doing other things. The invention allowed people to be able to own way more clothing than they did before, so fashion design became a bigger thing. Etc.

Even if AIs get really good at psychology there'll still be people who are best handled by a human. Heck, you might end up with an AI "boss" that decides which cases those would be and gives you suggestions on how to handle them, but your own training will likely still be useful.

If you want to be really future-proof then make sure to set aside some savings and think about alternate careers that you might enjoy keeping abreast of as hobbies just in case something truly drastic happens to your primary field.

[–] 0x4E4F@infosec.pub 2 points 1 year ago* (last edited 1 year ago)

I seriously think that a psychologist or a therapist would be one of the few jobs that will never get replaced by AI... or at least not in the near future (10 years or so).

Though the question is valid, I would agree.

[–] dumples@kbin.social 2 points 1 year ago

At the end of the day, AI (not just the LLMs we call AI now) is really good at doing boring machine work. These tasks are repetitive, simple and routine. This includes the LLMs, which can summarize boring text and generate more boring text. It can't generate anything new; it just outputs and rearranges.

What there will always be a need for is human work. This includes creativity, emotions and human interaction. A machine can't replace that at all. Psychology and therapy are all about emotions and human interaction, so they might be the safest career choice. The same goes for something like haircutting or other careers that involve human wisdom and personal skills.

Boring jobs like sending and receiving emails might be replaced. The reason businesses are so scared is that the majority of people in an office just do that.

[–] troed@fedia.io 1 points 1 year ago

Tomorrow's psychologists will be the ones to "program" AIs. It will be a very important profession.

[–] SHamblingSHapes@lemmy.one 1 points 1 year ago

They already do have AI therapy assistants. CBT type therapy is particularly easy to turn into an app. There are half a dozen in the Google Play store now. They're a nice reminder at times, but no substitute for human conversation.

Once we do have AIs capable of conversation indistinguishable from a real human's, therapy is not the only job that will be disrupted. Therapy will be no more or less safe a career path than so many other things.

Second, humans will still need to program, train, and monitor the therapy AIs. The obvious candidates to fill the role at first are experienced therapists with a bit of tech savvy. Until they optimize to the point where the job can be done by warm bodies paid minimum wage, probably "contractors" so liability can be compartmentalized. Then we're back to the point above where everyone in any career is fucked anyway, might as well do what you're good at and what you enjoy for a decade or two.

[–] Encode1307@lemm.ee 1 points 1 year ago

Most basic therapy dealing with relatively simple problems like mild to moderate depression and anxiety will likely be pretty responsive to AI based treatment, but people with serious and persistent mental illness will still need therapists.
