this post was submitted on 05 Nov 2023
5 points (56.1% liked)
Technology
Lol wtf. AI, if owned publicly, would lead us to post-scarcity within as little as a few decades. Right now, the trend does seem to lean toward FOSS machine learning models. Look at Stable Diffusion, RedPajama, etc.
AI is a revolutionary means of production. It just needs to be owned publicly. If that happens, then we would all be sitting in gardens playing cellos.
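To make the FOSS point concrete, here is a minimal sketch of how little it takes to run one of those openly released models locally, assuming the Hugging Face `diffusers` library and the publicly downloadable Stable Diffusion 1.5 weights. The model ID and prompt are illustrative placeholders, not anything from this thread.

```python
# Minimal sketch: generating an image locally with an openly licensed model.
# Assumes `pip install diffusers transformers torch` and a GPU with enough VRAM;
# the model ID and prompt below are illustrative placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # openly downloadable weights
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

image = pipe("a garden at golden hour, someone playing a cello").images[0]
image.save("garden_cello.png")
```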
I've heard many absurdly over-optimistic predictions of AI's potential, but I have to admit that "ends world hunger and solves resource depletion" is a new one. Seriously, do you even know what "post-scarcity" means?
When did I say that it would be a silver bullet? LLMs today are already relatively capable of doing stuff like acting as mental health therapists. Sure, they may not be as good as human therapists. But something is definitely better than nothing, no? I, for instance, use LLMs quite a lot as an education aid. I would've had to shell out thousands of dollars to get the same amount of help that I'm getting from the LLM of my choice.
Generative AI is still in its infancy. It will be capable of doing MANY MANY more things in the future. Extremely cheap healthcare, education, better automation, etc. Remember: today's LLMs still aren't capable of self-improvement. They will achieve this quite soon (at least this decade). The moment they start generating training data that improves their quality is the moment they take off like crazy.
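To make the "generating training data that improves their quality" idea concrete, the loop people usually have in mind looks roughly like the sketch below. Every function in it is a hypothetical stand-in (a toy number in place of a real model, a toy score in place of a real reward model or human feedback), so treat it as a diagram, not an implementation.

```python
# Schematic self-improvement loop: the model produces candidate outputs,
# a filter keeps only the good ones, and the survivors become new training data.
# All three helpers below are hypothetical stand-ins, not a real implementation.
import random

def generate_candidates(model_quality: float, n: int = 100) -> list[float]:
    # Stand-in for sampling outputs from the current model; better models
    # produce better candidates on average.
    return [random.gauss(model_quality, 1.0) for _ in range(n)]

def quality_score(candidate: float) -> float:
    # Stand-in for a reward model, unit tests, human feedback, etc.
    return candidate

def fine_tune(model_quality: float, kept: list[float]) -> float:
    # Stand-in for a training step: the model drifts toward the quality
    # of the data it was trained on.
    if not kept:
        return model_quality
    return 0.9 * model_quality + 0.1 * (sum(kept) / len(kept))

model_quality = 0.0
for round_ in range(10):
    candidates = generate_candidates(model_quality)
    kept = [c for c in candidates if quality_score(c) > model_quality]  # keep only above-average outputs
    model_quality = fine_tune(model_quality, kept)
    print(f"round {round_}: quality ~ {model_quality:.2f}")
```

The loop only ratchets upward if the filtering step can reliably tell better outputs from worse ones; that's the part doing all the work in this sketch.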
They could end up replacing EVERY SINGLE job that requires humans. Governments would be forced to implement measures like UBI. They literally would have no choice: to prevent a massive recession, you need people to be able to buy stuff. To buy stuff, you need money. Even from a capitalistic standpoint, you would still require UBI, as entire corporations would collapse due to such high unemployment rates.
I'm not going to disagree with anything here but
"Sure, they may not be as good as human therapists. But something is definitely better than nothing, no?"
Please do not use an LLM as a therapist; something can definitely be worse than nothing. I use GitHub Copilot every day for work. It helps me do what I want to do, but I have to understand what it's doing and when it's wrong, which it often is. The point of a therapist is to help you through things you don't understand. One day it might work, but not today.
What if I'm suicidal (I'm not, don't worry)? When I don't have anyone to talk to, why is talking to an LLM bad? Mental health therapists are fkin expensive. I did use an LLM when I was feeling down. It was absolutely wonderful! Worked for me perfectly!
Now, imagine if we fine-tune this for this specific purpose. You've got a very effective system (at least for those without access to shrinks). Consider people from developing countries. Isn't it a good thing if LLMs can be there for people and make them feel just a little better?
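For what it's worth, a rough sketch of what "fine-tune this for this specific purpose" could look like is below, using a small open model and the Hugging Face transformers Trainer. The model name and the two example dialogues are invented placeholders; a real system would need clinically vetted data, evaluation, and safety guardrails far more than it needs this training loop.

```python
# Rough sketch: fine-tuning a small open language model on supportive dialogues.
# Assumes `pip install transformers datasets torch`; the model name and the
# example conversations are placeholders, not a vetted therapy dataset.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import Dataset

model_name = "distilgpt2"  # small open model, just for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2-style models have no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder conversations; real training data would need expert curation.
dialogues = [
    "User: I've been feeling really low lately.\nAssistant: I'm sorry you're "
    "going through that. Do you want to talk about what's been weighing on you?",
    "User: I can't afford a therapist.\nAssistant: That's a hard spot to be in. "
    "I can listen, and I can also point you to free and low-cost support options.",
]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

dataset = Dataset.from_dict({"text": dialogues}).map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="supportive-lm", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
```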