This post was submitted on 14 Jul 2024
352 points (94.4% liked)

No Stupid Questions

No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.

The rules for posting and commenting, in addition to those defined for lemmy.world, are as follows:

Rules


Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical, or professional help here.

Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting, sealioning, or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On Fridays, you are allowed to post meme and troll questions, on the condition that they are in text format only and conform to our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on Friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member of, a sympathiser with, or closely aligned with a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you have been provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- The majority of bots aren't allowed to participate here.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 1 year ago
Top 50 comments
[–] slazer2au@lemmy.world 269 points 3 months ago (8 children)

That seems like a Q3 2026 issue; let's put the conversation off till then.

/s

[–] jballs@sh.itjust.works 28 points 3 months ago

Q3 2026 will come around and the AI will report that revenues are down. The CEO will respond the only way they know how: by ordering that costs be cut by laying off employees. The AI will report that there is no one left to lay off but the CEO.

Fade to black and credits roll.

[–] TheFeatureCreature@lemmy.world 215 points 3 months ago (4 children)

Capitalism is all about short-term profit. These sorts of long-term questions and concerns are not things shareholders and investors think about or care about.

Further proof of this: Climate change.

[–] BlackLaZoR@kbin.run 22 points 3 months ago (9 children)

Funny thing is that capitalism accidentally solves global warming the same way it created it - it turns out renewables are cheaper than fossil fuels, and the greed machine ensures the transition to more cost-efficient energy sources.

[–] tate@lemmy.sdf.org 46 points 3 months ago (3 children)

It's a hopeful idea, but it may be too late.

[–] Bronzie@sh.itjust.works 23 points 3 months ago (1 children)

That shouldn't stop us from trying, though.

[–] Pelicanen@sopuli.xyz 27 points 3 months ago (4 children)

The problem is that the previous accumulation of capital has centralized a lot of power in actors who have a financial incentive to stop renewables. If we could hit a big reset on everything then yes, I think renewables would win, but we're dealing with a lot of very rich, very powerful people who really want us to keep being dependent on them.

[–] Damage@feddit.it 77 points 3 months ago (2 children)

Pathogens don't really think of what will happen after the body they're abusing dies

[–] s38b35M5@lemmy.world 58 points 3 months ago (1 children)

Back in the 1980s, they told me it'd trickle down.

...eventually.

[–] FlyingSquid@lemmy.world 21 points 3 months ago (4 children)

They were actually talking about how they were pissing on the living room floor while we were in the basement.

[–] morphballganon@lemmy.world 39 points 3 months ago

Don't think of people having money as an on-off switch. It's a gradual shift, and it had already started before AI was a thing. AI is just another tool to widen the wealth gap, like inflation, poor education, the erosion of human rights, etc.

[–] someguy3@lemmy.ca 31 points 3 months ago* (last edited 3 months ago)

Capitalism doesn't look that far ahead.

I agree it's going to be a problem. It already happened when we exported manufacturing jobs to China. Most of what was left was retail, which didn't pay as much, but we struggled along (in part because of cheap products from China). I think that's why trinkets are cheap but the core of living (housing and now food) is relatively more expensive. So older people see all the trinkets (things that used to be expensive but are now cheap) and don't understand how life is more expensive.

[–] Zahtu@feddit.org 29 points 3 months ago (6 children)

Ever heard of the everlasting sustainable war? https://ghostintheshell.fandom.com/wiki/Sustainable_War

If robots generate all productivity and human labor is no longer needed, the economy would not be able to sustain itself. Instead, to cope with the unneeded human labor and ensure continued productivity, the only remaining outlet would be war fought with human resources: destroying things so they can be rebuilt, generating a self-sustaining feedback loop. The rich will get richer and everyone else will only be employed as soldiers in a perpetual war economy.

Even though this is a sci-fi concept, I believe it's not a stretch to say we are headed in this direction.

[–] Danquebec@sh.itjust.works 28 points 3 months ago* (last edited 3 months ago) (1 children)

This is a common question in economics.

It's called technological unemployment, and it's a type of structural unemployment.

Economists generally believe that this is temporary. Workers will take new jobs that are now available or learn new skills to do so.

An example is how most of the population were farmers before the agricultural and industrial revolutions. Efficiency improvements to agriculture happened, and now only about 1% of the population works in agriculture. Yet most people are not unemployed.

There was also a time in England when a large part of the population were coal miners. Same story.

Each economic and technological improvement expands the economy, which creates new jobs.

There's been an argument by some, Ray Kurzweil if I remember correctly, but others as well, that we will eventually reach a point where humans are obsolete. There was a time when we used horses as the main mode of land transportation. Now that role is very marginal; we still use horses for a few other things, but there's not nearly as much use for them as before. The same might happen to humans. Machines might become better than humans at everything.

Another problem is that the rate of technological change might be too fast for society to adapt to, leaving us with ever-larger structural unemployment.

One of the solutions that has been suggested is providing a basic income to everyone, so that losing your job isn't as big a problem and you have time to find another job or learn a new skill.

[–] barsquid@lemmy.world 17 points 3 months ago (1 children)

A major problem is that all the money from these increases in efficiency goes to a handful of people, who then hoard it. A market economy cannot work with hoarding; the money needs to circulate.

[–] maynarkh@feddit.nl 26 points 3 months ago (1 children)

The rich. Companies will exclude wider and wider swathes of people from their target markets, just like nobody caters to the homeless now.

[–] SkyNTP@lemmy.ml 14 points 3 months ago (1 children)

This doesn't sound sustainable at all. A billionaire only needs so much gasoline, food, medicine, TVs...

Collapse of entire industries will happen way before we even get a chance to see industries reinvent themselves to cater to billionaires. Don't believe me? Just look at what happened to the economy during the pandemic.

[–] maynarkh@feddit.nl 10 points 3 months ago

Yeah of course industries will collapse. 100 car factories will close, 5 superyacht factories will open, tying up the same amount of productivity. Owned by the same guy.

There will be tons of spacecraft launchpads, private jet hangars, etc.

And wars of course.

[–] ZILtoid1991@lemmy.world 23 points 3 months ago (6 children)

In theory, UBI.

In practice, it will likely lead to periodic job-market crashes as too many people apply for the remaining jobs, and possibly even revolts.

That's assuming AI is really as good as its evangelists claim and the technology ceiling rises far enough. IMHO, even LLM technologies are getting exhausted, and it's not just a training-data problem (one those same AI evangelists have made worse by littering the internet with AI output), so they will have a very hard time going forward.

[–] KingThrillgore@lemmy.ml 22 points 3 months ago* (last edited 3 months ago) (1 children)

They won't; they'll simply die, and the market will slowly adjust to those with capital.

This is all happening because we shot a gorilla in 2016 btw.

[–] Duamerthrax@lemmy.world 21 points 3 months ago

Corporations, especially publicly traded ones, can't think past their quarterly reports. The private ones are competing with the public ones and think that following the trends set by companies that are "too big to fail" will work out for them.

[–] EABOD25@lemm.ee 20 points 3 months ago (21 children)

I'm an optimist, so I believe one day we'll have a utopian society like in Star Trek. I politely ask that you don't criticize me too harshly.

[–] zephr_c@lemm.ee 13 points 3 months ago (1 children)

Hey, that's a reasonable thing to hope. The flip side, of course, is that I'm hoping I don't have to live through Star Trek's idea of how the 21st century goes. They definitely got all of the details wrong, but I'm afraid the vibes are matching a little too well.

[–] daniskarma@lemmy.dbzer0.com 19 points 3 months ago (8 children)

In a better world, machines would do the work and humans would just share the wealth and live in peace.

[–] marcos@lemmy.world 17 points 3 months ago

AI owners will.

And if you then go around wondering "oh, but not every AI builds something those few people want", "that's way too few people to fill a market", or "and what about all the rest?"... maybe you should read Keynes, because this would not be the first time this kind of buying-power shift has happened, and yes, it always sucks a lot for everybody (even for the rich people).

[–] sundray@lemmus.org 16 points 3 months ago* (last edited 3 months ago) (2 children)

As stated, the companies that push AI aren't concerned with the long-term consequences. But if you want to know how the individuals who run those companies personally feel, do a search for billionaire doomsday preppers.

TL;DR: They've got a vision for the future. We're not in it.

[–] soratoyuki@lemmy.world 16 points 3 months ago (7 children)

The vanishingly small amount of people that will be unfathomably rich in a privatized post-scarcity economy will give us just enough in UBI to make sure we can buy our Mountain Dew verification cans. And without the ability to withhold our labor as a class, we'll have no peaceful avenue to improve our conditions.

[–] howrar@lemmy.ca 15 points 3 months ago (6 children)

Why would we need anyone to buy things? Remember that money is an abstraction for resources. If you can do everything with AI, then you already have all the resources you need. Whether or not someone else needs what you produce is irrelevant when you already have access to everything you could want.

[–] Buddahriffic@lemmy.world 15 points 3 months ago (3 children)

I see three possibilities if AI is able to eliminate a significant portion of jobs:

  1. Universal basic income that pays out based on how productive the provider side was, per person. Some portion of wealth is continually transferred to the owners.
  2. Neofeudalism, where the owners at the time of transition end up owning everything and allow people to live, or not, on their land at their whim. They can then use those people for labour where needed, or for entertainment otherwise. Some benevolent feudal lords might generally let people live how they want, though there will always be a fear of revolution, so other, more authoritarian lords might sabotage them or wage war on them.
  3. Large portions of the population are left SOL to die or do whatever while the economy ignores them. It would probably get pretty violent, since people don't generally just go off to die of starvation quietly. The main question for me is whether the violence starts when the starving masses have had enough, or earlier, from those who see it coming.

I'm guessing reality will have some combination of each of those.

[–] DragonTypeWyvern@midwest.social 10 points 3 months ago (2 children)

If ONLY some smart fella had pushed a theory about collective ownership of the means of production or something

[–] NeptuneOrbit@lemmy.world 14 points 3 months ago (4 children)

That's the neat part. No one.

If the rich can hire a handful of the middle class to build and maintain their robots, then they can cut the poor and working poor out of the economy entirely, and those people will be willing to accept any conditions for food and shelter.

We can arrange the economy any way we choose. Taking all of the decision-making for themselves is part of the plan.

[–] Arn_Thor@feddit.uk 14 points 3 months ago

That, my friend, is a problem for whichever schmuck is in charge after me, a C-suite executive. By then I will be long gone on my private island, having pulled the rip cord on my golden parachute.

[–] Dagwood222@lemm.ee 14 points 3 months ago (1 children)

Look at empires of the past.

Things were so bad in Dickens' London that living in the sewers and surviving off whatever scraps you could find was an actual occupation.

Wealth creates its own reality.

[–] markr@lemmy.world 14 points 3 months ago (3 children)

Everyone will be working multiple shitty service jobs that aren't cost-effective to automate with robots. Our miserable wages will be just enough to keep the wheels from falling off the cart.

[–] Sgt_choke_n_stroke@lemmy.world 12 points 3 months ago

You're describing the end goal of monopoly

[–] Wooki@lemmy.world 11 points 3 months ago* (last edited 3 months ago) (2 children)

You’re implying AI has the intelligence to remotely achieve this. It doesn’t. It is all venture-capitalist porn for over-glorified keyword copy-paste. That’s it.

[–] deadlyduplicate@lemmy.world 11 points 3 months ago (2 children)

Look up crisis theory: the rate of profit tends to fall in capitalist systems. Because each company is driven by competitive self-interest, it is incapable of acting for the good of the whole. You simply cannot devote resources to anything but trying to out-compete your rivals, and in doing so everyone's profit trends lower and lower until you have a crisis.

[–] Drewelite@lemmynsfw.com 11 points 3 months ago

That's the cool part: you won't. If everything crucial is automated, people can drive things forward out of passion rather than for money. Of course, this would effectively collapse capitalism, which won't happen painlessly.
