This post was submitted on 20 Feb 2024
210 points (94.9% liked)

News


One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
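A rough back-of-envelope sketch of what those figures imply; only the 4-5x multiplier and the 33,000-home claim come from the excerpt, while the per-search and per-home numbers are assumptions added for illustration:

```python
# Back-of-envelope arithmetic for the figures in the excerpt.
# Only the 4-5x multiplier and the 33,000-home figure come from the article;
# the per-search and per-home numbers below are illustrative assumptions.

CONVENTIONAL_SEARCH_WH = 0.3          # assumed energy per conventional web search (Wh)
GENAI_MULTIPLIER_LOW, GENAI_MULTIPLIER_HIGH = 4, 5   # "four to five times", per the article

low = CONVENTIONAL_SEARCH_WH * GENAI_MULTIPLIER_LOW
high = CONVENTIONAL_SEARCH_WH * GENAI_MULTIPLIER_HIGH
print(f"Generative-AI search: ~{low:.1f}-{high:.1f} Wh vs {CONVENTIONAL_SEARCH_WH} Wh per query")

HOME_KWH_PER_YEAR = 10_800            # assumed average annual household consumption (kWh)
chatgpt_gwh_per_year = 33_000 * HOME_KWH_PER_YEAR / 1e6
print(f"33,000 homes -> roughly {chatgpt_gwh_per_year:.0f} GWh per year")
```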

top 50 comments
[–] FlyingSquid@lemmy.world 65 points 6 months ago (22 children)

And nobody seems to give a shit. Even people who would normally give a shit about this sort of thing. Even people who do things like denounce Bitcoin mining's waste of energy (and I agree) are not talking about the energy and water waste from AI systems.

That article says that OpenAI uses 6% of Des Moines' water.

Meanwhile-

According to Colorado State University research, nearly half of the 204 freshwater basins they studied in the United States may not be able to meet the monthly water demand by 2071.

https://abcnews.go.com/US/parts-america-water-crisis/story?id=98484121

And nobody seems to give a shit.

[–] Nudding@lemmy.world 26 points 6 months ago (41 children)

Lots of people give a shit, they're just not in any sort of position to do anything about it.

We won't treat climate change seriously until we get a significant climate-related mass casualty event in North America.

[–] NarrativeBear@lemmy.world 13 points 6 months ago (4 children)

As soon as the gulf streams collapse I think a few more of us may start giving a shit.

[–] FlyingSquid@lemmy.world 16 points 6 months ago (2 children)

Giving a shit about the horse barn after someone's already let out all the horses doesn't really make a difference.

[–] agent_flounder@lemmy.world 9 points 6 months ago (1 children)
[–] FlyingSquid@lemmy.world 6 points 6 months ago

Yep. We suck. I'm reminded of it every day.

[–] magnetosphere@kbin.social 3 points 6 months ago

I think that’s their point

[–] tal@lemmy.today 9 points 6 months ago (1 children)

gulf streams collapse

Nah, there were some people worried about it, but it won't happen.

https://en.wikipedia.org/wiki/Gulf_Stream

The possibility of a Gulf Stream collapse has been covered by some news publications.[vague] The IPCC Sixth Assessment Report addressed this issue specifically, and found that based on model projections and theoretical understanding, the Gulf Stream will not shut down in a warming climate. While the Gulf Stream is expected to slow down as the Atlantic Meridional Overturning Circulation (AMOC) weakens, it will not collapse, even if the AMOC were to collapse. Nevertheless, this slowing down will have significant effects, including a rise in sea level along the North American coast, reduced precipitation in the midlatitudes, changing patterns of strong precipitation around Europe and the tropics, and stronger storms in the North Atlantic.

[–] NarrativeBear@lemmy.world 4 points 6 months ago

So what you're saying is that it's fine?


[–] Theprogressivist@lemmy.world 5 points 6 months ago (1 children)

At that point it's too late.

[–] dangblingus@lemmy.dbzer0.com 3 points 6 months ago

*The AMOC. The Gulf Stream can't really collapse.

[–] bleistift2@feddit.de 5 points 6 months ago (1 children)

I guess it depends on how you use chatbots. If you’re just too lazy to click on the first google result you get, it’s wasteful to bother ChatGPT with your question. On the other hand, for complex topics, a single answer may save you quite a lot of googling and following links.

[–] FlyingSquid@lemmy.world 8 points 6 months ago (2 children)

Oh, well as long as it saves you from Googling, it's okay that it's a massive ecological disaster. My mistake.

[–] FaceDeer@kbin.social 5 points 6 months ago (22 children)

That's the opposite of what he said. That sort of usage isn't what ChatGPT is good for; it's best to use it for other kinds of things.

[–] brbposting@sh.itjust.works 2 points 6 months ago

I mean an argument could be made here, right? Just thinking theoretically.

Maxim: we want to be as eco-friendly as possible.

For a given task, work out the least environmentally taxing way to accomplish the goal.

Task requires one, two, or three/four DuckDuckGo searches? DDG away.

Task requires five DDG searches, OR one LLM query? Language model it is.

(LLM may well rarely be the answer there, of course, just laying out the theory!)
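A minimal sketch of that decision rule; both per-query energy figures are made-up placeholders (the thread gives no numbers), chosen only to show the comparison:

```python
# Hypothetical per-task energy comparison: N conventional searches vs one LLM query.
# Both energy figures are assumptions, only here to illustrate the decision rule.

SEARCH_WH = 0.3   # assumed energy per DDG-style web search (Wh)
LLM_WH = 1.5      # assumed energy per LLM query (roughly 4-5x a search)

def greener_option(searches_needed: int) -> str:
    """Pick whichever option uses less energy for the task."""
    searches_total = searches_needed * SEARCH_WH
    if LLM_WH < searches_total:
        return f"one LLM query ({LLM_WH} Wh < {searches_total:.1f} Wh)"
    return f"{searches_needed} searches ({searches_total:.1f} Wh <= {LLM_WH} Wh)"

for n in (1, 3, 4, 8):
    print(f"task needing {n} searches -> {greener_option(n)}")
```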

[–] cm0002@lemmy.world 4 points 6 months ago* (last edited 6 months ago) (3 children)

Bitcoin was wasteful with little benefit, but AI has the potential to benefit humanity at large. Maybe ChatGPT itself isn't a great example of that, but their research has gone on to spur lots of advancements in AI, advancements that have allowed AI to make all sorts of breakthroughs in areas like medicine.

[–] MrMcGasion@lemmy.world 5 points 6 months ago

Yeah, but LLMs like ChatGPT aren't where that advancement is being made. LLMs are driving investment in the technology, but they're a mostly useless investor target that just happens to run on the same hardware that can be used for useful AI-powered research. Sure, it's pushing the hardware advancement forward maybe 10-15 years faster than it might have otherwise happened, but it's coming with a lot of wasteful baggage as well, because LLMs are the golden boy investors want to throw money at.

[–] agent_flounder@lemmy.world 2 points 6 months ago

True, the benefit actually exists here (how much is open for debate).

On the other hand, we should be sounding full alarm bells and running around in a panic, ramping down every use of energy possible, before we leave our 100 surviving progeny a lifeless rock to live on. But humans don't work that way. By the time we are all on board it will be 100 years too late, unfortunately.

[–] drdabbles@lemmy.world 14 points 6 months ago

It's no secret; people just don't care. Manufacturers publish power and cooling data on spec sheets, but because people are easily wowed by pure garbage masquerading as breakthroughs and "the future", they simply ignore the costs and push ahead. Add in the fact that most "AI" startups are actual scams, and you've got a corporate incentive to pretend this isn't doing permanent damage, too.

[–] sativacat@lemmy.world 12 points 6 months ago (1 children)

They're scared of Skynet but not global warming.

[–] cm0002@lemmy.world 10 points 6 months ago (4 children)

Within years, large AI systems are likely to need as much energy as entire nations.

That doesn't sound like they're taking future hardware optimizations into account; we won't be using GPUs for this purpose forever (as much as Nvidia would like that to be true, lol).

[–] FaceDeer@kbin.social 4 points 6 months ago

Not to mention that increasing usage of AI means AI is producing more useful work in the process, too.

The people running these AIs are paying for the electricity they're using. If the AIs weren't doing enough work to make it worth that expense, they wouldn't be running them. If the general goal is "reduce electricity usage", then there's no need to target AI, or any other specific use for that matter. Just make electricity in general cost more, and usage will go down. It's basic market forces.

I suspect that most people raging about AIs wouldn't want their energy bill to shoot up, though. They want everyone else to pay for their preferences.

[–] drdabbles@lemmy.world 3 points 6 months ago

Any power saved by hardware design improvements will be consumed by adding more transistors. You will not be seeing a power consumption decrease. Manufacturers of this hardware have spent the past two years giving talks calling for literal power plants to be built co-resident with datacenters.

[–] itsJoelle@lemmy.world 2 points 6 months ago (7 children)

That was my thought too. I heard a take that we may see a shift away from GPUs to purpose-built processing units as a way to keep making progress now that we're getting pretty small on the silicon scale. Neural-net processors may be one of these special-purpose units we see.

[–] geissi@feddit.de 6 points 6 months ago (1 children)

Tbf, talking about the environmental costs of generative AI is just framing.
The issue is the environmental cost of electricity, no matter what it is used for.
If we want this to be factored into consumption, then it needs to be part of the electricity price. And of course all other power sources, like combustion engines, need to price in their external costs as well.
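A tiny sketch of what pricing that external cost into each kWh could look like; every number here is an assumption, none come from the comment:

```python
# Illustrative only: fold an assumed external (carbon) cost into the electricity price.

BASE_PRICE_USD_PER_KWH = 0.15     # assumed retail electricity price
GRID_KG_CO2_PER_KWH = 0.4         # assumed grid carbon intensity
CARBON_COST_USD_PER_KG = 0.05     # assumed external cost per kg of CO2

external_cost = GRID_KG_CO2_PER_KWH * CARBON_COST_USD_PER_KG
effective_price = BASE_PRICE_USD_PER_KWH + external_cost
print(f"Effective price: ${effective_price:.3f}/kWh (of which ${external_cost:.3f} is external cost)")
```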

[–] HurlingDurling@lemmy.world 2 points 6 months ago (1 children)

It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI: the AI server has higher power requirements (maybe not as wide a margin as my first comparison, but there is one). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
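A quick sketch of that scaling argument with assumed wattages; none of the numbers below come from the comment, they only illustrate the multiplication:

```python
# All wattages are assumptions, purely to illustrate how the per-server gap scales.

WATTS = {
    "raspberry pi": 7,            # assumed typical draw
    "ultra-light laptop": 15,     # assumed typical draw
    "gaming rig": 500,            # assumed draw under load
    "e-commerce server": 400,     # assumed typical web server
    "AI server": 5_000,           # assumed multi-GPU training/inference node
}

for name, watts in WATTS.items():
    print(f"{name}: {watts} W")

fleet_size = 300                  # now multiply the AI server by a few hundred
total_kw = fleet_size * WATTS["AI server"] / 1_000
print(f"{fleet_size} AI servers: {total_kw:.0f} kW of continuous draw")
```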

[–] dhork@lemmy.world 2 points 6 months ago

Just wait until the AI finds out about Bitcoin....
