this post was submitted on 06 Oct 2024

TechTakes

[–] V0ldek@awful.systems 1 points 2 months ago (1 children)

Where's that person who was arguing with me last time that AI doesn't actually use that much energy and that the corps missing their climate goals was not AI-related?

[–] skillissuer@discuss.tchncs.de 0 points 2 months ago (1 children)

how much does it use anyway? the 5GWe figure was from delusional openai talk for investors, so probably lower

[–] V0ldek@awful.systems 1 points 2 months ago (2 children)

That's the fucking problem: it's impossible to tell, since MSFT won't say directly, and only the people who run the datacenters could.

The only relatively reliable numbers I was able to find were in this research paper by Luccioni and Strubell from the ACM Conference on Fairness, Accountability, and Transparency (FAccT) 2024. Now, that's an obscure conference (not even ranked by CORE), but Dr. Luccioni appears to be right on the money about the dangers of AI (https://www.sashaluccioni.com/).

[–] skillissuer@discuss.tchncs.de 1 points 2 months ago (1 children)

i started looking up satellite photos and openinframap to figure out the maximum capacity of their substations, but the powerlines feeding them are probably massively oversized, and the substations are probably oversized too for redundancy and high availability. so there might be some way to guess at it, but some of the lines will be underground, and if they're doing load-following to match their renewables (which might be cheaper for them) then it's all oversized a bit more on top of that

[–] V0ldek@awful.systems 1 points 2 months ago (1 children)

Well the main problem is that a datacenter is running much more than just AI. You'd need to somehow subtract "normal" cloud usage from just the promptfondling.

[–] skillissuer@discuss.tchncs.de 3 points 2 months ago* (last edited 2 months ago)

ez. remember that announcement when ms said their energy use went up 36%? that's ai, and it includes both training and use

this can still be fudged with more efficient office heating, shutdowns of the least efficient dcs and so on, but only to a limited degree

[–] skillissuer@discuss.tchncs.de 1 points 2 months ago (1 children)

they will tell you the total tho: https://www.latitudemedia.com/news/microsoft-reveals-the-energy-impact-of-artificial-intelligence

this works out to 2.7GW in 2023, on average. that's comparable to peak daily consumption in croatia (today). if that 30%-ish growth figure is accurate, the ai-only part is something closer to 700MW (a 36% increase means the new load is 0.36/1.36 ≈ 26% of the 2023 total), which is a smaller country like macedonia
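for anyone who wants to check that arithmetic, here's a minimal back-of-envelope sketch in python. the 24 TWh annual figure is my assumption, back-derived from the 2.7GW average above, not a number pulled from the article:

```python
# back-of-envelope for the numbers above; values marked "assumed" are my
# inputs, not figures from the article

HOURS_PER_YEAR = 8760

annual_twh = 24.0  # assumed MSFT 2023 electricity use; consistent with a 2.7 GW average
avg_power_gw = annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then / hours -> GW
print(f"average draw: {avg_power_gw:.2f} GW")  # ~2.74 GW

# ms announced a 36% year-over-year increase, attributed to ai;
# the increase as a fraction of the *new* total is 0.36 / 1.36
growth = 0.36
ai_share = growth / (1 + growth)  # ~0.265
ai_power_mw = avg_power_gw * ai_share * 1000
print(f"ai-attributed draw: {ai_power_mw:.0f} MW")  # ~725 MW, i.e. "closer to 700MW"
```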

which only highlights how bizarre their 5GW proposition is. hey, let's outbuild ms 2x, like, now

[–] skillissuer@discuss.tchncs.de 1 points 2 months ago* (last edited 2 months ago)

that sounds like much less than crypto at its peak; even the 2023 crypto estimate (14.5GW avg) is over an order of magnitude above the ai-only figure. there's also google and fb and whoever else (aws?)