If these guys gave a shit they'd focus on light-based (photonic) chips, which are still in very early stages but could save a lot of power.
I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
Same. I think I've read that a single GPT-4 instance runs on a 128-GPU cluster, and ChatGPT can still take something like 30 s to finish a long response. An H100 GPU has a TDP of 700 W. Hard to believe that uses only 10x more energy than a search that takes milliseconds.
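For what it's worth, here's a rough back-of-envelope in Python using the numbers above (128 GPUs, 700 W, 30 s) plus Google's often-quoted ~0.3 Wh per traditional search. These are assumptions rather than measurements, and it ignores that one cluster serves many requests at once:

```python
# Back-of-envelope energy comparison. Every figure here is an assumption
# taken from the comment above or from widely quoted estimates.
GPUS = 128          # assumed cluster size for one model instance
TDP_W = 700         # H100 TDP in watts
RESPONSE_S = 30     # assumed time to generate one long response
SEARCH_WH = 0.3     # often-quoted energy of a classic web search

llm_wh = GPUS * TDP_W * RESPONSE_S / 3600   # watt-hours for one response
print(f"One LLM response: ~{llm_wh:.0f} Wh")                # ~747 Wh
print(f"Ratio vs. one search: ~{llm_wh / SEARCH_WH:.0f}x")  # ~2500x

# Caveat: a cluster batches many concurrent users, so the real per-request
# energy is far below this naive upper bound.
```

Even if batching spreads that across dozens of concurrent users, the per-response figure still looks like a lot more than 10x a classic search.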
I switched to Kagi like 6 months ago and I still love it. Almost never have to go back to Google except for maps.
I wonder what the total power consumption is of getting to the same information with a regular search: clicking through multiple links, finding the right page, and extracting the relevant parts, including the energy expended by the human performing the task and everything that surrounds the activity.
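A very rough, purely illustrative sketch of that comparison, where the laptop draw, session length, and human metabolic rate are all assumed round numbers rather than measurements:

```python
# Illustrative only: every figure below is an assumption, not a measurement.
LAPTOP_W = 50        # assumed laptop + screen draw while browsing
HUMAN_W = 100        # rough resting metabolic rate of a person
SESSION_MIN = 5      # assumed time spent clicking links and reading

session_h = SESSION_MIN / 60
manual_wh = (LAPTOP_W + HUMAN_W) * session_h   # device + human energy
print(f"Manual search session: ~{manual_wh:.1f} Wh")  # ~12.5 Wh

# Compared with ~0.3 Wh for the search query itself, the surrounding
# activity dwarfs the query, however the answer gets delivered.
```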
There are real concerns surrounding AI; I wonder whether this is truly one of them or just poorly researched ragebait.
All of this just to give rich shareholders even more money.
I would point out that Google has been "carbon neutral" with its data centers for quite some time, unlike others who still rape the environment (ahem, AWS).