this post was submitted on 01 Aug 2023
304 points (97.2% liked)
Technology
Not really. First, standard equipment is limited by cost, not technology. Nothing stops a power user from cooling a desktop with liquid nitrogen; it's just expensive. Superconductor tech, though, would be bleeding edge and wouldn't cost any less for a long time. Supercomputing, on the other hand, already has access to more esoteric cooling systems, including the extreme-cold superconductors that already exist.
The real issue is that the CPU is what makes the heat, and this tech isn't a transistor. We can't replace silicon chips with superconducting ones, at least not in a form dense enough to be a CPU. There are lots of small improvements we can make around the CPU, but those aren't at the "wow, this will revolutionize technology" level. They're cool, but it's the other stuff that's gonna get the focus.