this post was submitted on 02 Aug 2023
356 points (96.1% liked)

The fact that this has been replicated is amazing!

[–] Adeptfuckup@lemmy.world 7 points 1 year ago (2 children)

Hitch your tits and pucker up. We’re entering a new age of industry. Much like the original Industrial Revolution, technology is going to advance at an extremely rapid pace. Fusion, quantum computing supremacy. Just… wow. How far off is general AI with this new room temperature superconductor?

[–] drdabbles@lemmy.world 9 points 1 year ago (1 children)

Fusion is no closer than it's ever been, and AGI is hilariously overhyped. Also no closer than it's ever been.

[–] Proweruser@feddit.de 2 points 1 year ago

Fusion is pretty close to begin with. Commonwealth Fusion is well within their proposed timetable so far, and they don't need any new superconductors for their project.

[–] Yondoza@sh.itjust.works 3 points 1 year ago (3 children)

Stupid question probably - is computing power what is holding back general AI? I've not heard that.

[–] drdabbles@lemmy.world 14 points 1 year ago (1 children)

What's holding back AGI is a complete lack of progress toward anything like intelligence. What we have now isn't intelligent, it's multi-variable probability.

[–] JGrffn@lemmy.world 2 points 1 year ago (1 children)

It's not that it's not intelligent, it's that predictive language models are obviously just one piece of the puzzle, and we're going to need all the pieces to get to AGI. It's looking incredibly doable considering we've already figured out how to make something that's dumb but sounds smarter than most of us. We just need to connect it to other models that handle other things better.

[–] dorkian_gray@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (2 children)

Oh god yes. This is going to be pretty simplified, but: the sheer compute required to run something like ChatGPT is mind-boggling. We're talking thousands of A100 GPUs ($10k apiece, each with 80GB of VRAM) networked together, and probably petabytes of SSD storage for the DB. Most neural networks require a bunch of GPUs working in parallel, because they need a lot of very fast memory to hold all the data they sift through, and a lot of parallel compute to sift through that data as quickly as possible. That's why GPUs are good for this - you can think of a CPU like a human: very versatile, but there's only so much one person can do at a time. Meanwhile GPUs are like bug swarms - a bunch of much simpler but specialized brains that "make it up on volume". It's only because of advances in computing power, specifically in the number of compute cores and the amount of VRAM on GPU dies, that the current level of AI became possible. Try downloading GPT4All and compare the free models that run on your machine to the performance of ChatGPT - you'll certainly see the speed difference, and if you ask the free ones for code or logic you'll see the performance difference too.
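As a rough sketch of that scale (assuming a GPT-3-sized model of 175B parameters served in fp16 - an assumption, since OpenAI hasn't published ChatGPT's actual model size):

```python
# Back-of-envelope VRAM estimate: parameters * bytes per parameter.
# Assumes fp16 weights (2 bytes/param) and ignores activations, KV cache, etc.
def weights_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    return n_params * bytes_per_param / 1e9

n_params = 175e9                  # hypothetical GPT-3-sized model
vram = weights_vram_gb(n_params)  # 350 GB just for the weights
a100s = vram / 80                 # A100 has 80 GB of VRAM -> ~5 GPUs minimum
```

That's several $10k GPUs just to hold one copy of the weights in memory, before serving a single user.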

This is all to say that superconducting traces and transistors mean no heat is generated by dumping power through them, so you can fit them closer together - even right next to and on top of each other, doesn't matter, because they don't need to be cooled. And, because you lose no power to heat, it can all go to compute instead, so it'll be perfectly efficient. It'll bring down the cost of everything, but specifically computer components, and thus OpenAI will be able to bring more servers online to improve the capabilities of their models.

[–] MuThyme@lemmy.world 3 points 1 year ago (1 children)

There is still heat generated by the act of computation itself, unless you use something like reversible computing, and I don't believe there's any practical way to do that currently.

And even then, superconducting semiconductors are still going to be some ways off. We could have superconductors in power transmission for the next decade and still see virtually no changes to processors. I don't doubt that we will eventually do something close to what you describe, but I'd say it's easily a long way off still. We'll probably only be seeing cheaper versions of things that already use superconductors, like MRI machines.
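For a sense of how small the unavoidable heat from computation itself is: the Landauer limit puts a floor of k_B·T·ln 2 on the energy dissipated per bit irreversibly erased. A quick sketch of the numbers:

```python
import math

# Landauer limit: minimum energy dissipated per irreversibly erased bit.
k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # room temperature, K
e_bit = k_B * T * math.log(2)  # ~2.87e-21 J per bit at room temperature
```

That floor is many orders of magnitude below what today's transistors actually dissipate per switch, which is why resistance and switching losses, not computation itself, dominate in practice.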

[–] dorkian_gray@lemmy.world 0 points 1 year ago* (last edited 1 year ago)

Edit: my first draft was harsher than it needed to be, sorry, long day.

First of all, nobody's saying this is going to happen overnight. Secondly, traditional computing systems generate heat due to electrical resistance and inefficiencies in semiconducting transistors; the process of computation does not inherently require the generation of heat, nor cause it through any means other than electrical resistance. It's not magic.

Superconduction and semiconduction are mutually exclusive - it's in the name. A semiconductor has resistance properties midway between a conductor and an insulator. A superconductor exhibits no electrical resistance at all. A material can be a superconductor in one "direction" and a semiconductor in another, or a semiconductor can be "warped" into being a superconductor, but you can't have electrons flowing in the same direction with some resistance and no resistance at the same time. There's either resistance, or there's not.

Finally, there is absolutely no reason that a transistor has to be made of a semiconducting material. They can be made of superconducting materials, and if they are then there's no reason they'd generate heat beyond manufacturing defects.

Yes, I'm talking about a perfectly superconducting system and I'm not allowing for inefficiencies where components interface or component imperfections resulting in some small amount of resistance that generates heat; that would be a manufacturing defect and isn't relevant. And of course this is all theoretical right now anyway; we don't even know for sure if this is actually a breakthrough yet (even if it's really beginning to look like it). We need to better understand the material and what applications it's suited to before we can make concrete predictions on what impacts it will have. But everything I suggest is grounded in the way computer hardware actually works.

[–] Yondoza@sh.itjust.works 1 points 1 year ago

Really appreciate the write up! I didn't know the computing power required!

Another stupid question (if you don't mind) - adding superconductors to GPUs doesn't really seem like it would make a huge difference to the heat generation. Sure, some of the heat is generated through trace resistance, but the overwhelming majority is the switching losses of the transistors, which will not be affected by superconductor technology. Are we assuming these superconductors will be able to replace semiconductors too? Where are these CPU/GPU efficiencies coming from?

[–] knotthatone@lemmy.world 3 points 1 year ago

Simply throwing computing power at the existing models won't get us general AI. It will let us develop bigger and more complex models, but there's no guarantee that'll get us closer to the real thing.