
A new quantum computer can execute calculations in mere moments that would take the most advanced supercomputers 47 years to process.

ZickZack@kbin.social 4 points 1 year ago

> For example, if you had an 8-bit integer represented by a bunch of qubits in a superposition of states, it would have every possible value from 0-255 and could be computed as though it were every possible value at once until it is observed, the probability wave collapses, and a definite value emerges. Is this not the case?

Not really, or at least it's not a good way of thinking about it. Think of it more like rigging coin tosses: you don't have every single configuration at the same time; rather, you have a joint probability distribution over all the bits, which you manipulate to produce certain useful distributions.
To get something out, you then make a measurement that returns the correct result with a certain probability (i.e. it's a probabilistic Turing machine rather than a nondeterministic one).
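A minimal sketch of that picture, assuming only numpy (the register size and gates are illustrative choices of mine, not anything from this thread): a register of n qubits is one vector of 2^n amplitudes, and each measurement samples a single bit pattern from it.

```python
import numpy as np

# A register of n qubits is ONE vector of 2**n complex amplitudes -- a joint
# distribution over all bit patterns (after squaring magnitudes), not 2**n
# separate values computed side by side.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in |000>

# Hadamard gate; applied to every qubit it produces the uniform superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to qubit `target` by building the full 2**n x 2**n operator."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

for q in range(n):
    state = apply_single_qubit_gate(state, H, q, n)

# Measurement does NOT hand you all 2**n values at once: each shot samples ONE
# bit pattern with probability |amplitude|**2 -- probabilistic, not nondeterministic.
probs = np.abs(state) ** 2
probs /= probs.sum()  # guard against floating-point drift
shots = np.random.choice(2**n, size=10, p=probs)
print([format(s, f"0{n}b") for s in shots])  # e.g. ['011', '110', '000', ...]
```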

Sampling like this can be very useful, since sampling from a distribution can sometimes be much nicer than actually solving the problem (e.g. you replace a solver with a simulator of its output).
You can do the same thing in traditional computing, but there you hit the fundamental problem of sampling from very complex probability distributions, which usually means approximating intractable integrals (e.g. normalising constants).
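To make the classical side of that concrete, here is a minimal sketch (an illustrative example of mine, with a made-up target density and step size) of the standard workaround: Metropolis-Hastings sampling only ever needs ratios of the unnormalised density, so the intractable normalising integral cancels out, at the price of correlated samples and slow mixing.

```python
import numpy as np

def unnormalised_density(x):
    # An awkward bimodal density whose normalising integral has no closed form.
    return np.exp(-((x**2 - 4) ** 2) / 2.0)

def metropolis(n_samples, step=0.5, x0=0.0):
    """Sample from unnormalised_density without ever computing its integral."""
    samples, x = [], x0
    for _ in range(n_samples):
        proposal = x + np.random.normal(scale=step)
        # Accept with probability min(1, p(proposal)/p(x)); the unknown
        # normalising constant cancels in the ratio.
        if np.random.rand() < unnormalised_density(proposal) / unnormalised_density(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

draws = metropolis(10_000)
print(draws.mean(), draws.std())  # crude summaries; burn-in and mixing ignored here
```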

However, there are also massive limitations on the kinds of things a quantum computer can model this way, since quantum theory is inherently linear (so no climate modelling, whose dynamics are heavily nonlinear, regardless of how often people claim they want to do it).
There's also the question of how many things exist where it is more efficient to build such a distribution and sample from it, rather than having a direct solver.
If you look at the classic quantum algorithms (e.g. the list at https://en.wikipedia.org/wiki/Quantum_algorithm, which is not exhaustive but gives a pretty good overview), you can see that there aren't really that many algorithms where it makes sense to use quantum computing: pretty much all of them are asymptotically only slightly faster than, or the same speed as, classical ones, and most of them rely on the problem being a black-box (oracle) one.
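As a rough illustration of "asymptotically barely faster", here is a small back-of-the-envelope comparison (numbers of my own choosing, purely illustrative) for unstructured search: classically you expect on the order of N/2 queries, while Grover's algorithm needs on the order of sqrt(N) oracle calls, a quadratic rather than exponential gain, and one that only applies in the black-box query setting.

```python
import math

# Unstructured search over N items: ~N/2 expected classical queries vs
# ~(pi/4)*sqrt(N) Grover iterations (a quadratic speedup only).
for N in (10**6, 10**9, 10**12):
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"N={N:>14,}  classical ~{classical:>16,.0f}  Grover ~{grover:>12,.0f}")
```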

Remember that one of the largest useful problems ever solved on a quantum computer up to now was factoring the number 21, with a specialised version of Shor's algorithm that only works for that number (since the full Shor's algorithm would need many orders of magnitude more qubits than exist on the entire planet).
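For context, the only genuinely quantum part of Shor's algorithm is period finding; the rest is classical number theory. A hedged sketch for N = 21 (the period is brute-forced here, which is exactly the step a quantum computer is supposed to do for you):

```python
from math import gcd

N, a = 21, 2  # a must be coprime to N; a = 2 works here

# The quantum subroutine's job: find the order r of a mod N, i.e. the smallest
# r > 0 with a**r % N == 1. Here we simply brute-force it classically.
r = 1
while pow(a, r, N) != 1:
    r += 1
print("order r =", r)  # r = 6 for a = 2, N = 21

# Classical post-processing: if r is even and a**(r//2) is not -1 mod N,
# gcd(a**(r//2) +/- 1, N) gives nontrivial factors.
x = pow(a, r // 2, N)
print("factors:", gcd(x - 1, N), gcd(x + 1, N))  # 7 and 3
```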

There's also the problem of logical vs physical qubits: in computer science we like to work with "perfect" qubits that are mathematically ideal, i.e. completely noise-free. Physical qubits, however, are really fragile and couple to pretty much anything and everything around them, which adds a lot of noise to the system. This problem also gets worse the larger you scale your system.

The latter is a fundamental problem: the whole point of quantum computers is that you can combine random states to "virtually" build a complex distribution before you sample from it. This can be much faster because the virtual model can capture dependencies that are intractable to work with on a classical system, but that dependency monster also means that any noise in the system degrades everything else as you scale up to more qubits.
That's why people expect real quantum computers to need many orders of magnitude more physical qubits than the number of logical qubits you would theoretically need.
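To give a feel for that overhead, here is a rough back-of-the-envelope sketch. It assumes a surface-code-style error-correcting code in which one logical qubit costs roughly 2*d^2 physical qubits at code distance d; both that formula and the example numbers are illustrative assumptions of mine, not figures from this thread.

```python
# Very rough logical -> physical overhead, assuming a surface-code-style encoding
# where one logical qubit needs roughly 2 * d**2 physical qubits (d = code
# distance; larger d suppresses errors further). All numbers are illustrative.
def physical_qubits(logical_qubits, code_distance):
    return logical_qubits * 2 * code_distance**2

for logical, d in [(100, 11), (1_000, 17), (4_000, 27)]:
    print(f"{logical:>5} logical qubits at distance {d:>2} "
          f"-> ~{physical_qubits(logical, d):,} physical qubits")
```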

That fragility also means that you cannot trivially scale up a physical quantum algorithm: a physical Grover search over a list with 10 entries might look very different from one over a list with 11 entries.
This makes quantum computing a non-starter for many problems where you cannot afford the time it takes to engineer a custom solution.
Even worse: you cannot just test whether your fancy new algorithm works in a simulator, since the thing you would need to simulate is precisely the intractable quantum noise (something which, ironically, a quantum computer is excellent at simulating).
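The underlying reason simulation doesn't save you is the exponential size of the state: a full state-vector simulation has to track 2^n complex amplitudes for n qubits. A quick illustrative calculation (assuming 16 bytes per amplitude, i.e. complex128):

```python
# Memory for a brute-force state-vector simulation: 2**n amplitudes, 16 bytes
# each (complex128). This is why exact simulation runs out of steam somewhere
# around 45-50 qubits even on the largest classical machines.
UNITS = ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"]

def human(nbytes):
    i = 0
    while nbytes >= 1024 and i < len(UNITS) - 1:
        nbytes /= 1024
        i += 1
    return f"{nbytes:.0f} {UNITS[i]}"

for n in (20, 30, 40, 50, 60):
    print(f"{n:>2} qubits -> {human(2**n * 16)}")
```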

In general you should be really careful when looking at quantum computing articles, since it's very easy to build some weird distribution that is basically impossible for a normal computer to work with, but that doesn't mean it's anything practical. For example, just starting the quantum computer, "booping" one bit, and then waiting 3 ns will give you a quantum noise distribution that is intractable to simulate classically (the same is true if you don't do anything at all: there are literal research teams of top scientists whose job boils down to "what are quantum computers computing if we don't give them instructions?").

Meanwhile, progress in classical computing (and in e.g. hybrid analog computing) is much faster than progress in quantum computing, which means that the only people really deeply invested in quantum computing are the ones who cannot afford to miss out, just in case there is in fact something there:

  • finance
  • defence
  • security
  • ....