This post was submitted on 04 Dec 2023
41 points (95.6% liked)

Technology


A newly discovered trade-off in how time-keeping devices operate at a fundamental level could set a hard limit on the performance of large-scale quantum computers, according to researchers at the Vienna University of Technology.

top 2 comments
[–] aodhsishaj@lemmy.world 13 points 11 months ago

TL;DR: Planck time is hard.
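
For scale, here's a quick back-of-the-envelope in Python (CODATA constants; the ~10 ns gate time is an illustrative ballpark for superconducting qubits, not a figure from the article):

```python
import math

# CODATA fundamental constants (SI units)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0       # speed of light, m/s

# Planck time: roughly the scale below which "duration" is expected
# to stop being well-defined
t_planck = math.sqrt(HBAR * G / C**5)

# Illustrative superconducting-qubit gate time (~10 ns); an assumed
# ballpark, not a number from the article
t_gate = 1e-8

print(f"Planck time: {t_planck:.3e} s")                     # ~5.391e-44 s
print(f"gate time / Planck time: {t_gate / t_planck:.2e}")  # ~1.9e+35
```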

[–] adespoton@lemmy.ca 2 points 11 months ago

This is a really interesting point; I tried flipping it on its head, and the reasoning became even clearer:

My thought was: “surely we can take advantage of relativistic effects to keep time at a slower pace locally, but have it take a short enough time in the reference frame.” But in this case there is a very obvious floor we're working with: absolute zero. Making one thing go relatively faster means making everything else go comparatively slower, and 0 is as slow as it gets. If subatomic particles have no movement, there is literally nothing to measure.

As a result, there is a hard bound on timekeeping precision no matter how you try to finesse things, with the energy required for each marginal improvement ramping up exponentially as that floor is approached.
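
To put rough numbers on that ramp-up, here's a minimal sketch using the ideal (Carnot) refrigerator bound, W/Q = (T_hot - T_cold) / T_cold, which diverges as T_cold approaches 0. This is a textbook thermodynamics illustration of the commenter's point, not a calculation from the paper; note that in this idealized bound the cost grows like 1/T_cold rather than exponentially:

```python
# Minimum work (Carnot bound) to pump 1 J of heat out of a cold stage
# at T_cold into a 300 K environment: W/Q = (T_hot - T_cold) / T_cold.
# Standard thermodynamics illustration, not a result from the paper.
T_HOT = 300.0  # K, room-temperature environment

def min_work_per_joule(t_cold: float) -> float:
    """Ideal-refrigerator work per joule of heat extracted."""
    return (T_HOT - t_cold) / t_cold

# LN2, liquid-helium, dilution-fridge, and ~microkelvin stages
for t_cold in (77.0, 4.0, 0.01, 1e-6):
    print(f"T_cold = {t_cold:g} K -> at least "
          f"{min_work_per_joule(t_cold):.3g} J of work per joule extracted")
```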

To get around this, we'd have to come up with a different way to do error correction and result readout, and I'm not sure there is one.