
Is there any reason to believe a computer that can do Shor's algorithm is actually possible? There's a lot of freaking out trying to build QC-resistant asymmetric encryption, but we don't actually know whether or not it's physically possible to build the requisite error correction to make it feasible in the first place, right? Is there reason to believe that the physical reality of QM doesn't require exponentially more and more physical qubits for every logical qubit added to the system? Basically, is there any reason to believe that a QC capable of Shor's doesn't degrade to a space/time tradeoff for a classical computer when it comes to integer factorization?
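For reference, my understanding is that the only quantum part of Shor's algorithm is the period-finding subroutine; everything else is classical. A toy sketch of that reduction, with brute-force period finding standing in for the quantum step (so it scales exponentially, which is exactly the speed-up in question):

    # Toy illustration: factoring N by order finding.
    # Shor's algorithm replaces the brute-force loop in find_order with a
    # polynomial-time quantum subroutine; the rest is classical post-processing.
    import math
    import random

    def find_order(a, N):
        # Classical stand-in for the quantum period-finding step:
        # smallest r > 0 with a^r = 1 (mod N). Exponential in log N.
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def factor_via_order_finding(N):
        # Works for composite N with at least two distinct odd prime factors.
        while True:
            a = random.randrange(2, N)
            g = math.gcd(a, N)
            if g > 1:
                return g  # lucky guess: a already shares a factor with N
            r = find_order(a, N)
            if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                return math.gcd(pow(a, r // 2, N) - 1, N)

    print(factor_via_order_finding(15))  # prints 3 or 5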



The strong expert consensus is that fault-tolerant quantum computation is physically possible and that it's just a challenging engineering problem. One key insight is the threshold theorems, which show that if you can build individual physical gates with error rates below a finite threshold (typically 0.1-1%), you can stack these gates together into logical operations whose error rates can be made exponentially small in the size of the error-correcting code, at only polynomial overhead in physical qubits.

https://en.wikipedia.org/wiki/Threshold_theorem
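To make the scaling concrete, here's a back-of-the-envelope sketch using the common surface-code heuristic p_logical ≈ A·(p/p_th)^((d+1)/2). The constants below are illustrative assumptions, not measured numbers:

    # Illustrative only: rough surface-code scaling heuristic with assumed constants.
    A = 0.1        # prefactor (assumption)
    p_th = 1e-2    # threshold error rate (~1%, a typical quoted figure)
    p = 1e-3       # assumed physical gate error rate, below threshold

    def logical_error_rate(d):
        return A * (p / p_th) ** ((d + 1) / 2)

    for d in (3, 11, 25):                  # code distance
        n_phys = 2 * d * d                 # rough physical-qubit count per logical qubit
        print(f"d={d:2d}  ~{n_phys:4d} physical qubits  p_L ~ {logical_error_rate(d):.0e}")

    # The logical error rate falls exponentially in d while the qubit overhead
    # grows only as d^2: polynomial cost, not an exponential blow-up per
    # logical qubit.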

If it helps, there are plenty of historical analogies. Quantum-enhanced sensing took more than half a century to go from conception to practical use, but it is now used daily at LIGO.


The strong expert consensus has been reached among a lot of people who would be out of a job if their consensus were the opposite.


Not really accurate. There are tons of tenured profs who are well-positioned to reveal reasons why QCs are fundamentally infeasible (and those kinds of stories play well in the media). You can read about Gil Kalai's arguments here:

https://www.quantamagazine.org/the-argument-against-quantum-...

In any case, I'm happy to bet on this.


Yes, there are people who don't agree with your consensus; that is part of my point: the consensus is not all that universal. And the other part of my point is that if you only count people who have a full-time job developing quantum computers as experts, then you get a massive bias in opinion, as the sceptics are more likely to pick different jobs.

As for my own belief, I don't know how it will pan out, but I'm inherently sceptical of people who are too certain that it is one way or another; the hard evidence simply does not exist yet.

I'm personally most inclined to believe that the universe is computationally classical, and while quantum physics necessitates that there must be quite a lot more information and processing happening than a Newtonian model requires, it is still finite, thus putting a finite limit on what speed-up we can gain from a quantum computer. Most likely this pans out as the error-correction limit being impossible to reach, but there is also the possibility that more complicated quantum states result in higher per-gate error rates.

I'm open to other conjectures, but if you want to dispel mine I would like to see some hard evidence.


Boson sampling _works_. It's now on the error-correction skeptics to provide a framework in which that remains true but stops being true when a particular combination of gates is applied, because the “naive” math says it should work fine. It would require a new fundamental physical principle for a quantum system to be sensitive to the nature of the computation involved in generating the distribution being sampled from.
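For concreteness, the “naive” math in question: for an ideal boson sampler, the probability of a collision-free output pattern S is proportional to |Perm(U_S)|^2, where U_S is a submatrix of the interferometer's unitary. A toy sketch (brute-force permanent, only sensible for a handful of photons):

    # Ideal boson-sampling output distribution for n photons injected into the
    # first n modes of an m-mode interferometer (lossless, collision-free outcomes).
    from itertools import combinations, permutations
    import numpy as np

    def permanent(M):
        # Brute force over permutations, O(n!) -- toy sizes only.
        n = M.shape[0]
        return sum(np.prod([M[i, s[i]] for i in range(n)]) for s in permutations(range(n)))

    def boson_sampling_probs(U, n):
        m = U.shape[0]
        probs = {}
        for outs in combinations(range(m), n):   # one photon per listed output mode
            sub = U[list(outs), :n]
            probs[outs] = abs(permanent(sub)) ** 2
        total = sum(probs.values())
        return {k: v / total for k, v in probs.items()}  # renormalized over this subset

    # Example: 2 photons, 4 modes, random unitary from a QR decomposition.
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
    print(boson_sampling_probs(U, 2))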


At toy scale, yes, just like quantum computers. And since we haven't seen any news of working boson sampling with thousands of photons, I assume that it stops working when you try to scale up, just like quantum computers.

New fundamental physical principles are at this stage required in any case: quantum mechanics and general relativity don't fit together, meaning that at least one of them needs to be overhauled into something that explains the same observations but doesn't have quite the same maths. The lacklustre performance of quantum computers could be a hint as to what needs overhauling.


What would be a statement that you would put, say, $100k on, that would be determined by 2040?

And what odds would you need on that?


The existence of a handful of logical qubits. Like I can initialize them, swap them, entangle them, measure them, etc., all performed with essentially perfect fidelity (say, 1e-10 error rate). I’d immediately bet $100k on that by 2040 at 50-50 odds, and probably 80-20 if I had a day or two to review the literature and confirm I wasn’t being adversely selected against.

If you wanted a useful machine, say factoring RSA-2048, you would need to push the date to more like 2050 or 2060.

Most of the uncertainty comes from the economy, engineering cost, public interest in QC, AI doom, etc. If we could somehow make a platonic bet on “is a large QC constructible if we dedicated $10T/yr to it for 200 years”, then I am over 95-5.


With that kind of statement, it seems fundamentally immature to be deploying PQC on the Internet today when we're 20+ years out from a useful machine. Prognostication this far into the future should not be the basis of making changes today - we're not good at it.


Disagree. I still put a few percentage points of probability on it happening much faster, maybe by 2030 or 2035. The whole internet and a good chunk of the global economy are predicated on secure encryption. A small chance of that being disrupted is worth substantial investment. Deploying PQC protocols for testing is cheap.

Not to mention the fact that we still don’t know if the current candidate PQC protocols are actually secure. Security is mostly a game of back-and-forth over years, so it could take a while.
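That back-and-forth is also why the real deployments so far are hybrid: the session key is derived from both a classical exchange and a PQC KEM, so it stays secure as long as either one holds up. A rough sketch of that combiner idea (the post-quantum shared secret below is just a random stand-in, not a real KEM call):

    # Sketch of a hybrid key exchange: combine a classical X25519 secret with a
    # post-quantum KEM secret so the session key survives if either scheme does.
    # The PQ secret here is a placeholder -- in practice it would come from an
    # actual ML-KEM/Kyber implementation.
    import os
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes

    # Classical part: ordinary X25519 Diffie-Hellman.
    client_priv = X25519PrivateKey.generate()
    server_priv = X25519PrivateKey.generate()
    classical_secret = client_priv.exchange(server_priv.public_key())

    # Post-quantum part: placeholder for a KEM shared secret (assumption).
    pq_secret = os.urandom(32)

    # Combine both secrets through one KDF; an attacker has to break both the
    # classical and the post-quantum component to recover the session key.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-kex-demo",
    ).derive(classical_secret + pq_secret)
    print(session_key.hex())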


The coordinated failures implied by a failure of QEC to scale would be new physics; the existing principles don't predict that anything like that should occur.

Which isn't to say that it's impossible, but it would be quite exciting if it were the case.


May not require new physics but may be arbitrarily expensive / impractical to build in reality, no?


Not really, no. If the threshold for fault tolerance is reached, then in the absence of new physics causing the noise that needs to be corrected to be adversarially coordinated, quantum computers are scalable.





