Yes, there are people who don't agree with your consensus; that is part of my point: the consensus is not all that universal. The other part of my point is that if you only count people with a full-time job developing quantum computers as experts, you get a massive selection bias in opinion, as the sceptics are more likely to pick different jobs.
As for my own belief, I don't know how it will pan out, but I'm inherently sceptical of people who are too certain that it is one way or the other; the hard evidence simply does not exist yet.
I'm personally most inclined to believe that the universe is computationally classical, and while quantum physics necessitates quite a lot more information and processing than a Newtonian model requires, it is still finite, which puts a finite limit on the speed-up we can gain from a quantum computer. Most likely this pans out as the error-correction threshold being impossible to reach, but there is also the possibility that more complicated quantum states result in higher per-gate error rates.
I'm open to other conjectures, but if you want to dispel mine I would like to see some hard evidence.
Boson sampling _works_. It's now on the error-correction skeptics to provide a framework in which that remains true yet stops being true when a particular combination of gates is applied, because the "naive" math says it should work fine. That would require a new fundamental physical principle: a quantum system being sensitive to the nature of the computation involved in generating the distribution being sampled from.
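For concreteness, the "naive" math in question: the probability of each boson-sampling output pattern is proportional to the squared magnitude of the permanent of a submatrix of the interferometer's unitary, and permanents are #P-hard to compute in general. A minimal sketch of that quantity, using the textbook definition (the `permanent` helper name is mine, not from any library):

```python
from itertools import permutations
from math import prod

def permanent(M):
    """Permanent by the defining sum over permutations:
    perm(M) = sum over sigma of prod_i M[i][sigma(i)].
    Like the determinant, but with no sign factor. This naive
    version is O(n * n!); even the best known exact algorithms
    are exponential, which is why sampling from distributions
    built on permanents is believed classically hard."""
    n = len(M)
    return sum(
        prod(M[i][sigma[i]] for i in range(n))
        for sigma in permutations(range(n))
    )

# 2x2 check: perm([[a, b], [c, d]]) = a*d + b*c
print(permanent([[1, 2], [3, 4]]))   # 1*4 + 2*3 = 10
# all-ones 3x3: every permutation contributes 1, so perm = 3! = 6
print(permanent([[1, 1, 1]] * 3))
```

The point is only that nothing in this formula refers to gates or circuits; it is plain linear algebra over the device's unitary, which is why it's hard to see where a computation-dependent failure mode would enter.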
At toy scale, yes, just like quantum computers. And since we haven't seen any news of working boson sampling with thousands of photons, I assume that it stops working when you try to scale up, just like quantum computers.
New fundamental physical principles are at this stage required in any case: quantum mechanics and general relativity don't fit together, meaning that at least one of them needs to be overhauled into something that explains the same observations but doesn't have quite the same maths. The lacklustre performance of quantum computers could be a hint as to what needs overhauling.