There's not a lot of information given here. The blog post is better [1].
Long story short, they decided to augment Wire with experimental post-quantum security using the NewHope key exchange scheme.
You can read more about NewHope here [2]. It's a lattice-based cryptosystem built on the Ring-Learning With Errors (R-LWE) problem. It's also the same post-quantum key exchange scheme Google experimented with in Google Chrome [3]. R-LWE is pretty common in state-of-the-art lattice-based cryptosystems (there are several such schemes, including NewHope, in Round 1 of the NIST PQCrypto CFP [4]).
Among the mathematical "tribes" of post-quantum cryptography, lattice-based (and code-based) problems are particularly good for speed. On the other hand, their key sizes are significantly larger (this phenomenon is somewhat inverted in supersingular isogenies, which offer fantastic key sizes but much slower key exchange). For those interested in learning more about the learning with errors problem (and its ring-augmented cousin), the first few pages of the NewHope specification (and most lattice-based specs from NIST PQCrypto) are a good brief [5]. And while it's not related to NewHope specifically, Peikert's survey on lattice-based cryptography is relatively recent and accessible [6].
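For anyone who wants a concrete feel for the "learning with errors" structure mentioned above, here's a toy sketch in Python. The parameters are purely illustrative (hypothetical, nowhere near secure sizes), and this is the plain LWE form rather than the ring variant NewHope actually uses; the point is just the b = A·s + e shape of the public samples:

```python
import numpy as np

# Toy LWE instance -- illustrative parameters only, NOT secure.
q = 257   # small prime modulus (real schemes use much larger parameters)
n = 8     # secret dimension (NewHope uses n = 1024 in a polynomial ring)
m = 16    # number of samples
rng = np.random.default_rng(42)

s = rng.integers(0, q, size=n)        # secret vector
A = rng.integers(0, q, size=(m, n))   # public uniformly random matrix
e = rng.integers(-2, 3, size=m)       # small "error" terms
b = (A @ s + e) % q                   # public LWE samples

# The LWE problem: given (A, b), recover s. Without e this is just
# linear algebra mod q; the small errors are what make it hard.
print(b)
```

R-LWE replaces the matrix A with multiplication in a polynomial ring, which is what buys the speed and smaller keys relative to plain LWE, at the cost of additional algebraic structure.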
I'm interested in what impact this will have on latency in Wire. In the Google Chrome TLS experiment, the median connection latency increased by 1ms, the slowest 5% by 20ms, and the slowest 1% by 150ms [7]. The increased connection latency was attributed to the larger message size, which I find pretty interesting given that we (generally) consider key size and operation speed separately.
Those are physical qubits, not logical qubits. The blog post from Google [1] is a better (but still marketing-drenched) review. There's no research paper to read and no indication of what the precise error correction is on those 72 physical qubits. The best information we have to go on is that 1) this is a scaled up version of Google's 9-qubit quantum computer, and 2) the 9-qubit quantum computer had an error rate of 0.6%.
But in the absence of any peer review or even published material, this may as well not exist. Assuming it does exist, Google is cautiously optimistic that it will be able to demonstrate quantum supremacy. But that doesn't actually mean it will be able to do anything meaningfully useful for real-world applications; it just means it will breach the threshold where it's demonstrably faster than a classical computer at something (instead of theoretically faster).
Once you leave the cleanroom it will be much less impressive. No one is close to developing a quantum computer capable of useful cryptanalysis. In fact, I'd confidently wager no one is even close to the breakthrough we'd need for quantum computers to break e.g. RSA. We'd need hundreds of thousands of physical qubits just to achieve the error correction requisite to break 2048-bit RSA. I'm weakly pessimistic I'll live to see it.
The noise issue does seem problematic, especially once you're beyond what's doable classically. I suppose you could just check for reproducibility, but there could be systematic problems that wouldn't show up that way.
But for cracking encryption, the end result seems pretty clear. You either get sensible plaintext, or you don't. And you can try multiple times. Systematic effects that consistently yield incorrect plaintext seem unlikely.
It’s not always obvious whether you’ve gotten sensible plaintext, especially if the plaintext is a binary file format you weren’t expecting, or if the plaintext has been encrypted multiple times.
______________
1. https://blog.wire.com/blog/post-quantum-resistance-wire
2. https://newhopecrypto.org
3. https://security.googleblog.com/2016/07/experimenting-with-p...
4. https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Rou...
5. https://newhopecrypto.org/data/NewHope_2017_12_21.pdf
6. https://web.eecs.umich.edu/~cpeikert/pubs/lattice-survey.pdf
7. https://www.imperialviolet.org/2016/11/28/cecpq1.html