A discussion with Quantropi Co-Founder and Chief Scientist Dr. Randy Kuang
Interview conducted by Quantropi CTO Michael Redding
The Chinese research paper “Factoring integers with sublinear resources on a superconducting quantum processor,” published in December 2022 by Bao Yan et al., has set off a storm of controversy in the cybersecurity community. It introduced SQIF (sublinear-resource quantum integer factorization) – a novel hybrid algorithm that is claimed to factor large numbers with a fraction of the resource requirements of Shor’s algorithm. Shor’s algorithm has so far been able to factor the integer 21, while SQIF is claimed to have factored the 48-bit integer 261980999226229 with just ten physical qubits.
These findings have caused confusion and even panic, with some going so far as to say that SQIF spells the end of classical public-key cryptography.
In this conversation between quantum physicist and Quantropi Co-Founder Dr. Randy Kuang and CTO Michael Redding, Randy shares his thoughts on the potential ramifications of SQIF on classical cryptography, Y2Q, and the future of post-quantum cryptography.
Michael: Starting with perhaps the most worrying question for many, has SQIF broken classical public-key cryptography?
Dr. Kuang: No, the proposed SQIF algorithm has not broken classical public key cryptography.
The paper’s authors have merely demonstrated the ability of the algorithm to factor a 48-bit integer 261980999226229 with just ten qubits. For comparison, Shor’s algorithm has been able to factorize the integer 21 so far – significantly smaller than what SQIF is claimed to have achieved. This implies that SQIF might allow us to defeat classical PKC with much fewer resources than Shor’s algorithm would require. However, we still have a long way to go until we can break 2048-bit RSA. Current quantum computers don’t have enough qubits with good enough fidelity to challenge it.
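For perspective, a 48-bit integer is still well within classical reach – the claimed significance of SQIF lies in how its resource needs scale, not in the number itself. A quick Python sketch (plain trial division, illustrative only) confirms the factorization of the integer from the paper:

```python
from math import isqrt

def smallest_factor(n):
    """Trial division: return the smallest prime factor of n (n itself if n is prime)."""
    if n % 2 == 0:
        return 2
    for d in range(3, isqrt(n) + 1, 2):
        if n % d == 0:
            return d
    return n

N = 261980999226229  # the 48-bit integer from the SQIF paper
p = smallest_factor(N)
print(p, N // p)  # 15538213 16860433
```

Classical trial division handles 48 bits in seconds on a laptop; it is the exponential growth in difficulty toward 2048-bit moduli that puts RSA out of classical reach.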
Right now, the importance of SQIF might not be in its ability to challenge RSA-2048 but rather in its potential to deepen research into novel quantum and hybrid algorithms. Just a few weeks after the SQIF paper was published, a research team at Kipu Quantum in Germany announced an algorithm that is claimed to outperform SQIF when factoring a 48-bit integer with 10 qubits. The team claims to have internally achieved even better results with their new algorithm, confirming the sublinear scaling of the required physical-qubit resources.
Regardless of the effect of SQIF on classical cryptography, we can expect Y2Q to arrive all the same. SQIF might bring Y2Q closer, but we can’t really tell by how much. None of which changes the fact that the right time to start preparing for the quantum threat is still yesterday.
Michael: What makes this novel algorithm stand out from Shor’s algorithm? Why should we worry about it more?
Dr. Kuang: The major difference between Shor’s algorithm and SQIF is that Shor’s algorithm assumes perfect, fault-tolerant quantum computers, whereas SQIF works with imperfect physical qubits – as long as their fidelity is high enough for the QAOA (quantum approximate optimization algorithm) quantum optimizer to find enough smooth-relation pairs.
Why can SQIF use good-quality physical qubits rather than fault-tolerant ones? The reason is that SQIF uses the quantum variational principle. Shor’s algorithm, by contrast, is a generic pure quantum algorithm that requires more qubits and quantum circuit depth.
Now, if SQIF can factor integers with fewer quantum resources than Shor’s algorithm, does that mean it’s faster? No. I think there is some confusion over terminology when people compare SQIF with Shor’s algorithm. We think in terms of speed, but the two algorithms are too different to draw a direct comparison between them. If we already had fault-tolerant quantum computers, Shor’s algorithm might have actually turned out to be faster – that is, it would factor the same integer quicker than SQIF.
The major difference in performance between Shor’s algorithm and SQIF is that SQIF needs fewer and lower-quality quantum resources. In terms of speed, SQIF shows huge speedups compared with pure classical algorithms thanks to QAOA, but we can’t say whether it’s faster than Shor’s algorithm.
The speed difference between SQIF and Shor’s algorithm might not even matter now – I think we should worry not about speed but about capability. If we could use SQIF to break RSA-2048 in an achievable time frame, it wouldn’t even matter whether Shor’s algorithm were significantly faster than SQIF in absolute terms. SQIF might be a big threat because it might be able to factor much larger integers than Shor’s algorithm with fewer computational resources.
Michael: So, I think this is a major difference: Shor’s algorithm works for perfect quantum computers, while this algorithm works on lower-class machines.
Dr. Kuang: The paper indicates that RSA-2048 might be broken sooner with SQIF than with Shor’s algorithm because SQIF can make do with less-than-perfect physical qubits rather than fault-tolerant logical ones. This is serious and drives some of the panic – after all, today’s PKI is fully based on RSA.
If there’s one thing for sure, it’s that Y2Q will be coming sooner than we thought. We need to act sooner rather than later.
Michael: The paper refers to two resources necessary for the novel quantum algorithm: the number of qubits and the quantum circuit depth. How does the algorithm leverage these two components? When will we achieve those “372 physical qubits and a depth of thousands” to challenge RSA-2048?
Dr. Kuang: SQIF uses physical qubits, not fault-tolerant logical qubits. At first glance, you might say that we already have enough physical qubits – IBM’s Osprey with 433 physical qubits came out in 2022.
However, this isn’t enough to crack RSA-2048. We don’t just need physical qubits; we need physical qubits with good gate fidelity. IBM hasn’t revealed the fidelity of Osprey yet, but its gate fidelity is estimated to be about 99%-99.9%. To challenge RSA-2048 with the proposed algorithm, we need fidelity roughly three orders of magnitude better – to be precise, a fidelity of 99.9999%, or four nines after the decimal point. This is equivalent to one error per 1,000,000+ gate operations.
IBM plans to reach fidelity of better than 99.9% in 2024. That’s still one error per thousand gate operations – a thousand times more errors than SQIF can afford. Osprey has a lot of physical qubits, but they don’t have sufficient quality for SQIF.
The next question is the circuit depth. We need a depth of thousands, but what do we have today? What’s the maximum depth Osprey can support? We don’t know for sure, but it’s probably 10-100. So we need to go at least a factor of 10 deeper to achieve the circuit depth required to challenge RSA. Osprey’s not going to do it, which is why I think we shouldn’t panic just yet.
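A rough way to see why fidelity and circuit depth are linked: if each gate succeeds independently with probability equal to its fidelity, a depth-d circuit completes without error with probability of roughly fidelity^d. The sketch below (a deliberately simplified noise model, not from the paper) compares today’s roughly 99.9% with the 99.9999% requirement over a depth-1000 circuit:

```python
# Simplified model: each of the d gates succeeds independently with
# probability f (the gate fidelity), so the whole circuit runs
# error-free with probability about f**d.

def error_free_prob(fidelity, depth):
    return fidelity ** depth

for f in (0.999, 0.999999):          # ~today's best vs. the SQIF requirement
    p = error_free_prob(f, 1000)     # a depth-in-the-thousands circuit
    print(f"fidelity {f}: P(no error over 1000 gates) ~ {p:.3f}")
```

At 99.9% fidelity, a depth-1000 circuit fails more often than not (success probability around 0.37); at 99.9999%, it almost always completes. This is why the fidelity and depth targets have to be met together.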
Michael: So those are the three things to look for – qubits, quantum circuit depth, and fidelity.
Dr. Kuang: There’s one more thing, however, which I confirmed with the paper’s authors. We can make a trade-off between the number of qubits and gate fidelity. More precisely, we can offset the fidelity requirement by using more physical qubits. If we go to 3,000 or 30,000 physical qubits, we might be able to reduce the fidelity requirement and use shallower circuits to attack RSA.
And to again compare Shor’s algorithm and SQIF, Shor requires 4,000 logical qubits or 4,000,000 noisy physical qubits, as well as a circuit depth of 8 billion to break RSA-2048. Osprey doesn’t even have one logical qubit, so we’re far from being able to use Shor’s algorithm to crack RSA. For SQIF, the authors estimate that we’ll need around 372 physical qubits and a depth of thousands to do the same. Even though we don’t yet have physical qubits with fidelities over 99.9999%, we’ll most likely be able to meet SQIF’s requirements for tackling RSA-2048 much sooner than Shor’s.
Michael: What’s the benefit of the classical plus quantum framework presented in the paper? Why not do everything in the quantum realm?
Dr. Kuang: We’ve long known that we cannot crack RSA with pure classical methods – there might not be enough time left in the universe to brute-force RSA on a classical machine. Pure quantum methods also have a long way to go: fault-tolerant quantum computers capable of running them may not arrive until after 2030.
By combining classical and quantum, SQIF seems to attempt to address the resource limitations we have both in classical and quantum computers. The SQIF algorithm uses classical Schnorr’s (not Shor’s) algorithm in two steps to turn the integer factorization problem into a lattice-based CVP (closest vector problem) task. In the first step of the algorithm, the paper’s authors use Babai’s algorithm to obtain optimal vectors for the lattice. These optimal vectors may lead to the so-called smooth-relation pairs that might give us the integer factors we’re looking for.
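To illustrate the CVP step, here is a toy two-dimensional example of Babai’s rounding technique (a simpler cousin of the nearest-plane variant the paper builds on; the basis and target below are invented purely for illustration): solve B·c = t over the reals, round the coefficients to integers, and map back into the lattice.

```python
# Toy CVP instance via Babai's rounding technique in two dimensions.
# The basis vectors b1, b2 and the target t are made-up illustrative values.

def babai_rounding(b1, b2, t):
    # Basis matrix B has columns b1 and b2; solve B*c = t by Cramer's rule.
    det = b1[0] * b2[1] - b2[0] * b1[1]
    c1 = (t[0] * b2[1] - t[1] * b2[0]) / det
    c2 = (b1[0] * t[1] - b1[1] * t[0]) / det
    c1, c2 = round(c1), round(c2)              # snap coefficients to integers
    # Map the rounded coefficients back into the lattice.
    return (c1 * b1[0] + c2 * b2[0], c1 * b1[1] + c2 * b2[1])

v = babai_rounding((2, 0), (1, 2), (7.2, 3.1))
print(v)  # (8, 4) -- a lattice vector close to the target (7.2, 3.1)
```

The rounded lattice vector is close to, but not necessarily the closest to, the target – which is precisely the gap the quantum optimizer is meant to help close when hunting for short vectors.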
In the second part of the algorithm, we need to find enough of these sr-pairs and then solve the linear equation system they form. The longer the RSA cryptographic key, the more smooth-relation pairs we’ll need to solve the system. For RSA-48, we need to obtain over 201 sr-pairs, while for RSA-2048, the required number of sr-pairs increases to over 276,769.
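The role of the sr-pairs mirrors the classical congruence-of-squares idea behind most modern factoring methods: once enough smooth relations can be combined so that every prime exponent is even, the factors follow from a gcd. A minimal Python illustration on a small composite (the classic textbook example N = 84923, not a number from the SQIF paper):

```python
from math import gcd, isqrt

N = 84923  # small composite used to illustrate the congruence-of-squares idea

# Two "smooth relations": x^2 mod N factors entirely over small primes.
# (These play the same role as the sr-pairs in Schnorr's method.)
x1, x2 = 513, 537
y1 = x1 * x1 % N   # 8400  = 2^4 * 3 * 5^2 * 7
y2 = x2 * x2 % N   # 33600 = 2^6 * 3 * 5^2 * 7

# Combining the two relations makes every prime exponent even, so y1*y2
# is a perfect square and we obtain x^2 = y^2 (mod N).
x = x1 * x2 % N
y = isqrt(y1 * y2)
assert y * y == y1 * y2

p = gcd(x - y, N)
print(p, N // p)   # 163 521
```

Here just two relations already combine into a perfect square; for RSA-2048 the paper estimates over 276,769 such pairs are needed, which is why finding them efficiently is the bottleneck the quantum optimizer targets.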
In the classical realm, it takes a long time for Schnorr’s algorithm to find enough smooth-relation pairs for the RSA integers. Additionally, Schnorr’s algorithm is known to struggle with scalability. This is where the quantum optimizer, known as QAOA, comes into play. QAOA takes the second part of Schnorr’s algorithm from a mathematical into a physical realm, where the energy spectrum of the qubits represents the candidate short vectors for sr-pairs. Thanks to this, Schnorr’s algorithm isn’t only faster in the quantum scheme but also possibly scalable. However, because the number of energy levels available for the sr-pairs is limited in noisy physical qubits, we need quantum computers with a certain degree of fidelity and added circuit depth to achieve good results with the algorithm.
Michael: What’s the potential impact of SQIF on lattice-based PQC algorithms? Can SQIF potentially break PQC even before we can use it?
Dr. Kuang: At the very end, the paper presents how QAOA can be used as a subroutine for lattice reduction problems. That said, it hasn’t explicitly claimed that QAOA could be used to somehow threaten and defeat lattice-based PQC algorithms.
At this point, we can only speculate, and the scientific community will need to take a better look at QAOA to determine whether it can solve lattice reduction problems and, by extension, also threaten PQC.
I think this is a much bigger issue than the threat of SQIF to RSA. We’ve known for a long time (at least since Shor’s algorithm was introduced in the 90s) that RSA-2048 and the entire classical public-key infrastructure will fall victim to the quantum threat – the question is when.
However, when it comes to PQC, the ramifications of SQIF could be much more serious and far-reaching. Today, it seems that lattice-based PQC is heavily favored in the research community and is also the basis of some NIST standardization candidates. And because we’re looking to switch to post-quantum cryptography to protect ourselves from quantum attacks after Y2Q, the fact that we might have a way to break PQC even before it’s rolled out is extremely alarming. Again, this is only speculation, but SQIF could threaten widely used classical algorithms and future quantum-secure systems that we’re looking to switch to.
This brings us to a bigger thesis: Shor’s algorithm and the likes of SQIF aren’t the only ways in the universe to break RSA-2048. As we better understand quantum physics and as companies like IBM deepen their research into quantum computers, we’ll keep coming up with more and more ways to solve mathematical problems. Most importantly, researchers will probably keep coming up with novel algorithms that can factor integers with fewer computational resources and in a shorter amount of time. We’re already seeing researchers attempt to provide alternatives and enhancements to SQIF, and we’ll most likely see even more work exploring innovative quantum algorithms.
Researchers may start exploring other avenues for attacking cryptography as well. Optimization could be promising in this area, as demonstrated by SQIF’s subroutine QAOA. Machine learning is one field where we may see serious developments – right now, the likes of Meta AI are experimenting with machine learning to discover vulnerabilities in PQC. Meta AI argues that lattice-based cryptography has weaknesses against machine learning models. Because complex neural networks can, in theory, approximate any function, they could potentially be successfully used for attacking PQC cryptosystems.
Ultimately, crypto agility will be the backbone of cybersecurity in the future. We can’t (and shouldn’t) overcommit to particular cybersecurity technologies, no matter how enticing or technologically advanced they seem, because researchers will always find new ways to push boundaries. This is the philosophy that Quantropi’s QiSpace™ platform is built on, and why it will include NIST finalists and proprietary quantum-secure systems that look at the quantum threat from a different angle.
Michael: Will Y2Q hit sooner than predicted? If so, by how much?
Dr. Kuang: I think Y2Q is definitely going to hit sooner, but we cannot tell by how much; it depends on the development of quantum computing, especially on improvements in fidelity.
There is still a long way to go until we get fidelities of over 99.9999% and quantum circuits with a depth of thousands. However, if we asked ourselves which would come sooner – 4,000,000 physical qubits with a depth of billions, or 400 high-quality physical qubits with a depth of thousands – we would, of course, choose the latter. The technical requirements of SQIF are far more achievable than those of Shor’s algorithm.
There’s also the matter of other research teams coming up with improved ways of factorizing integers. SQIF is probably just one of the many possible ways to factorize large integers. We will certainly see more research into SQIF and possibly even faster and more efficient algorithms, as demonstrated by the efforts of Kipu Quantum. The snowball effect from the SQIF paper will bring Y2Q closer and closer, which is yet another reason to start preparing for the quantum threat right now.
Michael: What are the main technical challenges of running SQIF? Is it even feasible in a technical sense?
Dr. Kuang: Based on the paper, there seem to be no major technical challenges to implementing SQIF in the real world. The authors have provided step-by-step explanations for the algorithm and three examples for researchers to explore. Similar to the work of the researchers from Kipu Quantum, other teams can try to repeat and enhance the results presented in the paper.
I think the main technical challenge for SQIF is the integers it can factor with current quantum computers. The paper claims that SQIF has factored the 48-bit integer 261980999226229 with ten qubits – it will be interesting to see what quantum machines like the 433-qubit Osprey and future larger computers will be able to do. Another interesting question is when quantum computers with enough circuit depth and fidelity to challenge RSA-2048 will appear.
Michael: Do you trust the results of the study, and do you believe it was well set up?
Dr. Kuang: The paper presents interesting ideas for addressing the integer factorization problem, possibly opening a door for a completely new class of hybrid classical plus quantum algorithms. The authors provide a detailed walkthrough for SQIF’s technical implementation, so other research teams can try to verify the validity of the paper’s claims. In this sense, the paper might encourage deeper research into algorithms that can break RSA-2048 sooner and even challenge lattice-based post-quantum cryptography. Researchers are already attempting to scrutinize SQIF and propose better solutions. I am also glad that the paper made more people aware of the quantum threat and made them consider the importance of Y2Q.
No matter what, for a scientific community spread throughout the world, collaboration is the key to enabling ground-breaking research that’ll bring us to better-understood, safer cybersecurity solutions.
About Dr. Randy Kuang
Randy holds a doctorate in quantum physics. His research findings have been published in top international journals and named “Kuang’s semi-classical formalism” by NASA in 2012. On the basis of a successful career in information technologies, including with Nortel as senior network researcher & developer, he co-founded inBay Technologies in 2009, serving as CTO of the cybersecurity platform. As the first recipient of a patent for two-level authentication (2011), Randy is a prolific inventor, with 30+ U.S. patents in broad technology fields, such as WiMAX, optical networks, multi-factor identity authentication, transaction authorization, as well as concepts, technologies, and industrial applications for quantum key distribution.
As a co-founder of Quantropi, Randy proposed universal quantum-safe cryptography using the Quantum Permutation Pad (QPP), applicable to both classical and quantum computing systems. A typical classical implementation of this achieved Digital Quantum Key Distribution over the Internet and was recently benchmarked by Deutsche Telekom. By randomizing and derandomizing the reference phase space of coherent communications, Randy invented coherent-based CTF-QKD and later extended it into quantum-secure QXD for infrastructure. He recently invented new quantum-safe public-key cryptographic algorithms for key exchange and digital signatures.