Back in 1999, everybody caught the ‘Y2K’ bug. According to this ‘prophecy of doom’, the transition into a new millennium would wreak havoc in computer networks globally and ultimately bring our entire civilization to a grinding halt.
Y2K turned out to be a damp squib.
But it’s highly unlikely that Y2Q, or ‘years to quantum’, will fizzle out the same way. Y2Q is approaching – and fast!
Experts like Michele Mosca, Deputy Director at the University of Waterloo’s Institute for Quantum Computing, believe the odds of reaching Y2Q by 2026 are 1 in 7, rising to 1 in 2 by 2031. When Y2Q becomes reality, quantum computers will easily break the outdated cryptographic protocols we currently rely on to protect our systems and data, putting an estimated US$3.5 trillion worth of assets at risk.
Currently, the best way to ward off a possible future quantum attack is to develop stronger quantum-resistant encryption (also known as post-quantum cryptography, or PQC). But the truth is, most PQC methods work well only in the lab; in unpredictable real-world environments, they cannot stand up to scrutiny. Moreover, researchers at the University of Waterloo’s erstwhile Quantum Hacking Lab have demonstrated that theoretically perfect PQC is not as ‘unhackable’ or ‘quantum-proof’ as its supporters claim.
With US$3.5 trillion worth of assets at stake, here are the 3 drawbacks of PQC-based systems we need to be aware of.
Increased Transition Complexity
Moving to a PQC-based system will affect the performance of an organization’s current cryptographic infrastructure, since PQC involves more computation and therefore a heavier workload. It may even render some parts of the system obsolete, creating a need for replacement hardware and adding to transition complexity. And as system complexity grows, so do costs and timelines.
As an organization starts thinking about the move from classical to PQC-based encryption, it cannot ignore these disadvantages. Meanwhile, its vulnerability to quantum attacks keeps increasing.
Difficult to Scale
Many PQC algorithms are notoriously difficult to scale. Take lattice-based cryptography, one of the most popular approaches to post-quantum cryptography: it is very difficult to prove its ‘hardness’ (a measure of an algorithm’s resilience to attacks) at scale, and the lattice algorithms that do scale well achieve only average-case hardness. There is a trade-off between hardness and scalability: either can be achieved, but not both.
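To make the scale problem concrete, here is a minimal, deliberately insecure Python sketch of the Learning-With-Errors (LWE) idea that underlies most lattice-based schemes. The dimension n and modulus q below are hypothetical toy values chosen only to keep the example readable; real schemes need far larger parameters for their hardness assumptions to hold, which is exactly where key sizes and computational cost balloon.

```python
import numpy as np

n, q = 16, 3329                      # toy lattice dimension and modulus (hypothetical, insecure)
rng = np.random.default_rng(0)

# Key generation: secret s, public pair (A, b = A*s + e mod q)
s = rng.integers(0, q, n)
A = rng.integers(0, q, (n, n))
e = rng.integers(-2, 3, n)           # small noise term, the source of the hardness
b = (A @ s + e) % q

# Encrypt a single bit m using a random 0/1 combination of the public rows
m = 1
r = rng.integers(0, 2, n)
u = (r @ A) % q
v = (r @ b + m * (q // 2)) % q

# Decrypt: v - u*s is close to m*q/2, so round to the nearest of {0, q/2}
d = (v - u @ s) % q
m_recovered = int(q // 4 < d < 3 * q // 4)
print(m, m_recovered)                # prints: 1 1
```

In this naive form the public matrix A alone grows quadratically with n; practical schemes compress it with structured lattices and seeds, but the underlying tension between hardness and size remains.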
Larger Key Sizes & Limited Speeds
Most PQC algorithms require much larger key sizes than existing public-key algorithms. Multivariate cryptography, for example, which is also considered a good basis for PQC, involves very large keys. These need more storage inside a device and generate large amounts of data to be sent over a communications system for key establishment and signatures, so encrypting, decrypting, and verifying signatures take more time at either end. This limits transfer speeds, which can be dangerous in the event of a sudden quantum attack.
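For a rough sense of scale, the snippet below prints approximate, publicly documented key and signature sizes for a few classical and post-quantum schemes. The figures are ballpark values for specific parameter sets and will vary with the exact variant chosen.

```python
# Approximate, publicly documented sizes in bytes; exact values vary by parameter set.
sizes = {
    "RSA-2048 (classical)":            {"public key": 256,    "signature": 256},
    "ECDSA P-256 (classical)":         {"public key": 64,     "signature": 64},
    "ML-KEM-768 / Kyber (lattice)":    {"public key": 1184,   "ciphertext": 1088},
    "ML-DSA-65 / Dilithium (lattice)": {"public key": 1952,   "signature": 3309},
    "Rainbow-I (multivariate)":        {"public key": 161600, "signature": 66},
}

for scheme, parts in sizes.items():
    details = ", ".join(f"{name}: {size:,} bytes" for name, size in parts.items())
    print(f"{scheme:<34} {details}")
```

Even at these rough numbers, the multivariate public key is hundreds of times larger than a classical RSA or ECC key, which is where the storage and bandwidth pressure described above comes from.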
At Quantropi, we believe that every organization needs to harden today’s defences against today’s attacks AND tomorrow’s attacks by quantum computers. We’re the only cybersecurity company in the world providing the 3 prerequisites for cryptographic integrity: Trust, Uncertainty, and Entropy (TrUE). Powered by quantum mechanics expressed as linear algebra, our patented TrUE technologies establish Trust between any two parties via quantum-secure asymmetric MASQ™ encryption (coming soon); ensure Uncertainty for attackers, rendering data uninterpretable forever, with QEEP™ symmetric encryption; and provide Quantum Entropy as a Service (QEaaS) with SEQUR™: ultra-random key generation and distribution to enable secure data communications. All of Quantropi’s TrUE technologies are accessible via our flagship QiSpace™ platform.
Talk to us today!