Back in 1999, everybody caught the ‘Y2K’ bug. According to this ‘prophecy of doom’, the transition into a new millennium would wreak havoc in computer networks globally and ultimately bring our entire civilization to a grinding halt.
Y2K turned out to be a damp squib.
But it’s highly unlikely that ‘Y2Q’, or ‘years to quantum’, will fizzle out the same way. Y2Q is approaching – and fast!
Experts like Michele Mosca, Deputy Director of the University of Waterloo’s Institute for Quantum Computing, estimate the odds that quantum computers will break today’s public-key cryptography at 1 in 7 by 2026, and 1 in 2 by 2031. When Y2Q becomes reality, quantum computers will easily break the outdated cryptographic protocols we currently rely on to protect our systems and data – putting an estimated US$3.5 trillion worth of assets at risk.
Currently, the best-known defense against a future quantum attack is stronger quantum-resistant encryption, also known as post-quantum cryptography (PQC). But the truth is, most PQC methods have been validated mainly under laboratory conditions; in unpredictable real-world environments, implementation flaws and side channels can undermine them. Moreover, researchers at the University of Waterloo’s erstwhile Quantum Hacking Lab have demonstrated that theoretically perfect schemes are not as ‘unhackable’ or ‘quantum-proof’ in practice as their supporters claim.
Here are three drawbacks of PQC-based systems we need to be aware of.
- Increased transition complexity
Moving to a PQC-based system will affect the performance of an organization’s current cryptographic infrastructure: PQC involves more computation and therefore a heavier workload. It may even render some parts of the system obsolete, forcing hardware replacements and adding to transition complexity. Greater system complexity, in turn, drives up costs and lengthens timelines.
As an organization starts thinking about the move from classical to PQC-based encryption, it cannot ignore these disadvantages. Meanwhile, its vulnerability to quantum attacks keeps increasing.
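One widely recommended way to contain this transition complexity is “crypto-agility”: isolating algorithm choices behind a single interface so that a post-quantum scheme can later be swapped in as a configuration change rather than an application rewrite. Below is a minimal Python sketch of that idea; the class names and the toy “algorithm” are hypothetical illustrations, not real cryptography or any specific product’s API.

```python
import secrets
from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass
class KemAlgorithm:
    """A key-encapsulation mechanism registered under a stable name."""
    name: str
    keygen: Callable[[], tuple]              # () -> (public_key, secret_key)
    encapsulate: Callable[[bytes], tuple]    # pk -> (ciphertext, shared_secret)
    decapsulate: Callable[[bytes, bytes], bytes]  # (sk, ct) -> shared_secret


class CryptoProvider:
    """Registry that lets applications ask for 'the current KEM' by policy,
    so migrating from a classical to a post-quantum scheme does not require
    touching application code."""

    def __init__(self) -> None:
        self._kems: Dict[str, KemAlgorithm] = {}
        self.default: Optional[str] = None

    def register(self, algo: KemAlgorithm, *, make_default: bool = False) -> None:
        self._kems[algo.name] = algo
        if make_default or self.default is None:
            self.default = algo.name

    def kem(self, name: Optional[str] = None) -> KemAlgorithm:
        return self._kems[name or self.default]


# Toy stand-in "algorithm" for demonstration only -- NOT real cryptography.
def _toy_keygen():
    sk = secrets.token_bytes(16)
    return sk, sk  # toy scheme: public key equals secret key


def _toy_encap(pk):
    ss = secrets.token_bytes(16)
    return bytes(a ^ b for a, b in zip(ss, pk)), ss


def _toy_decap(sk, ct):
    return bytes(a ^ b for a, b in zip(ct, sk))


provider = CryptoProvider()
provider.register(KemAlgorithm("toy-classical", _toy_keygen, _toy_encap, _toy_decap))
# Later, a PQC implementation can be registered and made the default without
# changing any calling code:
# provider.register(KemAlgorithm("pqc-kem", ...), make_default=True)

algo = provider.kem()
pk, sk = algo.keygen()
ct, ss_sender = algo.encapsulate(pk)
assert algo.decapsulate(sk, ct) == ss_sender
```

The point of the sketch is the registry, not the toy cipher: applications call `provider.kem()` and never name an algorithm directly, so the eventual classical-to-PQC cutover becomes a one-line policy change.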
- Difficult to scale
Many PQC algorithms are notoriously difficult to scale. Lattice-based cryptography, one of the most popular PQC approaches, illustrates the problem: it is very difficult to prove a scheme’s ‘hardness’ (a measure of its resilience to attack) at scale, and the lattice algorithms that do scale well typically achieve only average-case hardness. There is thus a trade-off between hardness and scalability – either can be achieved, but rarely both.
- Larger key sizes and limited speeds
Most PQC algorithms require much larger keys than existing public-key algorithms. Multivariate cryptography, for example, which is also considered a good basis for PQC, involves very large keys that consume more storage on a device. They also mean large amounts of data must be sent over a communications system for key establishment and signatures, so encrypting, decrypting, and verifying signatures all take longer at either end. This limits transfer speeds, a serious handicap if a quantum attack arrives suddenly.
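To put the size gap in perspective, the following sketch compares approximate public-key sizes of two classical schemes with several published post-quantum parameter sets (a lattice KEM, a lattice signature, a multivariate signature, and a code-based KEM). The figures are rounded values from public parameter specifications and should be treated as illustrative, not normative.

```python
# Approximate public-key sizes in bytes: classical vs. post-quantum schemes.
# Figures are rounded from published parameter sets; illustrative only.
KEY_SIZES = {
    "ECDH P-256 (classical)": 64,
    "RSA-2048 (classical)": 256,
    "Kyber-768 (lattice KEM)": 1_184,
    "Dilithium2 (lattice signature)": 1_312,
    "Rainbow-I (multivariate signature)": 161_600,
    "Classic McEliece 348864 (code-based KEM)": 261_120,
}

# Show each key size and its ratio to the smallest classical baseline.
baseline = KEY_SIZES["ECDH P-256 (classical)"]
for name, size in sorted(KEY_SIZES.items(), key=lambda kv: kv[1]):
    print(f"{name:42s} {size:>9,d} B  ({size / baseline:>8.1f}x ECDH)")
```

Even the most compact lattice schemes carry keys over an order of magnitude larger than an elliptic-curve key, and multivariate and code-based schemes are larger still by factors in the thousands, which is exactly the storage-and-bandwidth penalty described above.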
Quantropi is prepared today to secure systems against tomorrow’s quantum attacks. Our end-to-end solution is fast, efficient and, most importantly, quantum secure. It consists of i) a Galois Public Key (GPK) Envelope, ii) a CipherSpace™ quantum gate, and iii) our proprietary Quantum Entropy Expansion and Propagation (QEEP™) technology. This robust, enterprise-ready solution runs on existing infrastructure, keeping transition complexity low, timelines short and costs affordable. It scales easily as your organization grows, without compromising transfer speeds, throughput or business goals.
Get started with an ultra-fast, high-entropy, energy-efficient quantum-safe solution that protects your information and your assets today, tomorrow and always. Contact us to learn more!