Single-core PC breaks post-quantum encryption candidate algorithm in an hour

Researchers from the Computer Security and Industrial Cryptography (COSIC) group at KU Leuven have succeeded in breaking one of the late-stage candidate algorithms for post-quantum encryption. The algorithm, SIKE (short for Supersingular Isogeny Key Encapsulation), had passed most stages of the competition run by the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) to define standardized post-quantum algorithms that can withstand the threat posed by quantum computers – which will render current encryption schemes obsolete.

The researchers approached the problem from a purely mathematical perspective, attacking the heart of the algorithm’s design rather than any potential code vulnerabilities.

For mathematicians: the researchers were able to crush SIKE by attacking its mathematical basis, Supersingular Isogeny Diffie-Hellman (SIDH). They showed that SIDH is vulnerable to the “glue-and-split” theorem developed by mathematician Ernst Kani in 1997, together with additional mathematical tools devised in 2000. Still squarely in the mathematical arena, the attack uses genus-2 curves to attack elliptic curves (curves of genus 1). According to SIKE co-inventor David Jao, a professor at the University of Waterloo, “the newly discovered weakness is clearly a blow to SIKE” – which it is. But he added that it all comes back to cryptographers’ sometimes imperfect mastery of pure mathematics, as the approach taken by the researchers was “really unexpected”.

For the rest of us, all this means that the researchers used mathematics to work out SIKE’s encryption scheme and were able to predict – and then recover – its encryption keys.
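To make “recovering keys” concrete, here is a toy sketch in Python of the general shape of a key-agreement scheme. The parameters and the brute-force attacker are illustrative inventions: real SIDH combines secrets via isogenies between supersingular elliptic curves, not modular exponentiation, and the actual attack uses the genus-2 machinery described above rather than brute force.

```python
import secrets

# Toy finite-field Diffie-Hellman: a minimal sketch of the *shape* of a
# key-agreement scheme. The prime and base are made up and far too small
# to be secure (demo only).
p = 8191   # small Mersenne prime
g = 17     # public base

def keypair():
    sk = secrets.randbelow(p - 2) + 1   # private exponent
    return sk, pow(g, sk, p)            # (secret, public value)

a_sk, a_pk = keypair()                  # Alice
b_sk, b_pk = keypair()                  # Bob

# Each party mixes its own secret with the other's public value,
# and both arrive at the same shared secret.
shared = pow(b_pk, a_sk, p)
assert shared == pow(a_pk, b_sk, p)

# A "key recovery attack" derives a working secret key from public data
# alone. In this tiny group, brute force suffices; the SIDH attack
# achieves the analogous feat against real-sized parameters.
def recover(pk):
    for x in range(1, p):
        if pow(g, x, p) == pk:
            return x

assert pow(b_pk, recover(a_pk), p) == shared
```

Once an attacker can compute the shared secret from public values alone, as `recover` does here, everything encrypted under keys derived from it is exposed.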

For their efforts, and for the paper titled An Efficient Key Recovery Attack on SIDH (Preliminary Version), the researchers received a $50,000 bounty sponsored by Microsoft.

SIKE rose to prominence after becoming one of four additional candidates still under consideration by NIST after the organization officially announced, last month, the four algorithms that will replace the RSA, Diffie-Hellman, and elliptic-curve Diffie-Hellman algorithms currently in use. Much of global cybersecurity relies on these algorithms, which have been invaluable in securing information from bad actors – when not compromised in other ways. But they share a fundamental problem: they will be easily broken once quantum computers have evolved enough. And it’s no longer a question of “if”: it’s a question of “when”.

“When” is expected to arrive within this decade, so there is a real race to harden future computer systems and to update the encryption schemes protecting existing information.

To put the scale of the problem into perspective, consider this: current estimates are that humanity had produced and stored 64 zettabytes (a zettabyte is 1,000 bytes to the seventh power) of data by 2020. Almost none of this information is quantum-resistant. But, for now, we are somewhat insulated from a cascade of quantum hacks by the sheer complexity of the technology – only the best-resourced companies and states have the expertise, the human capital, and the money for these systems.

But the cost will decrease, and solutions close to the “standard” will eventually appear. And when they do, any data secured with classical algorithms will offer about as much protection as an apple’s skin does against a bite to its core.

Yet it seems that all it takes to crack certain cutting-edge designs – namely, SIKE – is a single-core computer and an exotic application of high-level mathematics. This raises the question: are we ready to set standards for a booming field of computing, in which new approaches and breakthroughs are announced almost daily?

As Jonathan Katz, an IEEE Fellow and professor in the Department of Computer Science at the University of Maryland, wrote in an email to Ars Technica, “It is perhaps a bit concerning that this is the second instance in the past six months of a scheme that made it to the 3rd round of the NIST review process before being completely broken using a classical algorithm.” Rainbow, the candidate he refers to, was broken in February of this year, though only at a theoretical level. He continued: “Three of the four PQC schemes are based on relatively new assumptions whose exact difficulty is not well understood, so what the latest attack indicates is that we may still need to be cautious/conservative with the standardization process going forward.”

Whether the time has come to standardize or not, one thing is certain: intense work in post-quantum cryptography is necessary. Just think of the systemic damage that a single successful attack on a mid-sized bank – zeroing out all of its records, for example – would inflict on its very human customers and on the financial system as a whole. We’re talking about everything from single-parent families to 401(k) savings accounts to small and not-so-small businesses.

Few issues are as important as this one, which is why its cryptographically and mathematically sound exploration is paramount.
