The pace quickens for cryptography


Simply swapping traditional encryption for new, fixed systems may not be enough to deal with tomorrow’s security issues.


Mathematician Peter Shor explained in his talk at the 35th Annual Symposium on Foundations of Computer Science in late 1994 how quantum computers could find the prime factors of huge numbers far faster than binary computers and so break the critical assumption behind the mainstays of public-key infrastructure (PKI) and other systems: that factoring large numbers into their prime factors is hard and time-consuming.
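The snippet below is a toy sketch of the assumption at stake, using textbook-sized primes purely for illustration: anyone who can factor an RSA modulus can rebuild the private key, which is exactly what Shor’s algorithm threatens to make feasible for 2,048-bit moduli.

```python
# Toy illustration only: RSA's security rests on the difficulty of recovering
# p and q from their product n. With tiny primes the attack is trivial, which
# is what Shor's algorithm would make true for real key sizes on a large
# enough quantum computer.

def factor_by_trial_division(n: int) -> tuple[int, int]:
    """Brute-force factorisation -- feasible only for toy-sized moduli."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("no factor found")

# Toy key: modulus n = p * q, public exponent e
p, q = 61, 53
n, e = p * q, 17

# An attacker who factors n can rebuild the private exponent d
fp, fq = factor_by_trial_division(n)
phi = (fp - 1) * (fq - 1)
d = pow(e, -1, phi)           # modular inverse of e

message = 42
ciphertext = pow(message, e, n)
print(pow(ciphertext, d, n))  # recovers 42 -- the key has been broken
```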

That threat remains theoretical: major doubts surround whether practical quantum computers can overcome the noise and random errors that creep into calculations and cause the algorithm Shor presented to fail, even if the machines grow large enough to handle real encryption keys.

Breakthroughs are possible, and a major thrust in quantum-computer design lies in finding efficient error-correction algorithms.

Quantum-computer hardware has pushed past 1,000 physical qubits and IBM hopes to reach 100,000 within a decade. But error correction on those systems means using 20 or more physical qubits to implement a single usable qubit. Factoring 2,048-bit RSA keys probably needs around 4,000 of those usable qubits, which points to Q-Day, the point at which classical encryption becomes genuinely vulnerable, arriving around 2030.
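Combining those figures gives a rough sense of scale; the numbers below are simply the estimates quoted above, with 20 physical qubits per usable qubit treated as a lower bound rather than a measured figure.

```python
# Back-of-the-envelope arithmetic using the estimates quoted above
# (assumptions, not measurements).
physical_per_logical = 20        # lower bound; real error-correction schemes may need far more
logical_for_rsa2048 = 4_000      # usable qubits estimated for 2,048-bit RSA

physical_needed = physical_per_logical * logical_for_rsa2048
print(f"{physical_needed:,} physical qubits")   # 80,000 -- inside IBM's 100,000-qubit target
```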

It is possible that Q-Day comes before a single quantum computer has enough qubits to handle a real-world cipher. Partial success might be enough for groups to work on data they have harvested. Research into the adjacent field of quantum annealing has shown that conventional computers can sometimes harness techniques derived from these novel machines to solve optimisation problems. Quantum-computing software advances could speed up factorisation enough to make smaller RSA keys vulnerable to attack by a well-funded adversary. Researchers in China and Portugal have described such a hybrid algorithm that could use a network of conventional and quantum computers, reducing the number of qubits needed by each machine.

Symmetric cryptography may also become vulnerable. Breaking symmetric ciphers, which are often used to encrypt data in flight, will not be as straightforward as attacking PKI, but Grover’s algorithm running on a quantum computer points to either a change in technology or an increase in key size to ward off attackers.
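The usual arithmetic behind that advice, sketched below with illustrative figures, is that Grover’s algorithm turns a brute-force search over 2^k keys into roughly 2^(k/2) quantum steps, which is why doubling the key size, for example moving from AES-128 to AES-256, restores the original margin.

```python
# Rough security arithmetic for Grover's quadratic speedup (an illustration,
# not a statement about any particular machine).
import math

def effective_bits_under_grover(key_bits: int) -> float:
    search_space = 2 ** key_bits
    grover_iterations = math.isqrt(search_space)   # ~square root of the classical search
    return math.log2(grover_iterations)

for k in (128, 256):
    print(f"AES-{k}: ~{effective_bits_under_grover(k):.0f} bits of security against Grover")
# AES-128 drops to ~64 bits; AES-256 retains ~128 bits
```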

Agencies such as NIST in the US have been keen to avoid the IT industry getting caught out by a sudden jump in code-breaking technologies and have published a shortlist of three replacements for RSA and elliptic-curve cryptography.

While researchers are increasingly confident that the new schemes are safe, there remain doubts over how implementation details will affect the protection they deliver. The common thread between the successful candidates is that they use lattice arithmetic, in which points lie on a regular grid of discrete positions in a high-dimensional space and the core task is to solve a set of linear equations expressed as a matrix. Such linear algebra is easy on its own. The security comes from mixing noise into the process to build a method known as learning with errors (LWE).

LWE relies on the difficulty of distinguishing between a noisy linear system and a truly random one. How the error distribution injects randomness into the system determines how secure the cryptographic scheme is. In principle, that distinguishing problem is as difficult for a quantum computer as for a binary machine.
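A toy instance shows the shape of the problem; the parameters below are deliberately tiny and purely illustrative, whereas deployed lattice schemes use structured variants with far larger dimensions.

```python
# Toy LWE instance (illustration only, with made-up tiny parameters): the
# public data is (A, b) where b = A.s + e mod q; the small error e is what
# makes recovering the secret s hard.
import random

q, n, m = 97, 4, 8                      # modulus, secret length, number of samples
random.seed(1)

s = [random.randrange(q) for _ in range(n)]               # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-2, -1, 0, 1, 2]) for _ in range(m)]  # small noise

b = [(sum(a_ij * s_j for a_ij, s_j in zip(row, s)) + e_i) % q
     for row, e_i in zip(A, e)]

# Without the noise, b would be an exactly solvable linear system and s would
# leak immediately; with it, (A, b) is hard to tell apart from random data.
print("public:", A, b)
print("secret stays hidden:", s)
```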

NIST sees a potential need for backup protocols that use techniques that do not rely on lattices to deal with the theoretical threat of a mathematical breakthrough that renders LWE and its relatives as vulnerable as RSA. There are always ways encryption can go wrong.

Above: CNSA 2.0 requirements through to 2033

Crypto-agility

Issues remain that are largely independent of whether quantum computing for cracking codes becomes practical. That realisation leads to larger changes, not just to the software needed to run tomorrow’s cryptosystems but also to the hardware that winds up in embedded processors and microcontrollers.

The experience with Secure Sockets Layer (SSL), used to secure internet traffic, shows how implementation, regardless of the underlying mathematics, can compromise security.

Side-channel analysis has proven remarkably effective at breaking many encryption algorithms simply by tracking subtle changes in power output or the time taken to process key bits depending on the values of those bits. Though masking execution of the processor with additional noise can deliver some protection, the best implementations are careful to ensure each branch of the algorithm does not change its power or timing profile depending on key values.
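The timing side of that problem is easy to sketch; the comparison routines below are generic illustrations rather than code from any particular product.

```python
# Sketch of a timing leak: a comparison that exits on the first mismatching
# byte reveals how many leading bytes of a secret the attacker has guessed
# correctly; a constant-time version does the same work whichever bytes differ.
import hmac

def leaky_compare(secret: bytes, guess: bytes) -> bool:
    if len(secret) != len(guess):
        return False
    for a, b in zip(secret, guess):
        if a != b:          # early exit: running time depends on the secret
            return False
    return True

def constant_time_compare(secret: bytes, guess: bytes) -> bool:
    # Standard-library routine designed to take the same time regardless of
    # where the two values differ
    return hmac.compare_digest(secret, guess)
```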

Many believe that the next generation of cryptosystems needs to be not just quantum-safe but also agile enough to cope with rapid changes in vulnerability. “Crypto-agility” recognises that protocols may need to change after deployment, along with the software and hardware infrastructure that implements encryption, hashing and digital signatures. Though it is a big step, a move toward agility will probably help embedded-systems manufacturers migrate to PQC, particularly in situations where the key-size choices and processing overhead are difficult to support.

For example, a report by ETSI published last year found that the NIST candidates for PQC pose problems for the protocols that self-driving vehicles will use to communicate with each other and with the roadside network. The digital signatures become so big that they dominate the radio capacity, making it impractical to securely sign each transmission.
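Approximate, publicly quoted sizes give a sense of the gap; the message payload below is an assumption used only to show proportions, not a figure from the ETSI report.

```python
# Approximate public figures for signature sizes (rounded); the payload size
# is a hypothetical value chosen only to illustrate the proportions.
sizes = {
    "ECDSA P-256 signature": 64,
    "Falcon-512 signature (approx.)": 666,
    "ML-DSA-44 (Dilithium2) signature": 2_420,
}
payload_bytes = 300   # hypothetical vehicle-to-vehicle safety-message payload

for name, sig in sizes.items():
    print(f"{name}: {sig} bytes, ~{sig / payload_bytes:.1f}x the payload")
```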

One option where PQC is too onerous to implement is to use hybrid or composite systems, in which a conventional system operates together with a low-overhead version of PQC. If one is cracked, the other can still protect the data for long enough that it is no longer useful to an attacker. Developers may reserve full PQC for data that needs to be stored for long periods to counter “harvest now, decrypt later” (HNDL) attacks. In a crypto-agile framework, such hybrid systems offer a stepping stone to improved protection later on.
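A minimal sketch of the hybrid idea follows; the byte strings are placeholders standing in for the outputs of a real key exchange and a real post-quantum KEM, not a specific library’s API.

```python
# Hybrid key derivation sketch (placeholders, not a real protocol): derive the
# session key from *both* a classical exchange and a post-quantum KEM, so an
# attacker has to break both to recover the traffic key.
import hashlib

def combine_shared_secrets(classical_ss: bytes, pqc_ss: bytes, context: bytes) -> bytes:
    """Hash the two shared secrets together so either one alone still protects the key."""
    return hashlib.sha3_256(classical_ss + pqc_ss + context).digest()

# Hypothetical outputs from an ECDH exchange and a PQC KEM encapsulation
classical_ss = bytes(32)          # stand-in for the ECDH shared secret
pqc_ss = bytes(32)                # stand-in for the post-quantum KEM shared secret
session_key = combine_shared_secrets(classical_ss, pqc_ss, b"example-protocol-v1")
```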

Agility will affect how manufacturers implement their crypto processors. Some accelerators look as though they can deliver reasonable PQC performance, if not with the NIST-approved algorithms.

However, because direct support for modular arithmetic is not something processor vendors typically build into their general-purpose instruction sets, the move to crypto-agility suggests a shift towards a combination of programmable hardware and software implementations. An example is the addition of modular arithmetic to the vector instruction sets of some RISC-V processors.
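The kind of primitive involved is simple to show in software; the sketch below uses Barrett reduction, with ML-KEM’s modulus of 3,329 chosen purely as an example, to illustrate the multiply-shift-reduce loops that such vector instructions can accelerate across whole polynomials.

```python
# Barrett reduction sketch: replace an expensive division by q with a multiply,
# a shift and a correction step, which is the pattern lattice schemes repeat
# across every polynomial coefficient.
def barrett_reduce(x: int, q: int, k: int, m: int) -> int:
    t = (x * m) >> k          # cheap estimate of x // q using only multiply and shift
    r = x - t * q
    while r >= q:             # at most a correction or two for a suitable k
        r -= q
    return r

q = 3329                      # ML-KEM's modulus, used here only as an example
k = 26
m = (1 << k) // q             # precomputed constant

# Sanity check over 16-bit inputs
assert all(barrett_reduce(x, q, k, m) == x % q for x in range(1 << 16))
```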

In the meantime, there are major software-process changes developers will need to make. To change encryption in an over-the-air update, you first need to know what you are changing. This leads to the idea of a cryptographic bill of materials (CBOM), in which developers collect information on how their applications and libraries use different protocols. That information should make it easier to change the code in readiness for replacement.
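A CBOM record might look something like the sketch below; the field names are illustrative rather than taken from any published schema.

```python
# Illustrative CBOM-style record (field names are assumptions, not a standard
# schema): the point is to capture where each algorithm is used so it can be
# located and replaced when a protocol has to change.
cbom_entry = {
    "component": "firmware-update-client",
    "version": "2.4.1",
    "crypto_assets": [
        {"purpose": "signature-verification", "algorithm": "ECDSA-P256", "library": "mbedTLS 3.5"},
        {"purpose": "transport-encryption", "algorithm": "AES-128-GCM", "library": "mbedTLS 3.5"},
        {"purpose": "key-exchange", "algorithm": "X25519", "library": "mbedTLS 3.5"},
    ],
    # Fields like these let tooling flag which assets are quantum-vulnerable
    # and whether an over-the-air replacement path exists.
    "quantum_vulnerable": ["ECDSA-P256", "X25519"],
    "replaceable_via_ota": True,
}
```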

A host of issues still need to be addressed to deliver the resulting updates securely and cleanly: changing encryption implies that root keys will also have to be changed, and many embedded processors use e-fuses or memory sectors locked at manufacturing time to help protect them. But the overall picture is one of encryption changing. And changing again.