Archive for the ‘Quantum Computer’ Category

Quantum Key Distribution: Is it as secure as claimed and what can it offer the enterprise? – The Register

Feature Do the laws of physics trump mathematical complexity, or is Quantum Key Distribution (QKD) nothing more than 21st-century enterprise encryption snake oil? The number of QKD news headlines that have included unhackable, uncrackable or unbreakable could certainly lead you towards the former conclusion.

However, we at The Reg are unrelenting sceptics for our sins and take all such claims with a bulk-buy bag of Saxa. What this correspondent is not, however, is a physicist nor a mathematician, let alone a quantum cryptography expert. Thankfully, I know several people who are, so I asked them the difficult questions. Here's how those conversations went.

I can tell you what QKD isn't, and that's quantum cryptography. Instead, as the name suggests, it's just the part that deals with the exchange of encryption keys.

As defined by the creators of the first QKD protocol (Bennett and Brassard, 1984), it is a method for solving the problem of distributing secret keys between a distant Alice and Bob so that cryptography can work at all. QKD solves this problem using quantum communication: "It relies on the fact that any attempt of an adversary to wiretap the communication would, by the laws of quantum mechanics, inevitably introduce disturbances which can be detected."

Quantum security expert and mathematician Dr Mark Carney explains there "are a few fundamental requirements for QKD to work between Alice (A) and Bob (B), these being a quantum key exchange protocol to guarantee the key exchange has a level of security, a quantum and classical channel between A and B, and the relevant hardware and control software for A and B to enact the protocol we started with."

If you are the diagrammatical type, there's a nifty if nerdy explanatory one here.

It's kind of a given that, in and of themselves, quantum key exchange protocols are very secure. As Dr Carney says, most are derived from either BB84 (the aforementioned QKD protocol of Bennett and Brassard, 1984) or E91 (Ekert, 1991), and sometimes a mixture of the two.

"They've had a lot of scrutiny, but they are generally considered to be solid protocols," Dr Carney says, "and when you see people claiming that 'quantum key exchange is totally secure and unhackable' there are a few things that are meant: that the key length is good (at least 256 bits), that the protocol can detect someone eavesdropping on the quantum channel, that the entropy of the system gives unpredictable keys, and that the use of quantum states to encode them means they are tamper-evident."
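The eavesdropper-detection property these protocols share can be sketched in a toy simulation. This is an illustrative model only (real QKD runs on photon hardware, not random-number calls): Alice encodes random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases matched. An intercept-and-resend eavesdropper inevitably disturbs roughly a quarter of those sifted bits, which is exactly the tamper evidence BB84 relies on.

```python
import random

def bb84_error_rate(n, eavesdrop, seed=1):
    rng = random.Random(seed)
    a_bits = [rng.randint(0, 1) for _ in range(n)]
    a_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
    photons = list(zip(a_bits, a_bases))

    if eavesdrop:
        resent = []
        for bit, basis in photons:
            e_basis = rng.randint(0, 1)
            # Wrong basis: Eve's outcome, and the photon she resends, is random
            e_bit = bit if e_basis == basis else rng.randint(0, 1)
            resent.append((e_bit, e_basis))
        photons = resent

    b_bases = [rng.randint(0, 1) for _ in range(n)]
    b_bits = [bit if basis == bb else rng.randint(0, 1)
              for (bit, basis), bb in zip(photons, b_bases)]

    # Sifting: keep only positions where Alice's and Bob's bases agree
    sifted = [i for i in range(n) if a_bases[i] == b_bases[i]]
    errors = sum(a_bits[i] != b_bits[i] for i in sifted)
    return errors / len(sifted)

print(bb84_error_rate(2000, eavesdrop=False))  # 0.0: quiet channel
print(bb84_error_rate(2000, eavesdrop=True))   # ~0.25: Eve is visible
```

A quarter, because Eve guesses the wrong basis half the time, and each wrong guess randomises Bob's result with probability one half.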

So, if the protocol is accepted as secure, where do the snake oil claims enter the equation? According to Dr Carney, it's in the implementation where things start to get very sticky.

"We all know that hardware, firmware, and software have bugs. Even the most well-researched, well-assessed, widely hacked pieces of tech, such as the smartphone, regularly need bug updates, security fixes, and emergency patches. Bug-free code is hard, and it shouldn't be assumed that the control systems for QKD are any different," Carney insists.

In other words, it's all well and good having a perfected quantum protocol, but if someone can do memory analysis on A or B's systems, then your "super secure" key can get pwned. "It's monumentally naive in my view that the companies producing QKD tech don't take this head on," Dr Carney concludes. "Hiding behind 'magic quantum woo-woo security' is only going to go so far before people start realising."

Professor Rob Young, director of the Quantum Technology Centre at Lancaster University, agrees that there is a gap between an ideal QKD implementation and a real system, as putting the theory into practice isn't easy without making compromises.


"When you generate the states to send from the transmitter," he explains, "errors are made, and detecting them at the receiver efficiently is challenging. Security proofs typically rely on a long list of often unmet assumptions in the real world."

Then there are the hardware limitations, with most commercially implemented QKD systems using a discrete-state protocol sending single photons down low-loss fibres. "Photons can travel a surprising distance before being absorbed, but it means that the data exchange rate falls off exponentially with distance," Young says.
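Young's exponential fall-off is easy to put numbers on. A rough illustration, assuming a typical telecom-fibre attenuation of about 0.2 dB/km (a common figure for low-loss fibre, not one quoted in the article): the probability a single photon survives the trip, and hence the raw key rate, drops by an order of magnitude roughly every 50 km.

```python
# Photon survival in fibre with assumed ~0.2 dB/km attenuation:
# survival = 10^(-alpha * L / 10), so the rate falls exponentially with distance.
alpha_db_per_km = 0.2
for km in (50, 100, 200, 400):
    survival = 10 ** (-alpha_db_per_km * km / 10)
    print(f"{km:4d} km: survival probability {survival:.0e}")
```

At 400 km only one photon in a hundred million arrives, which is why long links need trusted nodes or, eventually, quantum repeaters.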

"Nodes in networks need to be trusted currently, as we can't practically relay or switch quantum channels without trusting the nodes. Solutions to these problems are in development, but they could be years away from commercial implementation."

This lack of quantum repeaters is a red flag, according to Duncan Jones, head of Quantum Cybersecurity at Cambridge Quantum, who warns that "trusted repeaters" are not the same thing. "In most cases this simply means a trusted box which reads the key material from one fibre cable and re-transmits it down another. This is not a quantum-safe approach and negates the security benefits of QKD."

Then there's the motorway junction conundrum. Over to Andersen Cheng, CEO at Post-Quantum, to explain. Cheng points to problems such as QKD only telling you that a person-in-the-middle attack has happened, with photons disturbed because of the interception, but not where that attack is taking place or how many attacks are happening.

"If someone is going to put a tap along your 150km high-grade clear fibre-optic cable, how are you going to locate and weed out those taps quickly?" Cheng asks.

What if an attacker locates your cable grid and cuts a cable off? Where is the contingency for redundancy to ensure no disruption? This is where the motorway junction conundrum comes in.

"QKD is like two junctions of a motorway," Cheng explains. "You know car accidents are happening because the road surface is being attacked, but you do not know how many accidents have happened or where or who the culprit is, so you cannot go and kick the offenders out and patch up the road surface."

This all comes to the fore when Cheng insists: "QKD connections can be blocked using a DDoS attack as simple as using a pneumatic drill in the vicinity of the cable."

Sally Epstein, head of Strategic Technology at Cambridge Consultants, throws a couple of pertinent questions into the "ask any QKD vendor" ring.


"1. Supply chain: There is a much greater potential for well-funded bad actors to get into the supply chain. How do they manage their supply chain security?

"2. Human fallibility: There are almost certainly exploitable weaknesses in the control software, optical sub-assemblies, electronics, firmware, etc. What penetration testing has the supplier conducted in terms of software and hardware?"

Professor Young thinks that QKD currently offers little return on investment for your average enterprise. "QKD can distribute keys with provable security metrics, but current systems are expensive, slow and difficult to implement," he says.

As has already been pointed out, security proofs are generally based on ideal cases without taking the actual physical implementation into account. This, Young says, "troubles the central premise of using QKD in the first place."

However, he doesn't think that the limitations are fundamental and sees an exciting future for the technology.

Because QKD technology is still maturing, and keys can only be sent across relatively short distances using dedicated fibre-optic cables, Jones argues that "only the biggest enterprises and telcos should be spending any money on researching this technology today."

Not least, he says, because the problems QKD solves are equally well addressed through different means. "Quantum-safe cryptography, coupled with verifiable quantum key generation, is an excellent approach to the same problem and works perfectly today," Jones concludes.

Professor Andrew Lord, head of Optical Network Research at BT, has a less pessimistic outlook.

"Our trial with NCC in Bristol illustrates a client with a need to transmit data which should remain secure for many years into the future," Lord told The Reg. "QKD is attractive here because it provides security against the 'tap now, decrypt later' risk, where data could be stored and decrypted when a quantum computer becomes available."

The UK's National Cyber Security Centre (NCSC) has gone on the record to state it does not endorse the use of QKD for any government or military application, and the National Security Agency (NSA) in the US has reached the same conclusion.

Jones of Cambridge Quantum says he completely agrees with the NCSC/NSA perspectives because the "first generation of quantum security technologies has failed to deliver tangible benefits for commercial or government applications."

Young goes further: "Both NCSC and NSA echo the views of all serious cryptographers with regards to QKD, and I am in complete agreement with them."

So what needs to change to make QKD solutions relevant to enterprises in the real world? Lord admits that the specialised hardware requirements of QKD do mean it won't be the best solution for all use cases, but foresees "photonic-chip based QKD ultimately bringing the price down to a point where it can be integrated into standard optical transmission equipment."

Dr Carney adds: "In closing, all this leaves us with the biggest misunderstanding about QKD vs classical key exchange; in classical key exchange the mathematics that makes Elliptic Curve Diffie-Hellman Ephemeral (ECDHE) or your favourite Post-Quantum Cryptography (PQC) key exchange secure is distinct and independent of the physical channel (the classical channel) that is being used for the protocol.

"On a QKD system, the mathematics is in some way intrinsically, and necessarily, linked to the actual physicality of the system. This situation is unavoidable, and we would do well to design for and around it."


Conclusions from Forum TERATEC 2021: European Cooperation, Novel Uses of HPC – HPCwire

July 1, 2021. As the world enters the quantum era and politicians define the future face of a digital Europe, High Performance Computing (HPC) is shaping the necessary and expected post-Covid rebound. Held from June 22 to 24, 2021, the 16th Forum Teratec highlighted the major challenges facing the entire HPC sector and the European community: autonomous production of supercomputers, democratization of HPC uses, and pooling of knowledge and skills at the European level.

Democratization of HPC use across businesses

As participating companies showed during this Teratec Forum, supercomputers are becoming increasingly popular in several sectors, even outside industry, such as medical optics for smart glasses and archaeology for large-scale, ultra-precise surveys. However, these are still unsettled uses.

Industries in all sectors see supercomputers as a possible solution to the complex problems encountered by their customers along with new products yet to be created to address them.

As the number and diversity of HPC users grows, and with it the demand for data storage, manufacturers and technology providers will face new challenges.

The increase in demand will be matched by an increase in computing power and, consequently, in energy consumption and computing costs. As the President of Teratec, Daniel Verwaerde, points out: in the next few decades, the world of supercomputers must be able to offer solutions approaching carbon neutrality.

From a purely technical standpoint, if manufacturers want to offer seamless interfaces without loss of performance, they will need to ensure that data is managed consistently between conventional processors, accelerators, and coprocessors.

Agnès Pannier-Runacher, Minister Delegate to the Minister of Economy, Finance and Recovery, in charge of Industry, recalled that if quantum technology is to deliver its promised breakthrough of shortening computing times by a factor of a billion within five to ten years, investment will of course have to be made in hardware, as planned by the French Quantum Plan (1.8 billion euros over five years) alongside European projects, which will also enable accelerators and quantum computers to be put into operation within computing centers.

As Daniel Verwaerde noted: without working on these three areas simultaneously, the investments made, however large, will be wasted.

Europe-wide cooperation

If France aims to be among the leaders in these technologies, and particularly in quantum computing, cooperation on a European scale is required. Even though France has pursued a proactive policy in this field since the 1960s, the financial stakes for the next generations of supercomputers are such that the nation cannot act alone. Since French and European policies are aligned, the expertise acquired over decades can only give France a leading role.

As Daniel Verwaerde reaffirmed: supercomputers are strategic tools for our European development and our collective security; failing to be autonomous in this field would be a serious handicap, both economic (supercomputers are high value-added products) and societal (their production generates jobs, from the most qualified to many other technical and manufacturing skills).

Anders Dam Jensen, Executive Director of EuroHPC, recalled the missions of the joint European venture: to provide Europe with eight supercomputers ranked, if possible, in the top five in the world, enabling it to compete on an equal footing with its competitors, and to develop a complete supply chain so that Europe can be autonomous in producing such supercomputers. One upcoming milestone is the next call for tenders, starting in 2023, for the production of exascale computers based on European technology.

This collaboration will be fully expressed in the planned interconnection of all the major European computing centers after 2025. National competence centers will be designated for each member country so that all industrial companies, including SMEs and government agencies, have access to high-performance computing. In France, Teratec has been designated by the government and EuroHPC as the national competence center, in cooperation with Cerfacs (European Center for Research and Advanced Training in Scientific Computing) and GENCI (Grand Équipement National de Calcul Intensif).

"We are at a turning point [...] and this is where European involvement is particularly important," said Hervé Mouren, Director of Teratec. Daniel Verwaerde added: "Investing in the European project increases the chances that the policy decided by Europe will be the one that France needs."

A summary outlining the richness of the presentations given during the 16th Forum Teratec will be released soon.

Workshops were also held on targeted topics: hybrid quantum computing, communicable diseases, cyber threats, satellite data for the environment and climate, autonomous systems and HPC storage.

Finally, roundtables reviewed the technological challenges of high-performance simulation and the diversity of HPC uses.

The next Forum Teratec 2022 will be held June 14 & 15, 2022.

Find more information on Forum 2021 here:

https://teratec.eu/forum/index.html

Source: TERATEC


IBM researchers demonstrate the advantage that quantum computers have over classical computers – ZDNet

Big Blue's quantum team set out to discover if today's quantum devices could be used to complete a task that cannot be done on a classical system.

IBM researchers have finally proven in a real-world experiment that quantum computers are superior to classical devices, although for now only at a miniature scale.

Big Blue's quantum team set out to discover if today's quantum devices, despite their limitations, could be used to complete a task that cannot be done on a classical system.

Since quantum computing is still in its infancy, the researchers leveled the playing field between the two methods by designing a microscopic experiment with limited space, that is, a limited amount of available memory.

Two limited-space circuits were built, one quantum and one classical, with only one bit or qubit available for computation and result storage. The task programmed into the circuits consisted of finding the majority out of three input bits, returning zero if more than half of the bits are zero, and one if more than half of the bits are one.

The restrictions, said the scientists, enabled a fair comparison between the power of classical and quantum space when carrying out a calculation.

"Through our research, we're exploring a very simple question," said IBM's quantum team in a blog post. "How does the computational power differ when a computer has access to classical scratch space versus quantum scratch space?"

Equipped with a single bit for computation and storage, the classical system is not capable of running the algorithm, theorized the scientists. Even when giving the system's computational capabilities a boost by adding what is known as random Boolean gates, the classical computer only succeeded 87.5% of the time.
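That 87.5 per cent ceiling can be checked by brute force, under a simplifying assumption (mine, not the paper's exact circuit model): the classical computer's only storage is a single bit, it reads the three inputs in sequence, and each step may update the stored bit by any Boolean function of (current bit, input bit). Enumerating every such one-bit program shows none computes the majority on all eight inputs, and the best manage exactly 7 of 8.

```python
from itertools import product

# Majority of three bits: 1 iff at least two inputs are 1
MAJ = {x: int(sum(x) >= 2) for x in product((0, 1), repeat=3)}

# All Boolean update rules {0,1}^2 -> {0,1}, encoded as 4-tuples indexed by 2*s+b
tables = list(product((0, 1), repeat=4))

def step(t, s, b):
    return t[2 * s + b]

best = 0
for d1, d2, d3 in product(tables, repeat=3):  # 4096 one-bit programs
    correct = 0
    for x in MAJ:
        s = 0                                 # the single bit of storage
        for t, b in zip((d1, d2, d3), x):
            s = step(t, s, b)                 # update may differ at each step
        correct += (s == MAJ[x])
    best = max(best, correct)

print(best / 8)  # 0.875: the classical ceiling the researchers report
```

The reason is that after two bits a classical machine needs three distinguishable situations (majority already 0, already 1, or decided by the third bit), and one bit can only hold two.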

Quantum devices, on the other hand, fared better: a perfect, noiseless quantum computer could succeed 100% of the time, said the scientists in their theoretical demonstration.

This is because, unlike classical bits that can either represent a 1 or a 0, qubits can take on a combination of various states at once, meaning that they have access to a larger space of values. In other words, quantum space is more valuable than classical space.

The theory, however, is still some distance away from reality. Current quantum computers are still too noisy to achieve the perfect results demonstrated by the scientists in their paper. But when carrying out the experiment in real life, with circuits calibrated to run the program more efficiently, IBM's team still observed a success rate of 93%, which beats the classical system.

"We show that qubits, even today's noisy qubits, offer more value than bits as a medium of storage during computations," said the scientists.

This means that even today's noisy quantum computers can offer better performance on the problem than the theoretical maximum performance of a classical device, suggesting that as the technology evolves, the performance gap with classical devices will only widen.

Big Blue's quantum team claims that this is a world-first demonstration of quantum advantage, because the theory is backed by a real-life experiment.

To date, research projects are concerned with proving a theoretical quantum advantage that can only be demonstrated when the hardware is mature enough to run large-scale programs, according to the scientists.

From improving car manufacturing supply chains to optimizing the routes of merchant ships around the world's oceans: there is no shortage of ideas when it comes to researching how quantum computing could create business value. But for now, scientists are mostly finding that quantum technologies are comparable to classical systems for small-scale problems, and only theorizing that quantum devices will eventually deliver an advantage as the computers develop.

"Here, for the first time that we are aware of, we report a simultaneous proof and experimental verification of a new kind of quantum advantage," said IBM's researchers.

As quantum hardware improves, these experimental verifications are expected to expand from tests carried out at the level of single bits. IBM recently unveiled a quantum roadmap for the next few years, which includes a 1,121-qubit system to be built by 2023, on track to creating systems supporting more than one million qubits in the longer term.


Missing Piece Discovered in the Puzzle of Optical Quantum Computing – SciTechDaily

By Washington University In St. LouisJune 30, 2021

Jung-Tsung Shen, associate professor in the Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity, two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology. Credit: Jung-Tsung Shen

An efficient two-bit quantum logic gate has been out of reach, until now.

Research from the McKelvey School of Engineering at Washington University in St. Louis has found a missing piece in the puzzle of optical quantum computing.

Jung-Tsung Shen, associate professor in the Preston M. Green Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology.

"In the ideal case, the fidelity can be as high as 97%," Shen said.

His research was published in May 2021 in the journal Physical Review A.

The potential of quantum computers is bound to the unusual properties of superposition (the ability of a quantum system to contain many distinct properties, or states, at the same time) and entanglement (two particles acting as if they are correlated in a non-classical manner, despite being physically removed from each other).

Where voltage determines the value of a bit (a 1 or a 0) in a classical computer, researchers often use individual electrons as qubits, the quantum equivalent. Electrons have several traits that suit them well to the task: they are easily manipulated by an electric or magnetic field and they interact with each other. Interaction is a benefit when you need two bits to be entangled, letting the wilderness of quantum mechanics manifest.

But their propensity to interact is also a problem. Everything from stray magnetic fields to power lines can influence electrons, making them hard to truly control.

For the past two decades, however, some scientists have been trying to use photons as qubits instead of electrons. "If computers are going to have a true impact, we need to look into creating the platform using light," Shen said.

Photons have no charge, which leads to the opposite problems: they do not interact with the environment like electrons do, but they also do not interact with each other, and engineering effective ad hoc inter-photon interactions has proved challenging. Or so traditional thinking went.

Less than a decade ago, scientists working on this problem discovered that, even if they weren't entangled as they entered a logic gate, the act of measuring the two photons when they exited led them to behave as if they had been. The unique features of measurement are another wild manifestation of quantum mechanics.

"Quantum mechanics is not difficult, but it's full of surprises," Shen said.

The measurement discovery was groundbreaking, but not quite game-changing. That's because for every 1,000,000 photons, only one pair became entangled. Researchers have since been more successful, but, Shen said, "It's still not good enough for a computer, which has to carry out millions to billions of operations per second."

Shen was able to build a two-bit quantum logic gate with such efficiency because of the discovery of a new class of quantum photonic states: photonic dimers, photons entangled in both space and frequency. His prediction of their existence was experimentally validated in 2013, and he has since been finding applications for this new form of light.

When a single photon enters a logic gate, nothing notable happens: it goes in and comes out. But when there are two photons, "That's when we predicted the two can make a new state, photonic dimers. It turns out this new state is crucial."

High-fidelity, two-bit logic gate, designed by Jung-Tsung Shen. Credit: Jung-Tsung Shen

Mathematically, there are many ways to design a logic gate for two-bit operations. These different designs are called equivalent. The specific logic gate that Shen and his research group designed is the controlled-phase gate (or controlled-Z gate). The principal function of the controlled-phase gate is that the two photons that come out are in the negative state of the two photons that went in.

"In classical circuits, there is no minus sign," Shen said. "But in quantum computing, it turns out the minus sign exists and is crucial."
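The minus sign Shen describes is visible directly in the standard matrix form of the controlled-Z gate: it leaves the basis states |00>, |01> and |10> alone and flips the sign of |11> only, and that one sign is enough to turn an unentangled input into an entangled output.

```python
import numpy as np

# Controlled-Z in the computational basis |00>, |01>, |10>, |11>
CZ = np.diag([1, 1, 1, -1])

ket11 = np.array([0, 0, 0, 1])
print(CZ @ ket11)             # [ 0  0  0 -1]: only |11> picks up the sign

# Acting on the product state |+>|+> (equal amplitudes on all four
# basis states), the flipped sign produces a state that no longer
# factors into two independent photons, i.e. an entangled pair.
plus_plus = np.full(4, 0.5)
print(CZ @ plus_plus)         # [ 0.5  0.5  0.5 -0.5]
```

This is why the controlled-phase gate, despite doing "almost nothing", is universal for two-qubit logic when combined with single-qubit rotations.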


When two independent photons (representing two optical qubits) enter the logic gate, "The design of the logic gate is such that the two photons can form a photonic dimer," Shen said. "It turns out the new quantum photonic state is crucial as it enables the output state to have the correct sign that is essential to the optical logic operations."

Shen has been working with the University of Michigan to test his design, which is a solid-state logic gate, one that can operate under moderate conditions. So far, he says, results seem positive.

Shen says this result, while baffling to most, is clear as day to those in the know.

"It's like a puzzle," he said. "It may be complicated to do, but once it's done, just by glancing at it, you will know it's correct."

Reference: "Two-photon controlled-phase gates enabled by photonic dimers" by Zihao Chen, Yao Zhou, Jung-Tsung Shen, Pei-Cheng Ku and Duncan Steel, 21 May 2021, Physical Review A. DOI: 10.1103/PhysRevA.103.052610

This research was supported by the National Science Foundation, ECCS grants nos. 1608049 and 1838996. It was also supported by the 2018 NSF Quantum Leap (RAISE) Award.

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students and 21,000 living alumni, we are working to solve some of society's greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.


NIST’s Quantum Security Protocols Near the Finish Line – IoT World Today

The U.S. standards and technology authority is searching for a new encryption method to prevent the Internet of Things succumbing to quantum-enabled hackers

As quantum computing moves from academic circles to practical uses, it is expected to become the conduit for cybersecurity breaches.

The National Institute of Standards and Technology aims to nip these malicious attacks preemptively. Its new cybersecurity protocols would help shield networks from quantum computing hacks.

The National Institute of Standards and Technology (NIST) has consulted with cryptography thought leaders on hardware and software options for migrating existing technologies to post-quantum encryption.

The consultation forms part of a wider national contest, which is due to report back with its preliminary shortlist later this year.

IT pros can download and evaluate the options through the open source repository at NIST's Computer Security Resource Center.

"[The message] is to educate the market but also to try to get people to start playing around with [quantum computers] and understanding it because, if you wait until it's a Y2K problem, then it's too late," said Chris Sciacca, IBM's communications manager for research in Europe, Middle East, Africa, Asia and South America. "So the message here is to start adopting some of these schemes."

Businesses need to know how to contend with quantum decryption, which could potentially jeopardize many Internet of Things (IoT) endpoints.

Quantum threatens society because IoT, in effect, binds our digital and physical worlds together. Worryingly, some experts believe hackers could already be recording scrambled IoT transmissions, to be ready when quantum decryption arrives.

Current protocols such as Transport Layer Security (TLS) will be difficult to upgrade, as they are often baked into a device's circuitry or firmware.

Estimates vary for when a quantum computer capable of running Shor's algorithm will arrive. An optimist in the field would say it may take 10 to 15 years. But then it could be another Y2K scenario, whose predicted problems never came to pass.

But its still worth getting the enterprises IoT network ready, to be on the safe side.

"Broadly speaking, all asymmetric encryption that's in common use today will be susceptible to a future quantum computer with adequate quantum volume," said Christopher Sherman, a senior analyst at Forrester Research. "Anything that uses prime factorization or discrete log to create separate encryption and decryption keys, those will all be vulnerable to a quantum computer, potentially within the next 15 years."

Why Do We Need Quantum Security?

Quantum computers would answer queries existing technologies cannot resolve, by applying quantum mechanics to compute various combinations of data simultaneously.

As the quantum computing field remains largely in the prototyping phase, current models largely perform only narrow scientific or computational objectives.

All asymmetric cryptography systems, however, could one day be overridden by a quantum mechanical algorithm known as Shor's algorithm.

That's because the decryption ciphers rely on mathematical complexities such as factorization, which Shor's algorithm could hypothetically unravel in no time.

"In quantum physics, what you can do is construct a parameter that cancels some of the probabilities out," explained Luca De Feo, a researcher at IBM who is involved with the NIST quantum-security effort. "Shor's algorithm is such an apparatus. It makes many quantum particles interact in such a way that the probabilities of the things you are not interested in will cancel out."
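The structure De Feo alludes to can be shown in purely classical form. Shor's algorithm factors N by finding the period r of a^x mod N; everything else is ordinary arithmetic. In the sketch below the period is found by brute force, which is exactly the step a quantum computer performs exponentially faster; the surrounding gcd bookkeeping is the same in both worlds.

```python
from math import gcd

def order(a, n):
    # Multiplicative order of a mod n, found by brute force.
    # This is the step Shor's quantum period-finding accelerates.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_period(n, a):
    assert gcd(a, n) == 1
    r = order(a, n)
    if r % 2 == 1:
        return None                # unlucky choice of a: try another
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                # another unlucky case
    p = gcd(y - 1, n)              # y^2 = 1 (mod n) yields a factor
    return p, n // p

print(factor_via_period(15, 7))    # (3, 5)
```

For the toy case N = 15, a = 7 the order is 4, so 7^2 = 49 = 4 (mod 15) and gcd(3, 15) = 3 reveals a factor. The classical brute-force search over r is what blows up for 2048-bit RSA moduli.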

Will Quantum Decryption Spell Disaster For IoT?

Businesses must have safeguards against quantum decryption, which threatens IoT endpoints secured by asymmetric encryption.

A symmetric encryption technique, the Advanced Encryption Standard (AES), is believed to be immune to Shor's algorithm attacks, but is considered computationally expensive for resource-constrained IoT devices.

For businesses looking to quantum-secure IoT in specific verticals, there's a risk assessment model published by the University of Waterloo's quantum technology specialist Dr. Michele Mosca. The model is designed to predict the risk and outline timescales for preparing a response, depending on the kind of organization involved.

As well as integrating a new quantum security standard, there's also a need for mechanisms to make legacy systems quantum-secure. Not only can encryption be broken, but there's also potential for quantum forgeries of digital identities, in sectors such as banking.

"I see a lot of banks now asking about quantum security, and definitely governments," Sherman said. "They are not just focused on replacing RSA, which includes https and TLS, but also elliptic curve cryptography (ECC), for example blockchain-based systems. ECC-powered digital signatures will need to be replaced as well."

One option, which NIST is considering, is to blend post-quantum security at network level with standard ciphers on legacy nodes. The latter could then be phased out over time.

"A hybrid approach published by NIST gives guidance around using the old protocols that satisfy regulatory requirements at a security level that's been certified for a given purpose," Sherman said. "But then having an encapsulation technique that puts a crypto technique on top of that. It wraps up into that overall encryption scheme, so that in the future you can drop one that's vulnerable and just keep the post-quantum encryption."

Governments Must Defend Against Quantum Hacks

For national governments, it's becoming an all-out quantum arms race. And the U.S. may well be losing. Russia and China have both already unveiled initial post-quantum security options, Sherman said.

"They finished their competitions over the past couple of years. I wouldn't be surprised if the NIST standard also becomes something that Europe uses," he added.

The threats against IoT devices have only grown more pronounced with current trends.

More virtual health and connected devices deployed during COVID-19, for example, will mean more medical practices are now quantum-vulnerable.

According to analyst firm Omdia, there are three major fault lines in defending the IoT ecosystem: endpoint security, network security and public cloud security. With 46 billion things currently in operation globally, IoT already provides an enlarged attack surface for cybercriminals.

"The challenge is protecting any IoT device that's using secure communications or symmetric protocols," said Sherman. "Considering that by 2025 there's over a trillion IoT devices expected to be deployed, that's obviously quite large in terms of potential exposure. Wherever RSA or TLS is being used with IoT, there's a threat."

Weighing Up Post-Quantum And Quantum Cryptography Methods

Post-quantum cryptography differs from methods such as quantum key distribution (QKD), which use quantum mechanics to secure technology against the coming threat.

QKD is already installed on some government and research communications lines, and hypothetically it's impenetrable.

But the average business needs technology that can be implemented quickly and affordably. And, as we don't even know how a quantum decryption device would work in practice, it's unrealistic to transfer QKD onto every IoT network.

One of the main post-quantum cryptography standards in the frame is lattice-based cryptography, an approach that is thought to be more resilient against Shor's algorithm.

While these are still based on mathematics and could be endangered by future quantum decryption algorithms, they might buy scientists enough time to come up with other economically viable techniques.
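The flavour of lattice-based schemes can be conveyed with a toy learning-with-errors (LWE) construction for a single bit. To be clear, this is an illustrative sketch of the general idea, not one of the NIST candidate algorithms, and the parameters are far too small to be secure: the public key hides the secret s behind small random noise, and decryption works because the accumulated noise stays below q/4.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 257, 16, 32

s = rng.integers(0, q, n)              # secret key
A = rng.integers(0, q, (m, n))         # public random matrix
e = rng.integers(-1, 2, m)             # small noise in {-1, 0, 1}
b = (A @ s + e) % q                    # public key: noisy samples of A @ s

def encrypt(bit):
    r = rng.integers(0, 2, m)          # random 0/1 combination of samples
    return (r @ A) % q, (r @ b + bit * (q // 2)) % q

def decrypt(u, v):
    d = (v - u @ s) % q                # = bit*(q//2) + r.e  (mod q)
    return int(min(d, q - d) > q // 4) # near 0 -> bit 0, near q/2 -> bit 1

print([decrypt(*encrypt(bit)) for bit in (0, 1, 1, 0)])  # [0, 1, 1, 0]
```

Recovering s from (A, b) without knowing the noise is the LWE problem, which, unlike factoring, has no known efficient quantum algorithm; that is the bet these schemes make.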

Another advantage would be in IoT applications that need the point-to-point security channel, such as connected vehicles, De Feo said.

"Probably the lattice-based schemes are the best right now to run on IoT devices. Some efforts will be needed in the chip design process to make these even easier to run," he added. "But we should probably start thinking about this right now. Because it will probably take around five-to-seven years after the algorithms have been found for the chips to reach people's homes or industrial systems."

And then potentially [if the optimistic estimates are right,] quantum computers will have arrived.
