Archive for the ‘Quantum Computer’ Category

Caltech and NTT developing the world’s fastest quantum computer – Digital Journal

NTT Research has announced a collaboration with Caltech to develop the world's fastest Coherent Ising Machine (CIM). This is a quantum-oriented computing approach that uses special-purpose processors to solve extremely complex combinatorial optimization problems. CIMs are advanced devices that offer a promising approach to such problems by mapping them to ground-state searches of an Ising model.

The primary application of the computing method is drug discovery. Developing new drugs is of pressing importance, not least in the current fight against COVID-19, and drug discovery is a commonly cited combinatorial optimization problem: the search for effective drugs involves an enormous number of potential matches between medically appropriate molecules and the target proteins responsible for a specific disease. Conventional computers are used to replicate chemical interactions in medicine and other areas of the life and chemical sciences, but moving beyond trial and error to rapidly tackle the sheer volume of possible combinations will require quantum technology.

Other applications of the technology include:

Logistics. One classic problem is that of the traveling salesman: identifying the shortest possible route that visits each of n cities while returning to the city of origin. This problem and its variants appear in contemporary logistical challenges, such as daily automotive traffic patterns. The advantage of using a quantum information system here is speed.

Machine learning. A CIM is also a good match for some types of machine learning, including image and speech recognition. Artificial neural networks learn by iteratively processing examples containing known inputs and results. CIMs can speed up training and improve the accuracy of existing neural networks.

The development of the new computer system has been pioneered by Kazuhiro Gomi, CEO of NTT Research, and Dr. Yoshihisa Yamamoto, Director of NTT Research's Physics & Informatics (PHI) Lab, who is overseeing the research. It is a step forward for CIM optimization, uniting perspectives from statistics, computer science, statistical physics and quantum optics.
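To make the ground-state framing concrete, here is a minimal, illustrative Python sketch of what "mapping an optimization problem to an Ising ground-state search" means. This is a toy brute-force model, not NTT's or Caltech's implementation; the four-spin coupling matrix is invented, and a real CIM explores the search space with analog optics rather than enumeration.

```python
# Minimal sketch: an optimization problem posed as an Ising ground-state
# search. A CIM seeks the spin assignment minimizing the Ising energy
# H = -sum_{i<j} J_ij * s_i * s_j; here we brute-force a toy 4-spin
# instance with made-up couplings.
import itertools

import numpy as np

J = np.array([  # hypothetical coupling matrix (symmetric, zero diagonal)
    [0.0,  1.0, -0.5,  0.2],
    [1.0,  0.0,  0.3, -0.7],
    [-0.5, 0.3,  0.0,  0.6],
    [0.2, -0.7,  0.6,  0.0],
])

def ising_energy(spins: np.ndarray) -> float:
    """Energy of one spin configuration (entries are +1 or -1)."""
    return -0.5 * spins @ J @ spins  # 0.5 corrects for double counting

# Exhaustive search is feasible only for tiny n; the exponential growth of
# this space is exactly why special-purpose hardware is attractive.
best = min(itertools.product([-1, 1], repeat=4),
           key=lambda s: ising_energy(np.array(s)))
print("ground state:", best, "energy:", ising_energy(np.array(best)))
```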

Read the rest here:
Caltech and NTT developing the world's fastest quantum computer - Digital Journal

University of Glasgow Partners with Oxford Instruments NanoScience on Quantum Computing – HPCwire

Jan. 21, 2021. Today, the University of Glasgow, a pioneering institution in quantum technology development and home of the Quantum Circuits Group, announced it is using Oxford Instruments' next-generation Cryofree refrigerator, Proteox, as part of its research to accelerate the commercialisation of quantum computing in the UK.

"We're excited to be using Proteox, the latest in cryogen-free refrigeration technology, and to have the system up and running in our lab," comments Professor Martin Weides, Head of the Quantum Circuits Group. "Oxford Instruments is a long-term strategic partner, and today's announcement highlights the importance of our close collaboration to the future of quantum computing development. Proteox is designed with quantum scale-up in mind, and through the use of its Secondary Insert technology, we're able to easily characterise and develop integrated chips and components for quantum computing applications."

The University of Glasgow, its subsidiary and commercialisation partner Kelvin Nanotechnology, and Oxford Instruments NanoScience are part of a larger consortium supported by funding from Innovate UK, the UK's innovation agency, granted in April 2020. The consortium partners will boost quantum technology development through the design, manufacture, and testing of superconducting quantum devices.

"Today's announcement demonstrates the major contribution Oxford Instruments is making towards pioneering quantum technology work in the UK," states Stuart Woods, Managing Director of Oxford Instruments NanoScience. "With our 60 years of experience in in-house component production and global service support, we are accelerating the commercialisation of quantum technology and helping our customers across the world discover what's next."

Proteox is a next-generation Cryofree system that provides a step change in modularity and adaptability for ultra-low temperature experiments in condensed-matter physics and quantum computing industrialisation. The Proteox platform has been developed to provide a single, interchangeable modular solution that can support multiple users and a variety of set-ups or experiments. It also includes remote management software which is integral to the system design, enabling, for example, the system to be managed from anywhere in the world. To find out more, visit nanoscience.oxinst.com/proteox.

About Oxford Instruments NanoScience

Oxford Instruments NanoScience designs, supplies and supports market-leading research tools that enable quantum technologies, new materials and device development in the physical sciences. Our tools support research down to the atomic scale through creation of high-performance, cryogen-free low-temperature and magnetic environments, based upon our core technologies in low and ultra-low temperatures, high magnetic fields and system integration, with ever-increasing levels of experimental and measurement readiness.

Oxford Instruments NanoScience is a part of the Oxford Instruments plc group.

Glasgow's Quantum Circuits Group can be found here: https://www.gla.ac.uk/schools/engineering/research/divisions/ene/researchthemes/micronanotechnology/quantumcircuits/

Source: University of Glasgow

More:
University of Glasgow Partners with Oxford Instruments NanoScience on Quantum Computing - HPCwire

The Worldwide Quantum Computing Industry will Exceed $7.1 Billion by 2026 – GlobeNewswire

Dublin, Jan. 19, 2021 (GLOBE NEWSWIRE) -- The "Quantum Computing Market by Technology, Infrastructure, Services, and Industry Verticals 2021 - 2026" report has been added to ResearchAndMarkets.com's offering.

This report assesses the technology, companies/organizations, R&D efforts, and potential solutions facilitated by quantum computing. The report provides global and regional forecasts as well as the outlook for quantum computing impact on infrastructure including hardware, software, applications, and services from 2021 to 2026. This includes the quantum computing market across major industry verticals.

While classical (non-quantum) computers make the modern digital world possible, many tasks cannot be solved using conventional computational methods because of limits on processing power. For example, a conventional fourth-generation computer cannot perform multiple computations at one time with one processor. Physical phenomena at the nanoscale indicate that a quantum computer is capable of computational feats that are orders of magnitude beyond conventional methods.

This is due to the use of the quantum bit (qubit), which may exist as a zero or a one (as in classical computing) or in both states simultaneously (0 and 1 at the same time) due to the superposition principle of quantum physics. This enables greater processing power than the normal binary (zero only or one only) representation of data.

Whereas parallel computing is achieved in classical computers by linking processors together, quantum computers may conduct multiple computations with a single processor. This is referred to as quantum parallelism and is a major difference between hyper-fast quantum computers and speed-limited classical computers.
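As a rough illustration of superposition and the 2^n scaling behind "quantum parallelism," the following sketch simulates a qubit's state vector classically in plain NumPy. It is a toy model for intuition, not a vendor API, and a classical simulation of course gains none of the speed the report describes.

```python
# Illustrative sketch (plain NumPy, no quantum SDK): a qubit's state is a
# 2-element complex vector; the Hadamard gate puts |0> into an equal
# superposition of |0> and |1>. An n-qubit register needs 2**n amplitudes,
# which is the intuition behind "quantum parallelism".
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ ket0                          # (|0> + |1>) / sqrt(2)
print("amplitudes:", superposed)               # ~0.707 each
print("P(0), P(1):", np.abs(superposed) ** 2)  # 0.5 each on measurement

# Three qubits, each in superposition: the joint state carries amplitudes
# for all 2**3 = 8 basis states at once (tensor/Kronecker product).
state3 = np.kron(np.kron(superposed, superposed), superposed)
print("n=3 amplitude count:", state3.size)     # 8
```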

Quantum computing is anticipated to support many new and enhanced capabilities including:

Target Audience:

Select Report Findings:

Report Benefits:

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction

3.0 Technology and Market Analysis
3.1 Quantum Computing State of the Industry
3.2 Quantum Computing Technology Stack
3.3 Quantum Computing and Artificial Intelligence
3.4 Quantum Neurons
3.5 Quantum Computing and Big Data
3.6 Linear Optical Quantum Computing
3.7 Quantum Computing Business Model
3.8 Quantum Software Platform
3.9 Application Areas
3.10 Emerging Revenue Sectors
3.11 Quantum Computing Investment Analysis
3.12 Quantum Computing Initiatives by Country
3.12.1 USA
3.12.2 Canada
3.12.3 Mexico
3.12.4 Brazil
3.12.5 UK
3.12.6 France
3.12.7 Russia
3.12.8 Germany
3.12.9 Netherlands
3.12.10 Denmark
3.12.11 Sweden
3.12.12 Saudi Arabia
3.12.13 UAE
3.12.14 Qatar
3.12.15 Kuwait
3.12.16 Israel
3.12.17 Australia
3.12.18 China
3.12.19 Japan
3.12.20 India
3.12.21 Singapore

4.0 Quantum Computing Drivers and Challenges
4.1 Quantum Computing Market Dynamics
4.2 Quantum Computing Market Drivers
4.2.1 Growing Adoption in Aerospace and Defense Sectors
4.2.2 Growing Investment by Governments
4.2.3 Emergence of Advanced Applications
4.3 Quantum Computing Market Challenges

5.0 Quantum Computing Use Cases
5.1 Quantum Computing in Pharmaceuticals
5.2 Applying Quantum Technology to Financial Problems
5.3 Accelerate Autonomous Vehicles with Quantum AI
5.4 Car Manufacturers using Quantum Computing
5.5 Accelerating Advanced Computing for NASA Missions

6.0 Quantum Computing Value Chain Analysis
6.1 Quantum Computing Value Chain Structure
6.2 Quantum Computing Competitive Analysis
6.2.1 Leading Vendor Efforts
6.2.2 Start-up Companies
6.2.3 Government Initiatives
6.2.4 University Initiatives
6.2.5 Venture Capital Investments
6.3 Large Scale Computing Systems

7.0 Company Analysis
7.1 D-Wave Systems Inc.
7.1.1 Company Overview
7.1.2 Product Portfolio
7.1.3 Recent Development
7.2 Google Inc.
7.2.1 Company Overview
7.2.2 Product Portfolio
7.2.3 Recent Development
7.3 Microsoft Corporation
7.3.1 Company Overview
7.3.2 Product Portfolio
7.3.3 Recent Development
7.4 IBM Corporation
7.4.1 Company Overview
7.4.2 Product Portfolio
7.4.3 Recent Development
7.5 Intel Corporation
7.5.1 Company Overview
7.5.2 Product Portfolio
7.5.3 Recent Development
7.6 Nokia Corporation
7.6.1 Company Overview
7.6.2 Product Portfolio
7.6.3 Recent Developments
7.7 Toshiba Corporation
7.7.1 Company Overview
7.7.2 Product Portfolio
7.7.3 Recent Development
7.8 Raytheon Company
7.8.1 Company Overview
7.8.2 Product Portfolio
7.8.3 Recent Development
7.9 Other Companies
7.9.1 1QB Information Technologies Inc.
7.9.1.1 Company Overview
7.9.1.2 Recent Development
7.9.2 Cambridge Quantum Computing Ltd.
7.9.2.1 Company Overview
7.9.2.2 Recent Development
7.9.3 QC Ware Corp.
7.9.3.1 Company Overview
7.9.3.2 Recent Development
7.9.4 MagiQ Technologies Inc.
7.9.4.1 Company Overview
7.9.5 Rigetti Computing
7.9.5.1 Company Overview
7.9.5.2 Recent Development
7.9.6 Anyon Systems Inc.
7.9.6.1 Company Overview
7.9.7 Quantum Circuits Inc.
7.9.7.1 Company Overview
7.9.7.2 Recent Development
7.9.8 Hewlett Packard Enterprise (HPE)
7.9.8.1 Company Overview
7.9.8.2 Recent Development
7.9.9 Fujitsu Ltd.
7.9.9.1 Company Overview
7.9.9.2 Recent Development
7.9.10 NEC Corporation
7.9.10.1 Company Overview
7.9.10.2 Recent Development
7.9.11 SK Telecom
7.9.11.1 Company Overview
7.9.11.2 Recent Development
7.9.12 Lockheed Martin Corporation
7.9.12.1 Company Overview
7.9.13 NTT Docomo Inc.
7.9.13.1 Company Overview
7.9.13.2 Recent Development
7.9.14 Alibaba Group Holding Limited
7.9.14.1 Company Overview
7.9.14.2 Recent Development
7.9.15 Booz Allen Hamilton Inc.
7.9.15.1 Company Overview
7.9.16 Airbus Group
7.9.16.1 Company Overview
7.9.16.2 Recent Development
7.9.17 Amgen Inc.
7.9.17.1 Company Overview
7.9.17.2 Recent Development
7.9.18 Biogen Inc.
7.9.18.1 Company Overview
7.9.18.2 Recent Development
7.9.19 BT Group
7.9.19.1 Company Overview
7.9.19.2 Recent Development
7.9.20 Mitsubishi Electric Corp.
7.9.20.1 Company Overview
7.9.21 Volkswagen AG
7.9.21.1 Company Overview
7.9.21.2 Recent Development
7.9.22 KPN
7.9.22.1 Recent Development
7.10 Ecosystem Contributors
7.10.1 Agilent Technologies
7.10.2 Artiste-qb.net
7.10.3 Avago Technologies
7.10.4 Ciena Corporation
7.10.5 Eagle Power Technologies Inc
7.10.6 Emcore Corporation
7.10.7 Enablence Technologies
7.10.8 Entanglement Partners
7.10.9 Fathom Computing
7.10.10 Alpine Quantum Technologies GmbH
7.10.11 Atom Computing
7.10.12 Black Brane Systems
7.10.13 Delft Circuits
7.10.14 EeroQ
7.10.15 Everettian Technologies
7.10.16 EvolutionQ
7.10.17 H-Bar Consultants
7.10.18 Horizon Quantum Computing
7.10.19 ID Quantique (IDQ)
7.10.20 InfiniQuant
7.10.21 IonQ
7.10.22 ISARA
7.10.23 KETS Quantum Security
7.10.24 Magiq
7.10.25 MDR Corporation
7.10.26 Nordic Quantum Computing Group (NQCG)
7.10.27 Oxford Quantum Circuits
7.10.28 Post-Quantum (PQ Solutions)
7.10.29 ProteinQure
7.10.30 PsiQuantum
7.10.31 Q&I
7.10.32 Qasky
7.10.33 QbitLogic
7.10.34 Q-Ctrl
7.10.35 Qilimanjaro Quantum Hub
7.10.36 Qindom
7.10.37 Qnami
7.10.38 QSpice Labs
7.10.39 Qu & Co
7.10.40 Quandela
7.10.41 Quantika
7.10.42 Quantum Benchmark Inc.
7.10.43 Quantum Circuits Inc. (QCI)
7.10.44 Quantum Factory GmbH
7.10.45 QuantumCTek
7.10.46 Quantum Motion Technologies
7.10.47 QuantumX
7.10.48 Qubitekk
7.10.49 Qubitera LLC
7.10.50 Quintessence Labs
7.10.51 Qulab
7.10.52 Qunnect
7.10.53 QuNu Labs
7.10.54 River Lane Research (RLR)
7.10.55 SeeQC
7.10.56 Silicon Quantum Computing
7.10.57 Sparrow Quantum
7.10.58 Strangeworks
7.10.59 Tokyo Quantum Computing (TQC)
7.10.60 TundraSystems Global Ltd.
7.10.61 Turing
7.10.62 Xanadu
7.10.63 Zapata Computing
7.10.64 Accenture
7.10.65 Atos Quantum
7.10.66 Baidu
7.10.67 Northrop Grumman
7.10.68 Quantum Computing Inc.
7.10.69 Keysight Technologies
7.10.70 Nano-Meta Technologies
7.10.71 Optalysys Ltd.

8.0 Quantum Computing Market Analysis and Forecasts 2021 - 2026
8.1.1 Quantum Computing Market by Infrastructure
8.1.1.1 Quantum Computing Market by Hardware Type
8.1.1.2 Quantum Computing Market by Application Software Type
8.1.1.3 Quantum Computing Market by Service Type
8.1.1.3.1 Quantum Computing Market by Professional Service Type
8.1.2 Quantum Computing Market by Technology Segment
8.1.3 Quantum Computing Market by Industry Vertical
8.1.4 Quantum Computing Market by Region
8.1.4.1 North America Quantum Computing Market by Infrastructure, Technology, Industry Vertical, and Country
8.1.4.2 European Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.3 Asia-Pacific Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.4 Middle East & Africa Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.5 Latin America Quantum Computing Market by Infrastructure, Technology, and Industry Vertical

9.0 Conclusions and Recommendations

10.0 Appendix: Quantum Computing and Classical HPC
10.1 Next Generation Computing
10.2 Quantum Computing vs. Classical High-Performance Computing
10.3 Artificial Intelligence in High Performance Computing
10.4 Quantum Technology Market in Exascale Computing

For more information about this report visit https://www.researchandmarkets.com/r/omefq7

See the rest here:
The Worldwide Quantum Computing Industry will Exceed $7.1 Billion by 2026 - GlobeNewswire

3 tech trends that COVID-19 will accelerate in 2021 – VentureBeat

Spending 2020 under the shadow of a pandemic has affected what we need and expect from technology. For many, COVID-19 accelerated the rate of digital transformation: as employees worked from home, companies needed AI systems that facilitated remote work and the computing power to support them.

The question is, how should companies focus their resources in 2021 to prepare for this changed reality and the new technologies on the horizon? Here are three trends that I predict will see massive attention in 2021 and beyond.

Progress in AI has already reached a point where it can add significant value to practically any business. COVID-19 triggered a massive sense of urgency around digital transformations with the need for remote solutions. According to a report by Boston Consulting Group, more than 80% of companies plan to accelerate their digital transformation, but only 30% of digital transformations have met or exceeded their target value.

Many AI projects are small in scale: fewer than a quarter of companies in McKinsey's 2020 State of AI survey reported significant bottom-line impact. This is especially true in industries that have a physical-digital element. For example, there is a great need for remotely operated, autonomous manufacturing facilities, refineries, and even, in the days of COVID-19, office buildings. While the underlying technology is there, achieving scalability remains a concern, and digital leaders will have to overcome that barrier in 2021. Scalability barriers include a lack of disciplined approach, enterprise-wide mindset, credible partners, data liquidity, and change management.

Part of the answer here is to build tools that can be operated by someone who is not necessarily a data scientist, so that more domain experts can manage the programs they need. If Tesla invented an autonomous car that only data scientists could drive, what's the point?

Technology needs to empower end users so they can interact with and manipulate models without having to trudge through the finer points of datasets or code. In other words, the AI does the heavy lifting on the back end, while a user-friendly explanation and UI empower the end user. For instance, a facilities-management executive could manage a global portfolio of buildings from a tablet while sitting at a Starbucks, with full visibility into operations, occupant experience, and spend, and the ability to intervene in what would otherwise be an autonomous operation.

Deep learning pioneer Dr. Geoffrey Hinton recently told MIT Technology Review that deep learning will be able to "do everything," that is, replicate all of human intelligence. Deep neural networks have demonstrated extraordinary capabilities to approximate the most relevant subset of mathematical functions, and they promise to overcome reasoning challenges.

However, I believe there is a step to full autonomy that we must first conquer: what Dr. Manuela Veloso at Carnegie Mellon calls symbiotic autonomy. With symbiotic autonomy, feedback and correction mechanisms are incorporated into the AI such that humans and machines pass information to each other fluidly.

For example, instead of hard feedback (like the thumbs up and thumbs down powering your Netflix queue), symbiotic autonomy could look like a discussion with your phone's virtual assistant to determine the best route to a destination. Interactions with these forms of AI would be more natural and conversational, with the program able to explain why it recommended or performed certain actions.

With deep learning, neural networks approximate complex mathematical functions with simpler ones, and their ability to consider a growing number of factors and make smarter decisions with fewer computing resources is what allows them to become autonomous. I anticipate heavy investment in research into these abilities of deep neural networks across the board, from startups to top tech companies to universities.

This move toward fully autonomous solutions will be a critical step toward implementing AI at scale. Imagine an enterprise performance management system that gives you a single pane of visibility and control across a global enterprise operating multiple facilities, workforces, and supply chains autonomously. It runs and learns on its own, but you can intervene and teach it when it makes a mistake.

(The question of ethics in autonomous systems will come into play here, but that is a subject for another article.)

Quantum computers have the computational power to handle complex algorithms because they can process solutions in parallel rather than sequentially. Let's think about how this could affect the development and delivery of vaccines.

First, during drug discovery, researchers must simulate a new molecule. This is tremendously challenging with today's high-performance computers, but it is exactly the kind of problem at which quantum computers will eventually excel. A quantum computer could eventually be mapped to the quantum system that is the molecule, simulating binding energies and chemical transition strengths before anyone ever has to make the drug.

However, AI and quantum computing have even more to offer beyond creating the vaccine. The logistics of manufacturing and delivering the vaccine are massive computational challenges, which of course makes them ripe for a solution that combines quantum computing and AI.

Quantum machine learning is an extremely new field with enormous promise, but breakthroughs are needed for it to catch investors' attention. Tech visionaries can already start to see how it will impact our future, especially with respect to understanding nanoparticles, creating new materials through molecular and atomic maps, and glimpsing the deeper makeup of the human body.

The area of growth I am most excited about is the intersection of these systems, which I believe will begin to combine and produce results greater than the sum of their parts. While there have already been pairings of AI with quantum computing, or of 5G with AI, all of these technologies working together could produce exponential results.

I'm particularly excited to see how AI, quantum computing, and other technologies will influence biotechnology, as that might be the secret to superhuman capabilities. And what could be more exciting than that?

Usman Shuja is General Manager at Honeywell.

Link:
3 tech trends that COVID-19 will accelerate in 2021 - VentureBeat

Securing the DNS in a Post-Quantum World: Hash-Based Signatures and Synthesized Zone Signing Keys – CircleID

This is the fifth in a multi-part series on cryptography and the Domain Name System (DNS).

In my last article, I described efforts underway to standardize new cryptographic algorithms that are designed to be less vulnerable to potential future advances in quantum computing. I also reviewed operational challenges to be considered when adding new algorithms to the DNS Security Extensions (DNSSEC).

In this post, I'll look at hash-based signatures, a family of post-quantum algorithms that could be a good match for DNSSEC from the perspective of infrastructure stability.

I'll also describe Verisign Labs research into a new concept called synthesized zone signing keys that could mitigate the impact of the large signature size for hash-based signatures, while still maintaining this family's protections against quantum computing.

(Caveat: The concepts reviewed in this post are part of Verisign's long-term research program and do not necessarily represent Verisign's plans or positions on new products or services. Concepts developed in our research program may be subject to U.S. and/or international patents and/or patent applications.)

The DNS community's root key signing key (KSK) rollover illustrates how complicated a change to DNSSEC infrastructure can be. Although successfully accomplished, this change was delayed by ICANN to ensure that enough resolvers had the public key required to validate signatures generated with the new root KSK private key.

Now imagine the complications if the DNS community also had to ensure that enough resolvers not only had a new key but also had a brand-new algorithm.

Imagine further what might happen if a weakness in this new algorithm were to be found after it was deployed. While there are procedures for emergency key rollovers, emergency algorithm rollovers would be more complicated, and perhaps controversial as well if a clear successor algorithm were not available.

I'm not suggesting that any of the post-quantum algorithms that might be standardized by NIST will be found to have a weakness. But confidence in cryptographic algorithms can be gained and lost over many years, sometimes decades.

From the perspective of infrastructure stability, therefore, it may make sense for DNSSEC to have a backup post-quantum algorithm built in from the start: one for which cryptographers already have significant confidence and experience. This algorithm might not be as efficient as other candidates, but there is less of a chance that it would ever need to be changed. This means that the more efficient candidates could be deployed in DNSSEC with the confidence that they have a stable fallback. It's also important to keep in mind that the prospect of quantum computing is not the only reason system developers need to be considering new algorithms from time to time. As public-key cryptography pioneer Martin Hellman wisely cautioned, new classical (non-quantum) attacks could also emerge, whether or not a quantum computer is realized.

The 1970s were a foundational time for public-key cryptography, producing not only the RSA algorithm and the Diffie-Hellman algorithm (which also provided the basic model for elliptic curve cryptography), but also hash-based signatures, invented in 1979 by another public-key cryptography founder, Ralph Merkle.

Hash-based signatures are interesting because their security depends only on the security of an underlying hash function.

It turns out that hash functions, as a concept, hold up very well against quantum computing advances, much better than currently established public-key algorithms do.

This means that Merkle's hash-based signatures, now more than 40 years old, can rightly be considered the oldest post-quantum digital signature algorithm.

If it turns out that an individual hash function doesn't hold up, whether against a quantum computer or a classical computer, then the hash function itself can be replaced, as cryptographers have been doing for years. That will likely be easier than changing to an entirely different post-quantum algorithm, especially one that involves very different concepts.

The conceptual stability of hash-based signatures is a reason that interoperable specifications are already being developed for variants of Merkle's original algorithm. Two approaches are described in RFC 8391, "XMSS: eXtended Merkle Signature Scheme" and RFC 8554, "Leighton-Micali Hash-Based Signatures." Another approach, SPHINCS+, is an alternate in NIST's post-quantum project.
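To see concretely why security in this family rests only on the underlying hash function, here is a toy Python sketch of a Lamport one-time signature, a 1979-era design closely related to Merkle's construction. It is illustrative only: it is not XMSS, LMS, or SPHINCS+, a key pair must never sign twice, and it shows where the large signature sizes come from, since each signature reveals 256 32-byte values (tens of thousands of bits).

```python
# Toy Lamport one-time signature: signing reveals hash preimages, so
# breaking it requires breaking the hash function. Not for production.
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()
N = 256  # one secret pair per bit of the message digest

def keygen():
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(N)]
    pk = [(H(a), H(b)) for a, b in sk]  # publish only the hashes
    return sk, pk

def bits(msg: bytes):
    d = int.from_bytes(H(msg), "big")
    return [(d >> i) & 1 for i in range(N)]

def sign(sk, msg: bytes):
    # Reveal one secret per digest bit (which is why it is one-time).
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][b]
               for (i, b), s in zip(enumerate(bits(msg)), sig))

sk, pk = keygen()
sig = sign(sk, b"example DNS record set")
print(verify(pk, b"example DNS record set", sig))  # True
print("signature bits:", len(sig) * 32 * 8)        # 65,536
```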

Figure 1. Conventional DNSSEC signatures. DNS records are signed with the ZSK private key, and are thereby "chained" to the ZSK public key. The digital signatures may be hash-based signatures.

Hash-based signatures can potentially be applied to any part of the DNSSEC trust chain. For example, in Figure 1, the DNS record sets can be signed with a zone signing key (ZSK) that employs a hash-based signature algorithm.

The main challenge with hash-based signatures is that the signature size is large, on the order of tens or even hundreds of thousands of bits. This is perhaps why they haven't seen significant adoption in security protocols over the past four decades.

Verisign Labs has been exploring how to mitigate the size impact of hash-based signatures on DNSSEC, while still basing security only on hash functions, in the interest of stable post-quantum protections.

One of the ideas we've come up with uses another of Merkle's foundational contributions: Merkle trees.

Merkle trees authenticate multiple records by hashing them together in a tree structure. The records are the "leaves" of the tree. Pairs of leaves are hashed together to form a branch, then pairs of branches are hashed together to form a larger branch, and so on. The hash of the largest branches is the tree's "root." (This is a data-structure root, unrelated to the DNS root.)

Each individual leaf of a Merkle tree can be authenticated by retracing the "path" from the leaf to the root. The path consists of the hashes of each of the adjacent branches encountered along the way.

Authentication paths can be much shorter than typical hash-based signatures. For instance, with a tree depth of 20 and a 256-bit hash value, the authentication path for a leaf would only be 5,120 bits long, yet a single tree could authenticate more than a million leaves.
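As a concrete, hedged sketch of the arithmetic above (assuming SHA-256 and a power-of-two number of leaves; this is toy code, not Verisign's), the following Python builds a Merkle tree, extracts a leaf's authentication path, and retraces it to the root. Each level contributes one sibling hash, so a depth-20 tree would yield a 20 x 256 = 5,120-bit path.

```python
# Minimal Merkle tree: build_tree hashes leaves pairwise up to a single
# root; auth_path collects one sibling hash per level; verify retraces
# the path from a leaf back to the root.
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def build_tree(leaves):
    levels = [[H(x) for x in leaves]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        levels.append([H(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels  # levels[-1][0] is the root

def auth_path(levels, index):
    path = []
    for lvl in levels[:-1]:
        path.append(lvl[index ^ 1])  # index ^ 1 is the sibling's position
        index //= 2
    return path

def verify(root, leaf, index, path):
    node = H(leaf)
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root

records = [f"record-set-{i}".encode() for i in range(8)]  # toy record sets
levels = build_tree(records)
root = levels[-1][0]                       # 256 bits, whatever the leaf count
path = auth_path(levels, 5)                # 3 sibling hashes for 8 leaves
print(verify(root, records[5], 5, path))   # True
```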

Figure 2. DNSSEC signatures following the synthesized ZSK approach proposed here. DNS records are hashed together into a Merkle tree. The root of the Merkle tree is published as the ZSK, and the authentication path through the Merkle tree is the record's signature.

Returning to the example above, suppose that instead of signing each DNS record set with a hash-based signature, each record set were considered a leaf of a Merkle tree. Suppose further that the root of this tree were to be published as the ZSK public key (see Figure 2). The authentication path to the leaf could then serve as the record set's signature.

The validation logic at a resolver would be the same as in ordinary DNSSEC.

The only difference on the resolver's side would be that signature validation would involve retracing the authentication path to the ZSK public key, rather than a conventional signature validation operation.

The ZSK public key produced by the Merkle tree approach would be a "synthesized" public key, in that it is obtained from the records being signed. This is noteworthy from a cryptographer's perspective, because the public key wouldn't have a corresponding private key, yet the DNS records would still, in effect, be "signed by the ZSK!"
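On the resolver's side, "signature validation" under this proposal reduces to that same path retrace. The following self-contained toy sketch makes it explicit for a two-record zone; the function names and records are invented for illustration and are not a real DNSSEC API.

```python
# Toy sketch of resolver-side validation with a synthesized ZSK: checking
# a record means retracing its authentication path to the published root,
# not performing a conventional signature operation.
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def retrace(record: bytes, index: int, path) -> bytes:
    node = H(record)
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node

def validate(zsk_public_key: bytes, record: bytes, index: int, path) -> bool:
    # The "public key" is just the Merkle root synthesized from the zone's
    # record sets; no corresponding private key ever existed.
    return retrace(record, index, path) == zsk_public_key

# Two-leaf toy zone: the root H(H(r0) + H(r1)) is published as the ZSK,
# and each record's "signature" is simply its sibling's hash.
r0, r1 = b"a.example. A 192.0.2.1", b"b.example. A 192.0.2.2"
zsk = H(H(r0) + H(r1))
print(validate(zsk, r0, 0, [H(r1)]))  # True
```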

In this type of DNSSEC implementation, the Merkle tree approach only applies to the ZSK level. Hash-based signatures would still be applied at the KSK level, although their overhead would now be "amortized" across all records in the zone.

In addition, each new ZSK would need to be signed "on demand," rather than in advance, as in current operational practice.

This leads to tradeoffs, such as how many changes to accumulate before constructing and publishing a new tree. Fewer changes and the tree will be available sooner. More changes and the tree will be larger, so the per-record overhead of the signatures at the KSK level will be lower.

My last few posts have discussed cryptographic techniques that could potentially be applied to the DNS in the long term or that might not even be applied at all. In my next post, I'll return to more conventional subjects, and explain how Verisign sees cryptography fitting into the DNS today, as well as some important non-cryptographic techniques that are part of our vision for a secure, stable and resilient DNS.

Read the previous posts in this six-part blog series:

See the original post here:
Securing the DNS in a Post-Quantum World: Hash-Based Signatures and Synthesized Zone Signing Keys - CircleID