Archive for the ‘Quantum Computer’ Category

Post-Quantum Cryptography: Migrating to Quantum Resistant Cryptography – Trend Micro

By Morton Swimmer, Mark Chimley, Adam Tuaima

In the previous parts of this series, we have learned about cryptography, what makes quantum computers unique, and how quantum computers break this cryptography. In the fourth and final part of our study on post-quantum cryptography, we will look at quantum-resistant algorithms that could replace our existing cryptography.

The algorithmic assumption behind most existing public key cryptography in common use is that factoring large integers is hard; because of quantum computers and Shor's algorithm, this assumption is now a vulnerability. We need new hardness assumptions that are quantum resistant: problems that are easy to solve if we know the key, but practically impossible to solve without it on both classical and quantum computers. In the following sections, we look at commonly discussed schemes designed to meet these requirements.

Learning with errors (LWE) is based on the idea of representing secret information as a set of equations and then intentionally introducing errors to hide its real value. There are several variants of this, such as ring learning with errors (RLWE) and module learning with errors (MLWE).

Figure 1. In learning with errors, an input value is combined with noise to create an output value.

We can visualize this as a function that takes input values and outputs the result. The true function, represented by the red line, is secret, and we add noise, represented in orange, to create the output value. For more complicated equations, the error makes it difficult to recover the true function accurately.
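
To make the idea concrete, here is a minimal Python sketch of the LWE principle: noisy linear equations modulo a small number. It is an illustration only (the parameters, noise range, and helper names are invented for this example), not a real or secure construction.

```python
import random

# Toy "learning with errors" demo: easy to check with the secret, hard to invert without it.
# Illustration of the LWE idea only, not a secure or standard construction.
q = 97                      # small modulus (real schemes use much larger parameters)
n = 4                       # secret dimension
secret = [random.randrange(q) for _ in range(n)]

def lwe_sample(secret):
    a = [random.randrange(q) for _ in secret]                  # public random vector
    noise = random.choice([-1, 0, 1])                          # small intentional error
    b = (sum(ai * si for ai, si in zip(a, secret)) + noise) % q
    return a, b                                                # (a, b) is published; the noise hides <a, s>

samples = [lwe_sample(secret) for _ in range(8)]

# Someone who knows the secret can check each sample up to the small error...
for a, b in samples:
    residual = (b - sum(ai * si for ai, si in zip(a, secret))) % q
    assert residual in (0, 1, q - 1)

# ...while recovering `secret` from the noisy samples alone is the hard problem
# that lattice-based schemes rely on.
print("samples:", samples)
```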

This scheme hides information within a mathematical grid, a lattice, that makes it extremely difficult to unlock without the right key. Figure 2 shows a two-dimensional lattice, where vectors b1 and b2 are the basis vectors. The lattice problem requires finding the shortest nonzero vector, that is, the combination of the basis vectors that lands closest to the point of origin. While doable but already difficult in a two-dimensional space, the problem becomes increasingly complex in higher dimensions. This problem is known to be NP-hard, a term that refers to a class of problems for which no known algorithm can find a solution in practical time; but if a solution is provided (in the case of our discussion, a basis vector that acts as a key), it can be verified in polynomial time.

Figure 2. The lattice problem requires finding the shortest nonzero vector that can be formed by combining the two basis vectors, landing closest to the point of origin.
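
The following toy Python sketch makes the hardness intuition concrete: it brute-forces the shortest nonzero vector for an assumed two-dimensional basis. The basis values and search range are arbitrary; in the high dimensions used by real lattice-based schemes, this kind of exhaustive search becomes computationally infeasible.

```python
from itertools import product

# Brute-force search for the shortest nonzero lattice vector of a 2-D basis.
# Trivial in two dimensions; the search space explodes in the high dimensions
# used by real lattice-based cryptography, which is what makes the problem hard.
b1 = (201, 37)
b2 = (1648, 297)

best = None
for c1, c2 in product(range(-50, 51), repeat=2):
    if c1 == 0 and c2 == 0:
        continue
    v = (c1 * b1[0] + c2 * b2[0], c1 * b1[1] + c2 * b2[1])
    length = (v[0] ** 2 + v[1] ** 2) ** 0.5
    if best is None or length < best[0]:
        best = (length, v, (c1, c2))

print(f"shortest vector found: {best[1]} = {best[2][0]}*b1 + {best[2][1]}*b2, length {best[0]:.2f}")
```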

Hash-based cryptography is the generic term for constructions of cryptographic primitives based on the security of hash functions. All the currently used cryptographic hashes are considered quantum resistant if sufficiently large digest sizes are used; doubling the output size is usually enough, because the larger the output, the more unique a hash value is to its input data, and the smallest of changes will still produce a significantly different hash value. We can therefore use hash-based cryptography to construct digital signature schemes such as the Merkle signature scheme, zero-knowledge proofs (ZKP), and computational integrity proofs (CIP). The seminal works by Merkle and Lamport developed in the 1970s already explored this, but the ideas were rediscovered and further refined in the search for post-quantum cryptography.
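
As a concrete illustration of how signatures can be built from nothing but a hash function, here is a simplified sketch of Lamport's one-time scheme using SHA-256. It is a teaching example, not a production implementation; a Merkle signature scheme would combine many such one-time key pairs under a hash tree.

```python
import hashlib
import secrets

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Lamport one-time signature (simplified): one key pair may sign exactly one message.
# Private key: 256 pairs of random values; public key: their hashes.
def keygen():
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[sha256(x) for x in pair] for pair in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = sha256(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]   # reveal one secret value per digest bit

def verify(message: bytes, signature, pk) -> bool:
    digest = sha256(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(sha256(sig) == pk[i][bit] for i, (sig, bit) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"post-quantum signatures from hashes", sk)
assert verify(b"post-quantum signatures from hashes", sig, pk)
assert not verify(b"tampered message", sig, pk)
```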

The code-based approach to cryptography is derived from correcting transmission errors when communicating over a noisy channel. Instead of an unintentionally noisy channel, we intentionally introduce random noise into the stream of ciphertext containing hidden error-correcting codes. Without knowing the error correction scheme, it is very difficult to decode the message; this is referred to as the decoding problem. Code-based approaches have been around since at least 1978 with Robert McEliece's research, and the general decoding problem is also known to be NP-hard. There is also no known quantum algorithm that solves it in practical time.
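
The sketch below illustrates the error-correction intuition with a deliberately trivial stand-in, a five-fold repetition code with intentionally injected bit flips. Real code-based schemes such as McEliece use far richer codes (binary Goppa codes) and keep the code's structure secret, which is what makes decoding hard for an attacker.

```python
import random

# Toy error-correcting code: 5-fold repetition with deliberate bit flips.
# The intended receiver can strip the noise; without knowledge of the (much more
# complex, hidden) code used in real schemes, an eavesdropper faces a hard
# decoding problem.
def encode(bits, r=5):
    return [b for bit in bits for b in [bit] * r]

def add_noise(codeword, r=5, noisy_blocks=3):
    noisy = codeword[:]
    for block in random.sample(range(len(noisy) // r), noisy_blocks):
        noisy[block * r + random.randrange(r)] ^= 1    # inject one flip into each chosen block
    return noisy

def decode(noisy, r=5):
    # Majority vote within each block of r repeated bits.
    return [int(sum(noisy[i:i + r]) > r // 2) for i in range(0, len(noisy), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = add_noise(encode(message))
assert decode(received) == message
print("noisy codeword:", received)
```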

Elliptic-curve cryptography (ECC) is a type of encryption that uses the points on an elliptic curve to scramble a message and secure data. ECC itself is also broken by quantum computing; however, a quantum-resistant variant can be built by connecting two different but related elliptic curves, known as isogenous curves.

Figure 3. Isogeny-based cryptography requires looking for the rational map between two elliptic curves that maintains their mathematical properties to allow for both curves to communicate and share information securely.

Isogeny-based encryption is a variant of ECC that uses this concept; it relies on it being easy to come up with a set of isogenies and elliptic curves, but very hard to find the isogenies given only the elliptic curves. Secret keys are derived from a randomly chosen isogeny, and the public key is derived from the target of these isogenies. The keys used in this encryption are about the same size as those used in current cryptography, which makes it an attractive option for securing data. Finding the isogenies used to compute points on two known target curves is akin to computing the discrete logs of values in a Diffie-Hellman (DH) key exchange protocol. There is no relation between the problem of finding the isogenies and problems with known good quantum computing solutions, so the approach is considered quantum resistant. It is important to note that this description of isogeny-based encryption is simplified for brevity; a full explanation would require a deeper and lengthier dive into elliptic curve cryptography.

Quantum computers with a large enough capacity to break our current cryptography do not exist yet, so why should we care?

There is a scenario where an attacker captures all the internet traffic from a connection, including the key establishment and the symmetrically encrypted data stream; when quantum computers of sufficient size exist in the future, the attacker will be able to recover the shared key from the captured traffic and decrypt the data stream with it. We call this attack on forward secrecy a harvest attack. There is therefore a need for quantum-resistant key establishment and digital signing schemes.

The National Institute of Standards and Technology (NIST) initiated a selection process for post-quantum cryptography in 2017 with nearly 70 candidates that have been narrowed down to 15 candidates after the third round of evaluations was completed in 2022. Eight of the 15 candidates are still under evaluation, while seven candidates are already slated for standardization.

Equally important, in 2021 the Internet Engineering Task Force (IETF) discussed a hybrid approach to post-quantum cryptography in TLS 1.2, a prudent method to mitigate any unknown weaknesses in any single algorithm. This idea has been echoed by the German BSI (Federal Office for Information Security) and French ANSSI (Cybersecurity Agency).

Key exchange cryptography is also known as key establishment methodology or key encapsulation mechanism (KEM), where a key is agreed on that can be used for symmetric encryption between two parties. It is a subset of what public key cryptography can do, and we currently use algorithms like Diffie-Hellman or RSA (Rivest-Shamir-Adleman) for this.

The current selection of KEM finalists in the NIST post-quantum cryptography (PQC) standardization are the lattice-based CRYSTALS-Kyber, NTRU, and Saber, and the code-based Classic McEliece. However, only CRYSTALS-Kyber was selected for standardization in the third round.

CRYSTALS-Kyber is now considered a primary algorithm and has been given the Federal Information Processing Standards (FIPS) number 203. This lattice-based scheme uses module learning with errors (MLWE) for key generation. Its origins date back to 2005, but the NIST submission was based on a paper released in 2018, making it a young algorithm by cryptography standards. The performance is relatively good, and like the other structured lattice KEMs, Kyber's public key is on the order of a thousand bytes, still small compared to other candidates, with bandwidth requirements also on the lower side. Kyber has comparatively fast key generation, encapsulation, and decapsulation in software implementations, and there are already optimizations for various software and hardware platforms; it can also be implemented effectively on field-programmable gate arrays (FPGAs). Kyber's authors put effort into showcasing its security, so it is considered mature despite the relatively new approach. Earlier this year, however, Yilei Chen published algorithms for finding solutions to the lattice problem on a quantum computer. While this is a step in the direction of breaking lattice-based schemes such as Kyber, the inventors have built in various safeguards; for example, Kyber combines LWE with the lattice structure in a way that keeps it safe by our current state of knowledge, though it should be noted that combining schemes is still a new field of cryptography.
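
For a sense of what using Kyber looks like in practice, here is a hedged sketch of a key encapsulation round trip using the liboqs Python bindings from the Open Quantum Safe project (discussed later in this article). The mechanism identifier is an assumption: older liboqs builds expose it as "Kyber768", while newer ones use the FIPS 203 name "ML-KEM-768".

```python
# Sketch of a Kyber key encapsulation round trip using the liboqs Python bindings
# (package `oqs` from the Open Quantum Safe project). The algorithm identifier is
# an assumption and differs between liboqs releases ("Kyber768" vs. "ML-KEM-768").
import oqs

ALG = "Kyber768"  # assumption: this identifier is enabled in your liboqs build

with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()           # receiver publishes this

    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, shared_secret_tx = sender.encap_secret(public_key)

    shared_secret_rx = receiver.decap_secret(ciphertext)

assert shared_secret_tx == shared_secret_rx            # both sides now hold the same symmetric key
print(f"public key: {len(public_key)} bytes, ciphertext: {len(ciphertext)} bytes")
```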

Classic McEliece was proposed in 2017 but is based on the original McEliece algorithm from 1978. This makes it different from many of the other proposals, which are LWE- and lattice-based. It remains to be seen if it eventually becomes a standard. Classic McEliece, along with the other code-based candidates BIKE and HQC, has advanced to the fourth round for further consideration.

Isogeny-based SIKE also initially advanced to the next round as an alternate candidate, but was then retracted when an attack against it was found. There are currently no further isogeny-based schemes in the race, but research continues.

Lattice-based FrodoKEM and NTRU Prime were designated alternatives but will not be standardized. Even though FrodoKEM is now out of the NIST-PQC race, the German BSI and the French ANSSI still consider it a good candidate. There have been appeals to reinstate its consideration in NIST's fourth round.

Meanwhile, the digital signature schemes are referred to as SIG in the NIST standardization process, where CRYSTALS-Dilithium, FALCON, and SPHINCS+ have been selected for standardization.

CRYSTALS-Dilithium is a lattice-based scheme based on the Fiat-Shamir paradigm to keep key sizes manageable. It uses short keys and signatures like FALCON but does not require floating point operations. This makes it easier to implement on various CPUs and makes it efficient; NIST strongly recommends Dilithium for implementors. Dilithium is being standardized as FIPS 204.

FALCON is also a lattice-based scheme but is based on the "hash and sign" paradigm. It requires floating point operations and has a complex internal data structure, making it difficult to implement on a variety of CPUs. However, FALCON requires the smallest bandwidth of the candidate schemes and is faster when verifying a signature, though it is slower than Dilithium when signing. It was chosen as an alternative for when lower bandwidth and higher security are required, but NIST has yet to assign it a FIPS number.

SPHINCS+ is based on a combination of various hash-based schemes; it can perform key generation and validation quickly and creates short public keys, but it can be slow when signing and creates long signatures. SPHINCS+ can also be tricky to implement and to perform cryptanalysis on. Two attacks on SPHINCS+ with specific parameter sets were discovered in the third round of evaluation, making it a surprising choice for standardization with the FIPS number 205; we assume NIST chose it to have an alternative to the two lattice-based schemes. In any case, care should be taken to avoid insecure parameter sets when using the algorithm.
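
A signing round trip with one of these schemes looks similar to the KEM example above. The sketch below uses Dilithium via the liboqs Python bindings; the mechanism name is an assumption (older builds call it "Dilithium3", newer ones use the FIPS 204 name "ML-DSA-65"), and the message is a placeholder.

```python
# Hedged sketch of signing and verifying with Dilithium via the liboqs Python bindings.
# The mechanism name is an assumption and varies by liboqs release.
import oqs

ALG = "Dilithium3"
message = b"firmware image v1.2.3"   # placeholder payload

with oqs.Signature(ALG) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

with oqs.Signature(ALG) as verifier:
    assert verifier.verify(message, signature, public_key)
    assert not verifier.verify(b"tampered firmware", signature, public_key)
```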

The Internet Engineering Task Force (IETF) has proposed its own SIG algorithms: XMSS (RFC 8391) and LMS (RFC 8708). LMS is on the IETF standards track, while XMSS was published for information only. These will likely remain niche algorithms, at best, in favour of the NIST and FIPS standards.

An alternative to KEM is Quantum Key Exchange (QKE), a secure key exchange method that uses the quantum states of particles to establish a shared secret that, if snooped on, will be invalidated. The invalidation happens because the attacker has to measure the quantum state of the transmitted particles, which results in the collapse of their superposition. The best-known such protocol is called BB84, after the authors Charles Bennett and Gilles Brassard, who developed it in 1984.
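
The basis-sifting idea behind BB84 can be sketched in a few lines of Python. This simulation is heavily simplified: it models no eavesdropper and no channel noise, and it treats measurement outcomes classically; it only shows why matching measurement bases yield a shared key.

```python
import secrets

# Simplified BB84 sketch: Alice encodes random bits in random bases, Bob measures
# in random bases, and they keep only the positions where the bases match.
# In the real protocol, an eavesdropper's measurements would disturb the qubits
# and show up as errors during a later comparison step.
n = 32
alice_bits = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.choice("+x") for _ in range(n)]      # rectilinear or diagonal
bob_bases = [secrets.choice("+x") for _ in range(n)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                   # same basis: outcome is deterministic
    else:
        bob_results.append(secrets.randbelow(2))  # wrong basis: outcome is random

# Sifting: bases are compared publicly, mismatched positions are discarded.
alice_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
bob_key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
assert alice_key == bob_key
print(f"kept {len(alice_key)} of {n} bits:", "".join(map(str, alice_key)))
```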

QKE requires special hardware and a dedicated transmission medium, usually fiber optic cables, though a satellite link has also been successfully tested. The technology has been tested since the 2000s, and the distance over which keys can be transmitted has improved from tens of kilometers to around a thousand kilometers.

The technology has already been used in practice: in 2023, Toshiba, the BT Group (formerly British Telecom), and HSBC bank (Hongkong and Shanghai Banking Corporation Limited) tested a secure link in London in which symmetric keys were exchanged using QKE and the data was encrypted over a standard communication link using a symmetric cipher.

In the foreseeable future, this will not be the technology that secures our everyday web server connections, but it can secure connections between large hubs of computing such as data centers, and it could also connect a data center to an office complex. The current requirement for specialized equipment and a dedicated fiber optic cable makes personal use impossible for the moment. This technology is related to the quantum internet, but to avoid confusion, we refer to it as a Quantum Key Distribution Network.

The Open Quantum Safe (OQS) project has taken all finalists and a few alternate candidates and added them to liboqs, an open-source C library of quantum-safe cryptographic algorithms implemented with various parameter sets. Support for CPUs beyond the x86 architecture is improving, but many smaller IoT processors still have limited support, mostly because of memory limitations.

Open Quantum Safe has also forked the Open Secure Sockets Layer (OpenSSL) and Secure Shell (SSH) projects and added liboqs to them. This will be useful for testing both compatibility and performance, but liboqs is not yet recommended for production use.

Various commercial vendors of cryptography have started implementing PQC. These can be tracked at the Public Key Infrastructure (PKI) Consortium's website. Both Cloudflare and Amazon Web Services (AWS) claim to have implemented the finalists' algorithms. The communications app Signal has started using CRYSTALS-Kyber for its message security. Likewise, Apple has started using Kyber in its iMessage communications app, as part of a protocol it calls PQ3.

In the last decade, we have gone from encrypted connections being optional to being nearly ubiquitous. Ten years ago, changing a cryptographic algorithm would have been far simpler than it is today. Now, browsers use Transport Layer Security (TLS) with either an HTTPS or QUIC connection whenever possible, cloud data is encrypted at rest and in transit, and digital signing of software is the norm. Changing our cryptography now will be a huge task.

The threat landscape now also contains many state actors who may be able and willing to harvest network traffic and decrypt it later when quantum computers become available. This should factor into whether you decide to require forward secrecy, for example, requiring that encrypted messages stay unreadable to unauthorized viewers for at least five years.

The good news is that many individuals and companies may not have to do anything. Today, some products are starting to integrate PQC algorithms. Google recently integrated CRYSTALS-Kyber into version 116 of Chromium, which means that Google Chrome and other browsers based on Chromium will have PQC support. On the server side, some cloud providers like Google, Amazon and Cloudflare have enabled the same algorithm on the server so that this traffic will be forward-secret in the future.

For instance, Figure 4 shows what Cloudflare's test page for PQC negotiation looked like before the upgrade of Chrome, while Figure 5 shows what it looks like after the upgrade and after manually activating the Kyber (called X25519Kyber768Draft00) algorithm.

Figure 4. Cloudflare's test page for PQC negotiation before Chrome's upgrade

Figure 5. Cloudflare's test page for PQC negotiation after the upgrade and manually activating the Kyber algorithm

Vendors will be doing many of the migrations to PQC for us using nascent libraries or their own implementations; however, enterprises should understand their exposure and threat model to make sure that some legacy applications do not prevent the upgrade. A mature organization should have a plan for upgrading their cryptography, regardless of the reasons.

Ultimately, this is a business strategy decision, and it depends on a company's threat model. If forward secrecy is vital to the organization, early adoption will be essential. If not, enterprises can opt to wait for vendors to implement PQC and roll it out within their systems gradually. An informed decision involving business leadership will ensure its success.

The first stage in upgrading to new algorithms is to identify cryptographic algorithm use. For the purposes of upgrading to post-quantum cryptography, we must first look at existing public key cryptography. We need to know what self-managed cryptography is used, by whom, where, and for what assets. We must also identify the respective value of the assets being encrypted.
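
A cryptographic inventory can start very simply. The sketch below, which assumes the pyca/cryptography package and a placeholder directory of PEM certificates, tallies the public-key algorithms in use; a real inventory would also cover TLS configurations, code-signing keys, VPNs, and embedded devices.

```python
# Minimal sketch of one inventory step: walk a directory of PEM certificates and
# tally the public-key algorithms in use. The directory path is a placeholder for
# wherever certificates live in your environment.
from collections import Counter
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

inventory = Counter()
for pem_file in Path("./certs").glob("*.pem"):            # placeholder location
    cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        inventory[f"RSA-{key.key_size}"] += 1              # quantum-vulnerable
    elif isinstance(key, ec.EllipticCurvePublicKey):
        inventory[f"ECDSA-{key.curve.name}"] += 1          # quantum-vulnerable
    else:
        inventory[type(key).__name__] += 1

for algorithm, count in inventory.most_common():
    print(f"{algorithm}: {count}")
```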

Next, we need to know how systems are communicating with each other: If a server supports PQC but a client does not, an enterprise cannot enforce PQC. While the largest cloud providers are being proactive, some services may not get updated, which needs to be determined.

We know that all PQC algorithms have larger key sizes and require more CPU and memory capacity than current algorithms. The performance impact is of particular concern on servers that need to establish numerous connections. Server hardware will have to be scaled up, or cryptographic co-processors may be needed, if these exist for PQC by then. Some cloud providers offload the TLS connection to specialized servers, which could also alleviate the problem.
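
To get a first feel for these costs, a rough sizing and timing sketch like the one below (reusing the liboqs Python bindings assumed earlier) can be run on representative hardware. The absolute numbers depend entirely on the machine, build flags, and library version, so treat them as indicative only.

```python
# Rough sizing/timing sketch using the liboqs Python bindings. The algorithm
# identifier is an assumption, and results are indicative only.
import time
import oqs

ALG = "Kyber768"   # assumption: enabled in your liboqs build
ROUNDS = 1000

with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()
    with oqs.KeyEncapsulation(ALG) as sender:
        start = time.perf_counter()
        for _ in range(ROUNDS):
            ciphertext, _ = sender.encap_secret(public_key)
        elapsed = time.perf_counter() - start

print(f"{ALG}: public key {len(public_key)} bytes, ciphertext {len(ciphertext)} bytes")
print(f"~{elapsed / ROUNDS * 1e6:.0f} microseconds per encapsulation on this machine")
```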

Enterprises need to understand the risk to their respective businesses. Some questions to consider are:

These evaluations should be collected in a report and used as a reference to help identify further priority actions.

Before execution, the upgrade plan needs to be tested: identify which algorithm each library will negotiate, and enable and monitor logs to make sure a PQC algorithm is actually used where expected. Stress testing connections to see whether the larger keys and lower performance have negative impacts is another important test.

NIST will most likely standardize the PQC algorithms in 2024 or 2025. The German BSI, French ANSSI, and Japanese CRYPTREC are likely to adopt the same guidelines. Until that happens, we don't expect any of the PQC algorithms to be integrated into mainstream cryptographic libraries, but a good time to start planning for post-quantum cryptography is still now.

We cannot say for certain when large enough quantum computers will exist, or even if that goal is attainable within the next few generations. By our estimate, it looks likely that we'll see smaller, still useful machines in a few years. Larger, more universal machines may follow in a decade. While that sounds like a long way away, changing cryptography in an enterprise is no small feat and should be tackled as early as possible. Furthermore, the benefits of better understanding our cryptographic stance are already tangible, especially considering that so much of security rides on good cryptography.



3 Quantum Computing Stocks That Still Have Sky-High Potential – InvestorPlace

Quantum computing will be a game-changer and could create big opportunities for some of the top quantum computing stocks.

In fact, according to McKinsey, it could take computing and the ability to solve complex problems quickly to a whole new level. They also believe it could create a $1.3 trillion opportunity by the time 2035 rolls around.

"Quantum computing is a huge leap forward because complex problems that currently take the most powerful supercomputer several years could potentially be solved in seconds," said Charlie Campbell for Time. "This could open hitherto unfathomable frontiers in mathematics and science, helping to solve existential challenges like climate change and food security."

It could help speed up new drug treatment discoveries. It may even accelerate financial analysis and data processing, and help address climate change, cybersecurity, and other mind-bogglingly complex issues faster than a regular computer.

Even more impressive, the technology is already being referred to as "a revolution for humanity bigger than fire, bigger than the wheel," according to Haim Israel, Head of Thematic Investing Research at Bank of America.

All of which could fuel big upside for quantum computing stocks.


With shares consolidating at $7.87, I'd like to see IonQ (NYSE:IONQ) initially run back to $10 a share.

For one, earnings have been OK. The company just posted a first-quarter loss of 19 cents per share, which beat expectations by six cents. Revenue of $7.6 million, up 77.2% year over year, beat by $600,000. Also, for the full year, revenue is expected to be between $37 million and $41 million, with estimates calling for $39.99 million.

Two, the company is quickly gaining more U.S. defense, technology and university clients. It also expects to increase its computing power from AQ 36 (a metric used to show how useful a quantum computer is at solving real problems) to AQ 64 by 2025.

Three, what's really enticing about IONQ is that we're still in the early stages of growth. When quantum computing does become far bigger than it is now, it could propel this $1.74 billion company to higher highs.


Another one of the top quantum computing stocks to buy is D-Wave Quantum (NYSE:QBTS), which claims to be the world's first commercial supplier of quantum computers.

At the moment, QBTS is sitting at double-bottom support at $1.16. From here, I'd like to see it initially run to about $1.70. Then, once the quantum computing story really starts to heat up, I'd like to see the stock run back to $3.20 from its current price.

Helping the cause, QBTS has a consensus strong buy rating from four analysts, with an average price target of $3. And the stock is set to join the Russell 3000 Index on July 1, which will give it even more exposure to investors. In addition, not long ago, analysts at Needham initiated coverage of QBTS with a buy rating and a price target of $2.50.

Even better, the company just extended its partnership with Aramco to help solve complex geophysical optimization issues with quantum technologies. All of which should draw in a good number of eyeballs to the QBTS stock.


One of the best ways to diversify your portfolio and spend less is with an exchange-traded fund (ETF) like the Defiance Quantum ETF (NYSEARCA:QTUM).

For one, with an expense ratio of 0.40%, the QTUM ETF provides exposure to companies "on the forefront of machine learning, quantum computing, cloud computing and other transformative computing technologies," according to Defiance ETFs.

Two, some of its 71 holdings include MicroStrategy (NASDAQ:MSTR), Nvidia (NASDAQ:NVDA), Micron (NASDAQ:MU), Coherent (NYSE:COHR), Applied Materials (NASDAQ:AMAT) and Rigetti Computing (NASDAQ:RGTI).

Even better, I can gain access to all 71 names for less than $65 with the ETF.

Three, the ETF has been explosive. Since bottoming out around $55, it's now up to $63.37. From that current price, I'd like to see the QTUM ETF race to $70 a share, near term.

On the date of publication, Ian Cooper did not hold (either directly or indirectly) any positions in the securities mentioned. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Ian Cooper, a contributor to InvestorPlace.com, has been analyzing stocks and options for web-based advisories since 1999.


Quantum computers are like kaleidoscopes: why unusual metaphors help illustrate science and technology – The Conversation Indonesia

Quantum computing is like Forrest Gump's box of chocolates: You never know what you're gonna get. Quantum phenomena, the behavior of matter and energy at the atomic and subatomic levels, are not definite, one thing or another. They are opaque clouds of possibility or, more precisely, probabilities. When someone observes a quantum system, it loses its quantum-ness and collapses into a definite state.

Quantum phenomena are mysterious and often counterintuitive. This makes quantum computing difficult to understand. People naturally reach for the familiar to attempt to explain the unfamiliar, and for quantum computing this usually means using traditional binary computing as a metaphor. But explaining quantum computing this way leads to major conceptual confusion, because at a base level the two are entirely different animals.

This problem highlights the often mistaken belief that common metaphors are more useful than exotic ones when explaining new technologies. Sometimes the opposite approach is more useful. The freshness of the metaphor should match the novelty of the discovery.

The uniqueness of quantum computers calls for an unusual metaphor. As a communications researcher who studies technology, I believe that quantum computers can be better understood as kaleidoscopes.

The gap between understanding classical and quantum computers is a wide chasm. Classical computers store and process information via transistors, which are electronic devices that take binary, deterministic states: one or zero, yes or no. Quantum computers, in contrast, handle information probabilistically at the atomic and subatomic levels.

Classical computers use the flow of electricity to sequentially open and close gates to record or manipulate information. Information flows through circuits, triggering actions through a series of switches that record information as ones and zeros. Using binary math, bits are the foundation of all things digital, from the apps on your phone to the account records at your bank and the Wi-Fi signals bouncing around your home.

In contrast, quantum computers use changes in the quantum states of atoms, ions, electrons or photons. Quantum computers link, or entangle, multiple quantum particles so that changes to one affect all the others. They then introduce interference patterns, like multiple stones tossed into a pond at the same time. Some waves combine to create higher peaks, while some waves and troughs combine to cancel each other out. Carefully calibrated interference patterns guide the quantum computer toward the solution of a problem.

The term bit is a metaphor. The word suggests that during calculations, a computer can break up large values into tiny ones, bits of information, which electronic devices such as transistors can more easily process.

Using metaphors like this has a cost, though. They are not perfect. Metaphors are incomplete comparisons that transfer knowledge from something people know well to something they are working to understand. The bit metaphor ignores that the binary method does not deal with many types of different bits at once, as common sense might suggest. Instead, all bits are the same.

The smallest unit of a quantum computer is called the quantum bit, or qubit. But transferring the bit metaphor to quantum computing is even less adequate than using it for classical computing. Transferring a metaphor from one use to another blunts its effect.

The prevalent explanation of quantum computing is that while classical computers can store or process only a zero or one in a transistor or other computational unit, quantum computers supposedly store and handle both zero and one and other values in between at the same time through the process of superposition.

Superposition, however, does not store one or zero or any other number simultaneously. There is only an expectation that the values might be zero or one at the end of the computation. This quantum probability is the polar opposite of the binary method of storing information.

Driven by quantum science's uncertainty principle, the probability that a qubit stores a one or zero is like Schroedinger's cat, which can be either dead or alive, depending on when you observe it. But the two different values do not exist simultaneously during superposition. They exist only as probabilities, and an observer cannot determine when or how frequently those values existed before the observation ended the superposition.

Leaving behind these challenges to using traditional binary computing metaphors means embracing new metaphors to explain quantum computing.

The kaleidoscope metaphor is particularly apt to explain quantum processes. Kaleidoscopes can create infinitely diverse yet orderly patterns using a limited number of colored glass beads, mirror-dividing walls and light. Rotating the kaleidoscope enhances the effect, generating an infinitely variable spectacle of fleeting colors and shapes.

The shapes not only change but can't be reversed. If you turn the kaleidoscope in the opposite direction, the imagery will generally remain the same, but the exact composition of each shape or even their structures will vary as the beads randomly mingle with each other. In other words, while the beads, light and mirrors could replicate some patterns shown before, these are never absolutely the same.

Using the kaleidoscope metaphor, the solution a quantum computer provides, the final pattern, depends on when you stop the computing process. Quantum computing isn't about guessing the state of any given particle but using mathematical models of how the interaction among many particles in various states creates patterns, called quantum correlations.

Each final pattern is the answer to a problem posed to the quantum computer, and what you get in a quantum computing operation is a probability that a certain configuration will result.

Metaphors make the unknown manageable, approachable and discoverable. Approximating the meaning of a surprising object or phenomenon by extending an existing metaphor is a method that is as old as calling the edge of an ax its bit and its flat end its butt. The two metaphors take something we understand from everyday life very well, applying it to a technology that needs a specialized explanation of what it does. Calling the cutting edge of an ax a bit suggestively indicates what it does, adding the nuance that it changes the object it is applied to. When an ax shapes or splits a piece of wood, it takes a bite from it.

Metaphors, however, do much more than provide convenient labels and explanations of new processes. The words people use to describe new concepts change over time, expanding and taking on a life of their own.

When encountering dramatically different ideas, technologies or scientific phenomena, it's important to use fresh and striking terms as windows to open the mind and increase understanding. Scientists and engineers seeking to explain new concepts would do well to seek out originality and master metaphors, in other words, to think about words the way poets do.


Unlock Generous Growth With These 3 Top Quantum Computing Stocks – InvestorPlace

While the technology offers myriad innovations, investors ought to earmark the top quantum computing stocks for the speculative long-term section of their portfolio. Fundamentally, it all comes down to the projected relevance.

According to Grand View Research, the global quantum computing market size reached a valuation of $1.05 billion in 2022. Experts project that the sector could expand at a compound annual growth rate (CAGR) of 19.6% from 2023 to 2030. At the culmination of the forecast period, the segment could print revenue of $4.24 billion.

Better yet, we might be in the early stages. Per McKinsey & Company, quantum technology itself could lead to value creation worth trillions of dollars. Essentially, quantum computers represent a paradigm shift from the classical approach. These devices can perform myriad computations simultaneously, leading to explosive growth in productivity.

Granted, with every pioneering space comes high risk. If you're willing to accept the heat, these are the top quantum computing stocks to consider.


To be sure, Honeywell (NASDAQ:HON) isn't exactly what you would call a direct player among top quantum computing stocks. Rather, the company is an industrial and applied sciences conglomerate, featuring acumen across myriad disciplines. However, Honeywell is very much relevant to the advanced computing world thanks to its investment in Quantinuum.

Earlier this year, Honeywell's quantum computing enterprise reached a valuation of $5 billion following a $300 million equity funding round, per Reuters. Notably, JPMorgan Chase (NYSE:JPM) helped anchor the investment. According to the news agency, "[c]ompanies are exploring ways to develop and scale quantum capabilities to solve complex problems such as designing and manufacturing hydrogen cell batteries for transportation."

Honeywell could play a big role in the applied capabilities of quantum computing, making it a worthwhile long-term investment. To be fair, it's not the most exciting play in the world. Analysts rate shares a consensus moderate buy with an average price target of $229.21. That implies about 10% upside.

Still, Honeywell isn't likely to implode either. As you build your portfolio of top quantum computing stocks, it may pay to have a reliable anchor like HON.


Getting into the more exciting plays among top quantum computing stocks, we have IonQ (NYSE:IONQ). Based in College Park, Maryland, IonQ mainly falls under the computer hardware space. Per its public profile, the company engages in the development of general-purpose quantum computing systems. Business-wise, IonQ sells access to quantum computers of various qubit capacities.

Analysts are quite optimistic about IONQ stock, rating shares a consensus strong buy. Further, the average price target comes in at $16.63, implying over 109% upside potential. That's not all: the most optimistic target calls for a price per share of $21. If so, we would be talking about a return of over 164%. Of course, with a relatively modest market capitalization of $1.68 billion, IONQ is a high-risk entity.

Even with the concerns, including an expansion of red ink for fiscal 2024, covering experts believe the growth narrative could overcome the anxieties. In particular, they're targeting revenue of $39.47 million, implying 79.1% upside from last year's print of $22.04 million. What's more, fiscal 2025 sales could see a gargantuan leap to $82.38 million. It's one of the top quantum computing stocks to keep on your radar.


Headquartered in Berkeley, California, Rigetti Computing (NASDAQ:RGTI), through its subsidiaries, builds quantum computers and superconducting quantum processors. In particular, Rigetti offers a cloud-based solution under a quantum processing umbrella. It also sells access to its groundbreaking computers through a business model called Quantum Computing as a Service.

While intriguing, RGTI stock is high risk. The reality is that the enterprise features a market cap of a little over $175 million. That translates to a per-share price of two pennies over a buck. With such a diminutive profile, anything can happen. Still, it's tempting because analysts rate shares a unanimous strong buy. Also, the average price target lands at $3, implying over 194% upside potential.

What's even more enticing are the financial projections. Covering experts believe that Rigetti will post a loss per share of 41 cents. That's an improvement over last year's loss of 57 cents. Further, revenue could hit $15.3 million, up 27.4% from the prior year. And in fiscal 2025, sales could soar to $28.89 million, up nearly 89% from projected 2024 revenue.

If you can handle the heat, RGTI is one of the top quantum computing stocks to consider.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare. Tweet him at @EnomotoMedia.


Quantum control’s role in scaling quantum computing – McKinsey

June 14, 2024, by Henning Soller and Niko Mohr with Elisa Becker-Foss, Kamalika Dutta, Martina Gschwendtner, Mena Issler, and Ming Xu

Quantum computing can leverage the states of entangled qubits1 to solve problems that classical computing cannot currently solve and to substantially improve existing solutions. These qubits, which are typically constructed from photons, atoms, or ions, can only be manipulated using specially engineered signals with precisely controlled energy that is barely above that of a vacuum and that changes within nanoseconds. This control system for qubits, referred to as quantum control, is a critical enabler of quantum computing because it ensures quantum algorithms perform with optimal efficiency and effectiveness.

While the performance and scaling limitations of current quantum control systems preclude large-scale quantum computing, several promising technological innovations may soon offer scalable control solutions.

A modern quantum computer comprises various hardware and software components, including quantum control components that require extensive space and span meters. In quantum systems, qubits interact with the environment, causing decoherence and decay of the encoded quantum information. Quantum gates (the building blocks of quantum circuits) cannot be implemented perfectly at the physical system level, resulting in accumulated noise. Noise leads to decoherence, which degrades qubits' superposition and entanglement properties. Quantum control minimizes the quantum noise (for example, thermal fluctuations and electromagnetic interference) caused by the interaction between the quantum hardware and its surroundings. Quantum control also addresses noise by improving the physical isolation of qubits, using precise control techniques, and implementing quantum error correction codes. Control electronics use signals from the classical world to provide instructions for qubits, while readout electronics measure qubit states and transmit that information back to the classical world. Thus, the control layer in a quantum technology stack is often referred to as the interface between the quantum and classical worlds.
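
To illustrate the error-correction part of this loop, here is a toy NumPy simulation of the textbook three-qubit bit-flip code. It is a conceptual sketch only: real systems use much larger codes, and the syndrome measurement and feedback shown here as matrix algebra are exactly the kind of real-time task the control electronics must perform.

```python
# Toy illustration of quantum error correction: the three-qubit bit-flip code,
# simulated with plain NumPy state vectors. It shows the principle of locating an
# error via parity checks without measuring the encoded data itself.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def on_qubit(op, i, n=3):
    """Apply `op` to qubit i of an n-qubit register (identity elsewhere)."""
    full = np.array([[1.0 + 0j]])
    for j in range(n):
        full = np.kron(full, op if j == i else I2)
    return full

# Encode a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8
logical = np.zeros(8, dtype=complex)
logical[0b000], logical[0b111] = a, b

# A bit-flip error (noise) hits one randomly chosen qubit.
flipped = np.random.randint(3)
noisy = on_qubit(X, flipped) @ logical

# Syndrome measurement: parities Z0Z1 and Z1Z2 locate the error without
# collapsing the encoded superposition.
s1 = int(round(np.real(noisy.conj() @ (on_qubit(Z, 0) @ on_qubit(Z, 1) @ noisy))))
s2 = int(round(np.real(noisy.conj() @ (on_qubit(Z, 1) @ on_qubit(Z, 2) @ noisy))))
correction = {(-1, +1): 0, (-1, -1): 1, (+1, -1): 2, (+1, +1): None}[(s1, s2)]

recovered = noisy if correction is None else on_qubit(X, correction) @ noisy
assert np.allclose(recovered, logical)
print(f"error on qubit {flipped}, syndrome ({s1}, {s2}), corrected qubit {correction}")
```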

Components of the control layer include the following:

A superconducting- or spin-qubit-based computer, for example, includes physical components such as quantum chips, cryogenics (cooling electronics), and control and readout electronics.

Quantum computing requires precise control of qubits and manipulation of physical systems. This control is achieved via signals generated by microwaves, lasers, and optical fields or other techniques that support the underlying qubit type. A tailored quantum control system is needed to achieve optimal algorithm performance.

In the context of a quantum computing stack, control typically refers to the hardware and software system that connects to the qubits the application software uses to solve real-world problems such as optimization and simulation (Exhibit 1).

At the top of the stack, software layers translate real-world problems into executable instructions for manipulating qubits. The software layer typically includes middleware (such as a quantum transpiler2) and control software comprising low-level system software that provides compilation, instrument control, signal generation, qubit calibration, and dynamical error suppression.3 Below the software layer is the hardware layer, where high-speed electronics and physical components work together to send signals to and read signals from qubits and to protect qubits from noise. This is the layer where quantum control instructions are executed.

Quantum control hardware systems are highly specialized to accommodate the intricacies of qubits. Control hardware interfaces directly with qubits, generating and reading out extremely weak and rapidly changing electromagnetic signals that interact with qubits. To keep qubits functioning for as long as possible, control hardware systems must be capable of adapting in real time to stabilize the qubit state (feedback calibration) and of correcting qubits before they decay to a completely decoherent state4 (quantum error correction).

Although all based on similar fundamental principles of quantum control, quantum control hardware can differ widely depending on the qubit technology with which it is designed to be used (Exhibit 2).

For example, photonic qubits operate at optical frequencies (similar to fiber internet), while superconducting qubits operate at microwave frequencies (similar to a fifth-generation network). Different types of hardware using laser technology or electronic circuits are needed to generate, manipulate, and transmit signals to and from these different qubit types. Additional hardware may be needed to provide environmental control. Cryostats, for example, cool superconducting qubits to keep them in a working state, and ion trap devices are used in trapped-ion qubit systems to confine ions using electromagnetic fields.

Quantum control is critical to enable fault-tolerant quantum computing, that is, quantum computing in which as many errors as possible are prevented or suppressed. But realizing this capability on a large scale will require substantial innovation. Existing control systems are designed for a small number of qubits (1 to 1,000) and rely on customized calibration and dedicated resources for each qubit. A fault-tolerant quantum computer, on the other hand, needs to control 100,000 to 1,000,000 qubits simultaneously. Consequently, a transformative approach to quantum control design is essential.

Specifically, to achieve fault-tolerant quantum computing on a large scale, there must be advances to address issues with current state-of-the-art quantum control system performance and scalability, as detailed below.

Equipping quantum systems to perform at large scales will require the following:

The limitations that physical space poses and the cost to power current quantum computing systems restrict the number of qubits that can be controlled with existing architecture, thus hindering large-scale computing.

Challenges to overcoming these restrictions include the following:

Several technologies show promise for scaling quantum control, although many are still in early-research or prototyping stages (Exhibit 3).

Multiplexing could help reduce costs and prevent overheating. The cryogenic complementary metal-oxide-semiconductor (cryo-CMOS) approach also helps mitigate overheating; it is the most widely used approach across industries because it is currently the most straightforward way to add control lines, and it works well in a small-scale R&D setup. However, cryo-CMOS is close to reaching the maximum number of control lines, creating form factor and efficiency challenges to scaling. Even with improvements, the number of control lines would only be reduced by a few orders of magnitude, which is not sufficient for scaling to millions of qubits. Another option to address overheating is single-flux quantum technology, while optical links for microwave qubits can increase efficiency in interconnections as well as connect qubits between cryostats.

Whether weighing options to supply quantum controls solutions or to invest in or integrate quantum technologies into companies in other sectors, leaders can better position their organizations for success by starting with a well-informed and strategically focused plan.

The first strategic decision leaders in the quantum control sector must make is whether to buy or build their solutions. While various levels of quantum control solutions can be sourced from vendors, few companies specialize in control, and full-stack solutions for quantum computing are largely unavailable. The prevailing view is that vendors can offer considerable advantages in jump-starting quantum computing operations, especially those with complex and large-scale systems. Nevertheless, a lack of industrial standardization means that switching between quantum control vendors could result in additional costs down the road. Consequently, many leading quantum computing players opt to build their own quantum control.

Ideally, business leaders also determine early on which parts of the quantum tech stack to focus their research capacities on and how to benchmark their technology. To develop capabilities and excel in quantum control, it is important to establish KPIs that are tailored to measure how effectively quantum control systems perform to achieve specific goals, such as improved qubit fidelity.5 This allows for the continuous optimization and refinement of quantum control techniques to improve overall system performance and scalability.
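
As a minimal example of such a metric, the sketch below computes the fidelity between a target single-qubit state and the state actually prepared, F = |⟨ψ_target|ψ_actual⟩|² for pure states; the example states are arbitrary placeholders.

```python
# Minimal pure-state fidelity example; the target and actual states are placeholders.
import numpy as np

target = np.array([1, 1]) / np.sqrt(2)            # ideal |+> state
actual = np.array([np.cos(0.8), np.sin(0.8)])     # slightly miscalibrated preparation
fidelity = abs(np.vdot(target, actual)) ** 2      # F = |<target|actual>|^2
print(f"state fidelity: {fidelity:.4f}")          # 1.0 means a perfect match
```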

Quantum control is key to creating business value. Thus, the maturity and scalability of control solutions are the chief considerations for leaders exploring business development related to quantum computing, quantum solutions integration, and quantum technologies investment. In addition to scalability (the key criterion for control solutions), leaders will need to consider and address the other control technology challenges noted previously. And as control technologies mature from innovations to large-scale solutions, establishing metrics for benchmarking them will be essential to assess, for example, ease of integration, cost effectiveness, error-suppression effectiveness, software offerings, and the possibility of standardizing across qubit technologies.

Finally, given the shortage of quantum talent, recruiting and developing the highly specialized capabilities needed for each layer of the quantum stack is a top priority to ensure quantum control systems are properly developed and maintained.

Henning Soller is a partner in McKinsey's Frankfurt office, and Niko Mohr is a partner in the Düsseldorf office. Elisa Becker-Foss is a consultant in the New York office, Kamalika Dutta is a consultant in the Berlin office, Martina Gschwendtner is a consultant in the Munich office, Mena Issler is an associate partner in the Bay Area office, and Ming Xu is a consultant in the Stamford office.

1 Entangled qubits are qubits that remain in a correlated state in which changes to one affect the other, even if they are separated by long distances. This property can enable massive performance boosts in information processing.
2 A quantum transpiler converts code from one quantum language to another while preserving and optimizing functionality to make algorithms and circuits portable between systems and devices.
3 Dynamical error suppression is one approach to suppressing quantum error and involves the periodic application of control pulse sequences to negate noise.
4 A qubit in a decoherent state is losing encoded quantum information (superposition and entanglement properties).
5 Qubit fidelity is a measure of the accuracy of a qubit's state or the difference between its current state and the desired state.
