Archive for the ‘Quantum Computer’ Category

The 2025 Millionaire’s Club: 3 Quantum Computing Stocks to Buy Now – InvestorPlace

The current rage is artificial intelligence, but the advancement of the AI field relies on a few key technologies, including quantum computing.


If you're on the hunt for quantum computing stocks to buy, you're in the right place. For the past several years, artificial intelligence (AI) has taken center stage, not just in the tech field but also in the stock market. AI advancement has been tremendous, allowing businesses both large and small to automate some of their processes. Some of the largest companies have ramped up their investment in AI teams and divisions, committing billions of dollars in additional capital just to keep up with others in the field.

However, there is a newcomer to the field that is independent of AI but will complement it in the future: quantum computing. But what is it exactly? Quantum computing uses specialized algorithms and hardware, grounded in quantum mechanics, to solve complex problems that typical computers would either take too long to solve or cannot solve at all. Although the worlds of quantum computing and AI are incredibly complex, we have simplified investing in the field by narrowing it down to three companies that are at the forefront of the industry while still being diversified.


Nvidia (NASDAQ:NVDA) is a US-based, internationally leading designer and manufacturer of graphics processing units. Although the company's main focus is currently the AI market, it also has a division dedicated to the quantum computing industry. The stock currently trades at about $924, with a price target of $1,100, roughly $175 above the current trading price, implying meaningful upside potential for Nvidia.

The company accelerates quantum computing centers around the world with its proprietary CUDA-Q platform. The platform also ties quantum computing into AI, allowing such systems to tackle countless new problems much faster than before.

The stock currently trades at 36.53x forward earnings, about 20% below the stock's own five-year average forward price-to-earnings (P/E) ratio of 46.14x. Thus, considering what the stock usually trades for, it might be relatively undervalued and at a good point to scoop up some shares.
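As a quick check of that discount, using only the figures quoted above:

\[
1 - \frac{36.53}{46.14} \approx 0.21
\]

i.e., the forward multiple sits roughly a fifth below the stock's own five-year average.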

The company that goes hand in hand with the internet is Alphabet (NASDAQ:GOOG, NASDAQ:GOOGL). The American company started as a search engine in the late '90s, with its main goal of creating the perfect search engine. Fast forward 25 years and it is now a multi-trillion-dollar international company with divisions spanning tech, consumer electronics, data, AI, e-commerce and quantum computing. The company's stock currently trades at about $177, with an average 12-month price target of $195 and a high estimate of $225.

In recent years, it has set out to apply quantum computing to otherwise impossible problems through its Quantum AI effort and the introduction of XPRIZE Quantum Applications, a program designed to advance quantum computing algorithms with real-world applications.

As such, the company is growing quickly, with EPS forecast to rise from $5.80 last year to over $7.84 by 2025. This makes it a great pick for any investor.

Intel (NASDAQ:INTC) has specialized in semiconductors since its founding in Mountain View, California, in 1968. The company is one of the world's largest manufacturers of semiconductors and CPUs. Intel's stock is at about $32 and the average price target is $39.63, with a low of $17 and a high of $68, implying an average upside potential of almost 24%.

The company has invested heavily in quantum computing over the past several years and is currently putting its expertise to good use, creating "hot" silicon spin qubits. Qubits are the quantum counterparts of classical bits; Intel's silicon spin qubits behave very differently from typical transistors and can operate at higher temperatures than many competing qubit types.

The company is working diligently on integrating these qubits into quantum computing chips that can be used to advance countless fields, while also working with AI systems. All this work is not for nothing: the company is translating it into earnings growth, with EPS expected to rise from $1.01 at the end of this year to $1.80 by the end of 2025. As such, this stock should be on any investor's watch list.

On the date of publication, Ian Hartana and Vayun Chugh did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Chandler Capital is the work of Ian Hartana and Vayun Chugh. Ian Hartana and Vayun Chugh are both self-taught investors whose work has been featured in Seeking Alpha. Their research primarily revolves around GARP stocks with a long-term investment perspective encompassing diverse sectors such as technology, energy, and healthcare.


Quantum Computing Enters the Atomic Realm – Optics & Photonics News

Atom-based architectures may have a scalability advantage over other platforms in the quest to build more powerful quantum processors.

An experimental scheme demonstrated by researchers at Princeton and Yale universities is able to convert physical noise into errors that can be corrected more easily. [F. Wojciechowski / Princeton University]

Quantum computers built from arrays of ultracold atoms have recently emerged as a serious contender in the quest to create qubit-powered machines that can outperform their classical counterparts. While other hardware architectures have yielded the first fully functioning quantum processors to be available for programming through the cloud, recent developments suggest that atom-based platforms might have the edge when it comes to future scalability.


That scalability advantage stems from the exclusive use of photonic technologies to cool, trap and manipulate the atomic qubits. Side-stepping the need for complex cryogenic systems or the intricacies of chip fabrication, neutral-atom quantum computers can largely be built from existing optical components and systems that have already been optimized for precision and reliability.

"The traps are optical tweezers, the atoms are controlled with laser beams and the imaging is done with a camera," says Jeff Thompson, a physicist at Princeton University, USA, whose team has been working to build a quantum computer based on arrays of ytterbium atoms. "The scalability of the platform is limited only by the engineering that can be done with the optical system, and there is a whole industry of optical components and megapixel devices where much of that work has already been done."

Jeff Thompson and his team at Princeton University, USA, have pioneered the use of ytterbium atoms to encode and manipulate quantum information. [S.A. Khan / Fotobuddy]

Such ready availability of critical components and systems has enabled both academic groups and commercial companies to scale their quantum processors from tens of atomic qubits to several hundred in the space of just a few years. Then, in November 2023, the California-based startup Atom Computing announced that it had populated a revamped version of its commercial system with almost 1,200 qubits, more than had yet been reported for any hardware platform. "It's exciting to be able to showcase the solutions we have been developing for the past several years," says Ben Bloom, who founded the company in 2018 and is now its chief technology officer. "We have demonstrated a few firsts along the way, but while we have been building, the field has been getting more and more amazing."

Neutral atoms offer many appealing characteristics for encoding quantum information. For a start, they are all identical, completely free of any imperfections that may be introduced through fabrication, which means that they can be controlled and manipulated without the need to tune or calibrate individual qubits. Their quantum states and interactions are also well understood and characterized, while crucial quantum properties such as superposition and entanglement are maintained over long enough timescales to perform computational tasks.

However, early attempts to build quantum computers from neutral atoms met with two main difficulties. The first was the need to extend existing methods for trapping single atoms in optical tweezers to create large-scale atomic arrays. Although technologies such as spatial light modulators enable laser beams to produce a regular pattern of microtraps, loading the atoms into the tweezers is a stochastic process, which means that the probability of each trap being occupied is 50%. As a result, the chances of creating a defect-free array containing large numbers of atoms become vanishingly small.
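A back-of-the-envelope sketch makes the point (assuming, as described above, an independent 50% fill probability per trap):

```python
# Each tweezer fills independently with probability 0.5 (the assumption stated above),
# so the chance that every one of N sites is occupied is simply 0.5**N.
for n_sites in (10, 100, 300):
    print(f"{n_sites:4d} traps -> P(all filled) = {0.5 ** n_sites:.2e}")
```

Even at 100 traps the odds of a perfect array are below one in 10^30, which is why passive loading alone cannot scale.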

The solution came in 2016, when three separate groups, based at the Institut d'Optique, France; Harvard University, USA; and the Korea Advanced Institute of Science and Technology (KAIST), Republic of Korea, demonstrated a concept called rearrangement. In this scheme, an image is taken of the atoms when they are first loaded into the tweezers, which identifies which sites are occupied and which are empty. All the vacant traps are switched off, and then the loaded ones are moved to fill the gaps in the array. This shuffling procedure can be achieved, for example, by using acousto-optic deflectors to alter the positions of the trapping laser beams, creating dynamic optical tweezers that can be combined with real-time control to assemble large arrays of single atoms in less than a second.
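A toy one-dimensional version of that rearrangement bookkeeping might look like the following (illustrative only: real systems work in two or three dimensions, with acousto-optic deflectors and real-time control handling the physical moves, and the sizes below are assumed):

```python
import random

N_TRAPS, N_TARGET = 100, 40                      # assumed sizes, not from the article

occupied = [random.random() < 0.5 for _ in range(N_TRAPS)]   # stochastic ~50% loading
loaded_sites = [i for i, filled in enumerate(occupied) if filled]

if len(loaded_sites) < N_TARGET:
    raise RuntimeError("not enough atoms loaded; repeat the loading step")

# Move surplus atoms into the first N_TARGET sites to build a defect-free block.
moves = []                                        # (from_site, to_site) tweezer moves
donors = [i for i in loaded_sites if i >= N_TARGET]
for target in range(N_TARGET):
    if not occupied[target]:
        source = donors.pop()                     # take an atom from outside the target block
        moves.append((source, target))
        occupied[source], occupied[target] = False, True

print(f"{len(loaded_sites)} atoms loaded, {len(moves)} moves to fill a {N_TARGET}-site block")
assert all(occupied[:N_TARGET])
```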

Large defect-free arrays of single atoms can be created through the process of rearrangement. In this example, demonstrated by a team led by Antoine Browaeys of the Institut d'Optique, France, an ordered array of 324 atoms was created from 625 randomly filled traps. [Reprinted with permission from K.-N. Schymik et al., Phys. Rev. A 106, 022611 (2022); © 2022 by the American Physical Society]

"Before that, there were lots of complicated ideas for generating single-atom states in optical tweezers," remembers Thompson. "This rearrangement technique enabled the creation of large arrays containing one hundred or so single atoms without defects, and that has since been extended to much higher numbers."

In these atomic arrays, the qubits are encoded in two long-lived energy states that are controlled with laser light. In rubidium, for example, which is often used because its well-understood atomic transitions can be manipulated relatively easily, the single outermost electron occupies one of two distinct energy levels in the ground state, caused by the coupling between the electron spin and the nuclear spin. The atoms are easily switched between these two energy states by flipping the spins relative to each other, which is achieved with microwave pulses tuned to 6.8 GHz.
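The switching itself is resonant Rabi driving. A minimal sketch is below; the Rabi frequency is an assumed illustrative value (the 6.8 GHz figure above is the transition frequency between the two hyperfine levels, not the drive strength):

```python
import numpy as np

OMEGA = 2 * np.pi * 10e3   # assumed Rabi frequency of 10 kHz (rad/s); hardware dependent

def excitation_probability(t: float, omega: float = OMEGA) -> float:
    """Probability that a qubit starting in |0> is found in |1> after resonant driving for time t."""
    return np.sin(omega * t / 2) ** 2

t_pi = np.pi / OMEGA                      # a "pi pulse" fully flips the spin
print(f"pi-pulse duration: {t_pi * 1e6:.1f} us")
print(f"flip probability:  {excitation_probability(t_pi):.3f}")   # ~1.0
```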

While atoms in these stable low-energy levels offer excellent single-qubit properties, the gate operations that form the basis of digital computation require the qubits to interact and form entangled states. Since the atoms in a tweezer array are too far apart for them to interact while remaining in the ground state, a focused laser beam is used to excite the outermost electron into a much higher energy state. In these highly excited Rydberg states, the atom becomes physically much larger, generating strong interatomic interactions on sub-microsecond timescales.

One important effect of these interactions is that the presence of a Rydberg atom shifts the energy levels in its nearest neighbors, preventing them from being excited into the same high-energy state. This phenomenon, called the Rydberg blockade, means that only one of the atoms excited by the laser will form a Rydberg state, but it's impossible to know which one. Such shared excitations are the characteristic feature of entanglement, providing an effective mechanism for controlling two-qubit operations between adjacent atoms in the array.
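A rough sketch of the blockade condition, with illustrative numbers (the van der Waals coefficient and Rabi frequency below are assumed, order-of-magnitude values, not figures from the article): two atoms closer than the blockade radius cannot both be excited, because the interaction shift C6/r^6 exceeds the excitation bandwidth set by the laser.

```python
import numpy as np

HBAR = 1.054571817e-34                 # J*s
C6 = 5e-58                             # J*m^6, assumed van der Waals coefficient for a high Rydberg state
OMEGA = 2 * np.pi * 1e6                # rad/s, assumed Rabi frequency of the Rydberg excitation laser

# Blockade radius: distance at which the interaction shift equals hbar * Omega.
r_blockade = (C6 / (HBAR * OMEGA)) ** (1 / 6)
print(f"blockade radius ~ {r_blockade * 1e6:.1f} um")   # a few micrometers for these values
```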

Until recently, however, the logic gates created through two-atom entanglement were prone to errors. "For a long time, the fidelity of two-qubit operations hovered at around 80%, much lower than could be achieved with superconducting or trapped-ion platforms," says Thompson. "That meant that neutral atoms were not really taken seriously for gate-based quantum computing."

The sources of these errors were not fully understood until 2018, when breakthrough work by Antoine Browaeys and colleagues at the Institut d'Optique and Mikhail Lukin's team at Harvard University analyzed the effects of laser noise on the gate fidelities. "People had been using very simple models of the laser noise," says Thompson. "With this work, they figured out that phase fluctuations were the major contributor to the high error rates."

At a stroke, these two groups showed that suppressing the laser phase noise could extend the lifetime of the Rydberg states and boost the fidelity of preparing two-qubit entangled states to 97%. Further enhancements since then have yielded two-qubit gate fidelities of more than 99%, the minimum threshold for fault-tolerant quantum computing.


That fundamental advance established atomic qubits as a competitive platform for digital quantum computing, catalyzing academic groups and quantum startups to explore and optimize the performance of different atomic systems. While rubidium continues to be a popular choice, several groups believe that ytterbium could offer some crucial benefits for large-scale quantum computing. "Ytterbium has a nuclear spin of one half, which means that the qubit can be encoded purely in the nuclear spin," explains Thompson. "While all qubits based on atoms or ions have good coherence by default, we have found that pure nuclear-spin qubits can maintain coherence times of many seconds without needing any special measures."

Pioneering experiments in 2022 by Thompson's Princeton group, as well as by a team led by Adam Kaufman at JILA in Boulder, CO, USA, first showed the potential of the ytterbium-171 isotope for producing long-lived atomic qubits. Others have followed their lead, with Atom Computing replacing the strontium atoms in its original prototype with ytterbium-171 in the upgraded 1,200-qubit platform. "Strontium also supports nuclear qubits, but we found that we needed to do lots of quantum engineering to achieve long coherence times," says Bloom. "With ytterbium, we can achieve coherence times of tens of seconds without the need for any of those extra tricks."

Atom Computing's first-generation quantum computer exploited around 100 qubits encoded in single strontium atoms, while its next-generation platform can accommodate around 1,200 ytterbium atoms. [Atom Computing]

The rich energy-level structure of ytterbium also provides access to a greater range of atomic transitions from the ground state, offering new ways to manipulate and measure the quantum states. Early experiments have shown, for example, that this additional flexibility can be exploited to measure some of the qubits while a quantum circuit is being run but without disturbing the qubits that are still being used for logical operations.

Indeed, the ability to perform these mid-circuit measurements is a critical requirement for emerging schemes to locate and correct physical errors in the system, which have so far compromised the ability of quantum computers to perform complex computations. These physical errors are caused by noise and environmental factors that perturb the delicate quantum states, with early estimates suggesting that millions of physical qubits might be needed to provide the redundancy required for fault-tolerant quantum processing.

More recently, however, it has become clear that fewer qubits may be needed if the physical system can be engineered to limit the impact of the errors. One promising approach is the concept of erasure conversion, demonstrated in late 2023 by a team led by Thompson and Shruti Puri at Yale University, USA, in which the physical noise is converted into errors with known locations, also called erasures.

In their scheme, the qubits are encoded in two metastable states of ytterbium, for which most errors will cause them to decay back to the ground state. Importantly, those transitions can easily be detected without disturbing the qubits that are still in the metastable state, allowing failures to be spotted while the quantum processor is still being operated. "We just flash the atomic array with light after a few gate operations, and any light that comes back illuminates the position of the error," explains Thompson. "Just being able to see where they are could ultimately reduce the number of qubits needed for error correction by a factor of ten."

Experiments by the Princeton researchers show that their method can currently locate 56% of the errors in single-qubit gates and 33% of those in two-qubit operations, which can then be discarded to reduce the effects of physical noise. The team is now working to increase the fidelity that can be achieved when using these metastable states for two-qubit operations, which currently stands at 98%.
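A crude Monte-Carlo sketch shows why locating errors helps: runs in which a flagged error occurred are simply discarded, so only unflagged errors remain in the post-selected data. The physical error rate below is an assumed illustrative value; the 56% flag fraction is the single-qubit figure quoted above.

```python
import random

N_RUNS = 100_000
P_ERROR = 0.02            # assumed physical error rate per run
F_FLAGGED = 0.56          # fraction of errors converted into detectable erasures

kept, kept_with_error = 0, 0
for _ in range(N_RUNS):
    error = random.random() < P_ERROR
    flagged = error and (random.random() < F_FLAGGED)
    if flagged:
        continue          # the flash of light revealed the error; discard this run
    kept += 1
    kept_with_error += error

print(f"raw error rate:           {P_ERROR:.3f}")
print(f"post-selected error rate: {kept_with_error / kept:.3f}")
print(f"fraction of runs kept:    {kept / N_RUNS:.3f}")
```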

A team led by Mikhail Lukin (right) at Harvard University, USA, pictured with lab member Dolev Bluvstein, created the first programmable logical quantum processor, capable of encoding up to 48 logical qubits. [J. Chase / Harvard Staff Photographer]

Meanwhile, Lukin's Harvard group, working with several academic collaborators and Boston-based startup QuEra Computing, has arguably made the closest approach yet to error-corrected quantum computing. One crucial step forward is the use of so-called logical qubits, which mitigate the effects of errors by sharing the quantum information among multiple physical qubits.

Previous demonstrations with other hardware platforms have yielded one or two logical qubits, but Lukin and his colleagues showed at the end of 2023 that they could create 48 logical qubits from 280 atomic qubits. They used optical multiplexing to illuminate all the rubidium atoms within a logical qubit with identical light beams, allowing each logical block to be moved and manipulated as a single unit. Since each atom in the logical block is addressed independently, this hardware-efficient control mechanism prevents any errors in the physical qubits from escalating into a logical fault.
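The underlying redundancy idea can be illustrated with a deliberately simple classical analogue, a three-bit repetition code. This is not the code used in the Harvard/QuEra work, which employs genuinely quantum error-correcting codes, and the error rate below is assumed, but it shows how spreading one logical bit across several physical bits suppresses errors.

```python
import random

P_FLIP = 0.05                      # assumed physical error rate

def noisy_copy(bit: int) -> int:
    """Copy a bit, flipping it with probability P_FLIP."""
    return bit ^ (random.random() < P_FLIP)

def logical_readout(bit: int) -> int:
    physical = [noisy_copy(bit) for _ in range(3)]      # encode the bit three times, let noise act
    return int(sum(physical) >= 2)                      # majority vote outvotes a single flip

trials = 100_000
logical_errors = sum(logical_readout(0) != 0 for _ in range(trials))
print(f"physical error rate: {P_FLIP:.4f}")
print(f"logical error rate:  {logical_errors / trials:.4f}")   # ~3p^2, noticeably smaller
```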

For more-scalable processing of these logical qubits, the researchers also divided their architecture into three functional zones. The first is used to store and manipulate the logical qubits, along with a reservoir of physical qubits that can be mobilized on demand, ensuring that these stable quantum states are isolated from processing errors in other parts of the hardware. Pairs of logical qubits can then be moved, or shuttled, into the second entangling zone, where a single excitation laser drives two-qubit gate operations with a fidelity of more than 99.5%. In the final readout zone, the outcome of each gate operation is measured without affecting the ongoing processing tasks.

Schematic of the logical processor, split into three zones: storage, entangling and readout. Logical single-qubit and two-qubit operations are realized transversally with efficient, parallel operations. [D. Bluvstein et al., Nature 626, 58 (2024); CC-BY-NC 4.0]

The team also configured error-resistant quantum circuits to run on the logical processor, which in one example yielded a fidelity of 72% when operating on 10 logical qubits, increasing to 99% when the gate errors detected in the readout zone at the end of each operation were discarded. When running more complex quantum algorithms requiring hundreds of logical gates, the performance was up to 10 times better when logical qubits were used instead of their single-atom counterparts.

While this is not yet full error correction, which would require the faults to be detected and reset in real time, this demonstration shows how a logical processor can work in tandem with error-resistant software to improve the accuracy of quantum computations. The fidelities that can be achieved could be improved still further by sharing the quantum information among more physical qubits, with QuEra's technology roadmap suggesting that by 2026 it will be using as many as 10,000 single atoms to generate 100 logical qubits. "This is a truly exciting time in our field as the fundamental ideas of quantum error correction and fault tolerance start to bear fruit," Lukin commented. "Although there are still challenges ahead, we expect that this new advance will greatly accelerate the progress toward large-scale, useful quantum computers."

In another notable development, QuEra has also won a multimillion-dollar contract to build a version of this logical processor at the UK's National Quantum Computing Centre (NQCC). The QuEra system will be one of seven prototype quantum computers to be installed at the national lab by March 2025, with others including a cesium-based neutral-atom system from Infleqtion (formerly ColdQuanta) and platforms exploiting superconducting qubits and trapped ions.

Once built, these development platforms will be used to understand and benchmark the capabilities of different hardware architectures, explore the types of applications that suit each one, and address the key scaling challenges that stand in the way of fault-tolerant quantum computing. "We know that much more practical R&D will be needed to bridge the gap between currently available platforms and a fully error-corrected neutral-atom quantum computer with hundreds of logical qubits," says Nicholas Spong, who leads the NQCC's activities in tweezer-array quantum computing. "For neutral-atom architectures, the ability to scale really depends on engineering the optics, lasers and control systems."

Researchers at the Boston-based startup QuEra, which collaborates on neutral-atom quantum computing with Mikhail Lukins group at Harvard University, USA. [Courtesy of QuEra]

One key goal for hardware developers will be to achieve the precision needed to control the spin rotations of individual atoms as they become more closely packed into the array. While global light fields and qubit shuttling provide efficient and precise control mechanisms for bulk operations, single-atom processes must typically be driven by focused laser beams operating on the scale of tens of nanometers.

To relax the strict performance criteria for these local laser beams, Thompson's group has demonstrated an alternative solution that works for divalent atoms such as ytterbium. "We still have a global gate beam, but then we choose which atoms experience that gate by using a focused laser beam to shift specific atoms out of resonance with the global light field," he explains. "It doesn't really matter how big the light shift is, which means that this approach is more robust to variations in the laser. Being able to control small groups of atoms in this way is a lot faster than moving them around."

Another key issue is the number of single atoms that can be held securely in the tweezer array. Current roadmaps suggest that arrays containing 10,000 atoms could be realized by increasing the laser power, but scaling to higher numbers could prove tricky. "It's a challenge to get hundreds of watts of laser power into the traps while maintaining coherence across the array," explains Spong. "The entire array of traps should be identical, but imperfect optics makes it hard to make the traps around the edge work as well as those in the center."

With that in mind, the team at Atom Computing has deployed additional optical technologies in its updated platform to provide a pathway to larger-scale machines. "If we wanted to go from 100 to 1,000 qubits, we could have just bought some really big lasers," says Bloom. "But we wanted to get on a track where we can continue to expand the array to hundreds of thousands of atoms, or even a million, without running into issues with the laser power."

A quantum engineer measures the optical power of a laser beam at Atom Computing's research and development facility in Boulder, CO, USA. [Atom Computing]

The solution for Atom Computing has been to combine the atomic control provided by optical tweezers with the trapping ability of optical lattices, which are most commonly found in the world's most precise atomic clocks. These optical lattices exploit the interference of laser beams to create a grid of potential wells on the subwavelength scale, and their performance can be further enhanced by adding an optical buildup cavity to generate constructive interference between many reflected laser beams. "With these in-vacuum optics, we can create a huge array of deep traps with only a moderate amount of laser power," says Bloom. "We chose to demonstrate an array that can trap 1,225 ytterbium atoms, but there's no reason why we couldn't go much higher."

Importantly, in a modification of the usual rearrangement approach, this design also allows the atomic array to be continuously reloaded while the processor is being operated. Atoms held in a magneto-optical trap are first loaded into a small reservoir array, from which they are transferred into the target array that will be used for computation. The atoms in both arrays are then moved into the deep trapping potential of the optical lattice, where rapid and low-loss fluorescence imaging determines which of the sites are occupied. Returning the atoms to the optical tweezers then allows empty sites within the target array to be filled from the reservoir, with multiple loading cycles yielding an occupancy of 99%.
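A minimal sketch of how repeated refill cycles drive the occupancy up (the per-cycle fill probability is an assumed illustrative value, not Atom Computing's figure):

```python
N_TARGET = 1225                 # size of the target array quoted above
P_FILL_PER_CYCLE = 0.7          # assumed probability that an empty site is refilled each cycle

occupancy = 0.5                 # assumed initial stochastic loading
for cycle in range(1, 6):
    occupancy += (1 - occupancy) * P_FILL_PER_CYCLE   # each cycle fills a fraction of the remaining holes
    print(f"cycle {cycle}: occupancy = {occupancy:.3f} "
          f"(~{int(round(occupancy * N_TARGET))} of {N_TARGET} sites)")
```

Within a handful of cycles the modeled occupancy passes 99%, in line with the multi-cycle loading described above.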


Repeatedly replenishing the reservoir with fresh atoms ensures that the target array is always full of qubits, which is essential to prevent atom loss during the execution of complex quantum algorithms. "Large-scale error-corrected computations will require quantum information to survive long past the lifetime of a single qubit," Bloom says. "It's all about keeping that calculation going when you have hundreds of thousands of qubits."

While many challenges remain, researchers working in the field believe the pace of progress in recent years is already propelling the technology toward the day when a neutral-atom quantum computer will be able to outperform a classical machine. "Neutral atoms allow us to reach large numbers of qubits, achieve incredibly long coherence times and access novel error-correction codes," says Bloom. "As an engineering firm, we are focused on improving the performance still further, since all that's really going to matter is whether you have enough logical qubits and sufficiently high gate fidelities to address problems that are interesting for real-world use cases."

Susan Curtis is a freelance science and technology writer based in Bristol, UK.


ISC 2024: A Few Quantum Gems and Slides from a Packed QC Agenda – HPCwire

If you were looking for quantum computing content, ISC 2024 was a good place to be last week; there were around 20 quantum computing-related sessions. QC even earned a slide in Kathy Yelick's opening keynote, Beyond Exascale. Many of the quantum sessions (and, of course, others) were video-recorded, and ISC has now made them freely accessible.

Not all were recorded. One example was what sounded like a tantalizing BOF panel, "Toward Hardware Agnostic Standards in Hybrid HPC/Quantum Computing," featuring Bill Gropp (NCSA, University of Illinois), Philippe Deniel (Commissariat à l'énergie atomique (CEA)), Mitsuhisa Sato (RIKEN), Travis Humble (ORNL), Venkatesh Kannan (Ireland's High Performance Centre), and Kristel Michielsen (Jülich Supercomputing Centre). Sorry to have missed that one.

Regardless, there's a wealth of material online, and it's worth looking through the ISC 2024 inventory for subjects, speakers, and companies of interest (registration may be required). Compiled below are a few QC soundbites from ISC.

Yelick, vice chancellor for research at the University of California, Berkeley, covered a lot of ground in her keynote, examining the tensions and opportunities emerging from the clash between traditional FP64 HPC and mixed-precision AI, and how the commercial supply line of advanced chips is changing. Quantum computing earned a much smaller slice.

"I really just have this one slide about quantum. There's been some really exciting progress, if you have been following this, in things like error correction over the last year, with really significant improvements in terms of the ability to build error-corrected quantum systems. On the other hand, I would say we don't yet have an integrated circuit kind of transistor model yet, right. We've got a bunch of transistors, [i.e.] we've got a whole bunch of different kinds of qubits that you can build, [and] there's still some debate [over them].

"In fact, one of the latest big error correction results was actually not for the superconducting qubits, which is what a lot of the early startups were in, but for the AMO (atomic, molecular, optical) physics. So this is really looking at the fact that we're not yet at a place where we can rely on this for the next generation of computing, which is not to say that we should be ignoring it. I'm really interested to see how [quantum computing evolves and] also thinking about how much classical computing we're going to need with quantum, because that's also going to be a big challenge with quantum. [It's] very exciting, but it's not replacing the general-purpose kind of computing that we do for science and engineering."

Not sure if that's a glass-half-full or half-empty perspective. Actually, many of the remaining sessions tackled the questions she posed, including the best way to implement hybrid HPC-quantum systems, error correction and error mitigation, and the jostling among competing qubit types.

It was easy to sympathize (sort of) with speakers presenting at the "Quantum Computing Status of Technologies" session, moderated by Valeria Bartsch of Fraunhofer CFL. The speakers came from companies developing different qubit modalities and, naturally, at least a small portion of their brief talks touted their own company's technology.

She asked, "Here's another [submitted question]. What is the most promising quantum computing technology that your company is not developing yourself? I love that one. And everybody has to answer it now. You can think for a few seconds."

Very broadly speaking, neutral-atom, trapped-ion, and superconducting qubits are perhaps the most advanced modalities currently, and each speaker presented a bit of background on their company's technology and progress. Trapped ions boast long coherence times but somewhat slower switching speeds. Superconducting qubits are fast, and perhaps easier to scale, but error prone. Neutral atoms also have long coherence times but have so far been used mostly for analog computing, though efforts are moving quickly to implement gate-based computing. To Hayes' point, Majorana (topological) qubits would be inherently resistant to error.

Not officially part of the ISC program, Hyperion delivered its mid-year HPC market update online just before the conference. The full HPCwire coverage is here, and Hyperion said it planned to make its recorded presentation and slides available on its website. Chief Quantum Analyst Bob Sorensen provided a brief QC snapshot during the update, predicting the worldwide QC market will surpass $1 billion in 2025.

Sorensen noted, "So this is a quick chart (above) that just shows the combination of the last four estimates that we made. You can see, starting in 2019, all the way up to this 2023 estimate that reaches that $1.5 billion in 2026 I talked about earlier. Now my concern here is always it's dangerous to project out too far. So we do tend to limit the forecast to these kinds of short ranges, simply because a nascent sector like quantum, which has so much potential but at the same time has some significant technical hurdles to overcome, means that there can be an inflection point, most likely in the upward direction."

He also pointed out that a new use case, a new breakthrough in modality or algorithms, or any kind of significant driver that brings more interest in and performance to quantum can significantly change the trajectory here on the upside.

Sorensen said, "Just to give you a sense of how these vendors that we spoke to looked at algorithms, we see the big three are still the big three in mod-sim, optimization, and AI, with some interest in cybersecurity aspects, post-quantum encryption kinds of research and such, as well as Monte Carlo processes taking advantage of quantum capabilities to generate provable random numbers to support the Monte Carlo processing.

"Interesting here is that we're seeing a lot more 'other' (17%). This is the first time we've seen that. We think it is [not so much] about new algorithms, but perhaps hybrid mod-sim, optimization or machine learning that feeds into the optimization process. So we think we're seeing more hybrid applications emerging as people take a look at the algorithms and decide what solves the use case that they have in hand," he said.

Satoshi Matsuoka, director of the RIKEN Center for Computational Science, provided a quick overview of Fugaku's plans for incorporating quantum computing, as well as touching on the status of the ABCI-Q project. He, of course, has been instrumental in both systems. Both efforts emphasize creating a hybrid HPC-AI-quantum infrastructure.

The ABCI-Q infrastructure (slide below) will comprise a variety of quantum-inspired and actual quantum hardware, with Fujitsu supplying the former systems. Currently, quantum computers based on neutral atoms, superconducting qubits, and photonics are planned. Matsuoka noted this is well funded, at a few hundred million dollars, with much of the work geared toward industry.

Rollout of the integrated quantum-HPC hybrid infrastructure at Fugaku is aimed at the 2024/25 timeframe. It's also an ambitious effort.

About the Fugaku effort, Matsuoka said, "[This] project is funded by a different ministry, in which we have several real quantum computers, IBM's Heron (superconducting QPU), a Quantinuum system (trapped-ion qubits), and quantum simulators. So real quantum computers and simulators to be coupled with Fugaku.

"The objective of the project [is to] come up with a comprehensive software stack, such that when the real quantum computers that are more useful come online, then we can move the entire infrastructure, along with any of those quantum computers and their successors, to be deployed to solve real problems. This will be one of the largest hybrid supercomputers."

The aggressive quantum-HPC integration sounds a lot like what's going on in Europe. (See HPCwire coverage, "Europe's Race towards Quantum-HPC Integration and Quantum Advantage.")

The topic of benchmarking also came up during Q&A at one session. A single metric such as the Top500 is generally not preferred. But what then, even now during the so-called NISQ (noisy intermediate-scale quantum) computing era?

One questioner said, "Let's say interesting algorithms and problems. Is there anything like, and I'm not talking about a Top500 list for quantum computers, like an algorithm where we can compare systems? For example, Shor's algorithm. So who did it and what is the best performance or the largest numbers you were able to factorize?"

Hayes (Quantinuum) said, "So we haven't attempted to run Shor's algorithm, and interesting implementations of Shor's algorithm are going to require fault tolerance to factor a number that a classical computer can't. But you know, that doesn't mean it can't be a nice benchmark to see which company can factor the largest one. I did show some data on the quantum Fourier transform. That's a primitive in Shor's algorithm. I would say that that'd be a great candidate for benchmarking the progress and fault tolerance.

"More interesting benchmarks for the NISQ era are things like quantum volume, and there's some other ones that can be standardized, and you can make fair comparisons. So we try to do that. You know, they're not widely or universally adopted, but there are organizations out there trying to standardize them. It's difficult getting everybody marching in the same direction."

Corcoles (IBM) added, "I think benchmarking in quantum has an entire community around it, and they have been working on it for more than a decade. I read your question as focusing on application-oriented benchmarks versus system-oriented benchmarks. There are layers of subtlety there as well. If we think about Shor's algorithm, for example, there were recent works last year suggesting there's more than one way to run Shor's. Depending on the architecture, you might choose one or another way.

"An architecture that is faster might choose to run many circuits in parallel that can capture Shor's algorithm and then do a couple of rounds of processing, while an architecture that might take more time may just want to run one single circuit that measures the right answer with high probability. You could compare run times, but there are probably going to be differences that add to the uncertainty of what technology you will use, meaning that there might be a regime of factoring where you might want to choose one aspect or another, depending on your particular physical implementation," he said.

Macri (QuEra) said, "My point is we're not yet at the point where we can really [compare systems]. You know, we don't want to compete directly with our technologies. I would say that, especially for what concerns applications, we need to adopt a collaborative approach. So for example, there are certain areas where these benchmarks that you mentioned are not really applicable. One of them is quantum simulation, and we have seen really a lot of fantastic results from our technology, as well as from ion traps and superconducting qubits.

"It doesn't really make sense to compare the basic features of the technologies so that we can, a priori, identify the specific application or result that you want to achieve. I would say let's focus on advancing the technology. We already know that there are certain types of devices that outperform others for specific applications, and we will decide these perhaps at a later stage. But I agree for very complex tasks, such as the quantum Fourier transform, or perhaps Shor's algorithm, though to be honest I think it's still too preliminary [for effective system comparisons]."

As noted, this was a break-out year for quantum at ISC, which has long had quantum sessions, just not as many. Europe's aggressive funding, procurements, and HPC-quantum integration efforts make it clear it does not intend to be left behind in the quantum computing land rush, with, hopefully, a gold rush to follow.

Stay tuned.


The risks of quantum computers for electronic identity documents and how to counter them – Identity Week

Quantum computers will be a game changer in many areas where complex calculations are required. However, they also entail a risk that should not be underestimated: current cryptographic algorithms, such as those used in electronic ID documents and smart cards, might be compromised in the future by quantum computers. Post-quantum cryptography is intended to mitigate this risk, but there is not much time left for the preparations.

In contrast to classical computers, quantum computers have the potential to perform complex calculations at unprecedented speeds. They use so-called qubits, which, unlike conventional bits, are not restricted to either 0 or 1 but can be in both states simultaneously. This allows quantum computers to perform several calculations in parallel, much faster, and thus solve problems that cannot be mastered with the computing power of today's systems. As a result, they enable significant advances in many fields of application, for example in searching through large databases, simulating chemical and physical reactions, and in materials design. On the other hand, they also enable the fast prime factorisation of long integers and thus have the disruptive potential to break various encryption algorithms currently in use. It is commonly assumed that quantum computer attacks on today's cryptography will become reality within the next 10 to 20 years.

This will certainly have a game-changing effect on the cryptographic security of identity documents like eID cards, especially as they often have a regular lifetime of 10 years and more. The established and widely used encryption algorithms such as RSA (Rivest-Shamir-Adleman) and ECC (Elliptic Curve Cryptography) deployed in those electronic ID documents and smart cards will be heavily affected by future universal quantum computers. Equally, quantum computers have the potential to disruptively threaten algorithms like ECDSA (Elliptic Curve Digital Signature Algorithm) and protocols like ECDH (Elliptic Curve Diffie-Hellman).

Post-quantum cryptography (PQC) aims to repel the cryptanalysis performed on both quantum and classical computers. PQC schemes are executed on conventional computers and security controllers and do not need a quantum computer to work. From the user's perspective, they behave similarly to currently available ciphers (e.g., RSA or ECC). PQC schemes rely on new and fundamentally different mathematical foundations. This leads to new challenges when implementing PQC on small chips with limited storage space.

Standardization and adoption are needed

In 2017, the US National Institute of Standards and Technology (NIST) started its post-quantum crypto project and asked for submissions of post-quantum key exchange, public-key encryption, and signature schemes to a competition-like standardisation effort. NIST plans to finalise the first standards for PQC algorithms in summer 2024.

Infineon experts have been working at the forefront of PQC algorithms for years. For example, Infineon contributed to two submissions to the NIST PQC standardisation process, the stateless hash-based signature scheme SPHINCS+ and the NewHope key-exchange protocol.

Besides standardisation, adoption across the infrastructure is required. Communication protocols need to be adapted and standardised. Documents and infrastructure, including the background systems, need to be upgraded.

The transition from today's conventional algorithms to PQC will be gradual. The speed of migration depends not only on the availability of quantum computers, but also on the extent to which security is critical for the applications in question, the lifetime of devices in the field, and many other factors. How can device vendors navigate all these uncertainties?

One promising path to success lies in crypto agility: devices should be able to evolve to support different crypto algorithms. Adaptability in this dynamic space hinges on the ability to add and exchange crypto algorithms and the corresponding protocols.
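In software terms, crypto agility often amounts to hiding the algorithm behind a registry or interface so it can be replaced or supplemented later. A minimal, hypothetical sketch follows; the names and structure are illustrative and do not represent any vendor's actual API.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SignatureScheme:
    """Bundle of callables implementing one signature algorithm (hypothetical interface)."""
    keygen: Callable[[], tuple]
    sign: Callable[[bytes, object], bytes]
    verify: Callable[[bytes, bytes, object], bool]

REGISTRY: Dict[str, SignatureScheme] = {}

def register(name: str, scheme: SignatureScheme) -> None:
    """Install or replace an algorithm without touching the application code."""
    REGISTRY[name] = scheme

def get_scheme(name: str) -> SignatureScheme:
    # The document's personalisation data would record which scheme was used,
    # so verification can select the matching implementation years later.
    return REGISTRY[name]

# A deployment might register a classical scheme today and add a PQC scheme
# (e.g., a hash-based or lattice-based signature) via a later firmware update:
# register("ecdsa-p256", ecdsa_scheme)
# register("sphincs+", sphincs_scheme)
```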

Infineon is involved in publicly funded projects and actively advises customers on secure migration to quantum-safe cryptography. In 2022, together with the German Federal Printing Office (Bundesdruckerei GmbH) and the Fraunhofer Institute for Applied and Integrated Security, Infineon demonstrated a quantum computer-resistant version of the Extended Access Control (EAC) protocol for an ePassport, with the objective of showcasing the feasibility of a quantum-secured ePassport. At the core of the demonstrator is a security controller from Infineon, which protects the data against both conventional and quantum computer attacks.

Early preparation is key

Although the first standardised algorithms are expected in 2024, the rapid development of quantum computing signals the importance of early preparation. Knowledge and expertise will be essential to put appropriate and commercially feasible solutions in place in a timely manner. A good way to familiarise yourself with PQC is to work on demonstrators and prepare for a timely start with first, albeit limited, field trials. The first pilot projects for national eID cards are expected to start shortly after 2025. The first wide-scale rollouts of quantum-safe documents are expected to start before the end of this decade.

Governments and other ID document-issuing organisations should prepare so that they do not risk exposure to the threat of quantum computing. This starts with learning about PQC and developing strategic plans and migration strategies. They need to think about infrastructure, document upgrades, the impact of PQC on their software and hardware (key sizes, required memory) and so on. And all of this should be done as early as possible to overcome all challenges in good time, because moving to PQC affects the whole lifecycle of a document, from industrialisation, personalisation and issuance to operational usage and field updates.


NIST quantum-resistant algorithms to be published within weeks, top White House advisor says – The Record from Recorded Future News

The U.S. National Institute of Standards and Technology (NIST) will release four post-quantum cryptographic algorithms in the next few weeks, a senior White House official said on Monday.

Anne Neuberger, the White House's top cyber advisor, told an audience at the Royal United Services Institute (RUSI) in London that the release of the algorithms was a momentous moment, as they marked a major step in the transition to the next generation of cryptography.

The transition is being made in apprehension of what is called a cryptographically relevant quantum computer (CRQC), a device theoretically capable of breaking the encryption that's at the root of protecting both corporate and national security secrets, said Neuberger. NIST made a preliminary announcement of the algorithms in 2022.

Conrad Prince, a former official at GCHQ and now a distinguished fellow at RUSI, told Neuberger that during his previous career there had consistently been a concern about hostile states having the capability to decrypt the plaintext of secure messages, although this capability was consistently estimated at being roughly a decade away and had been for the last 20 years.

Neuberger said the U.S. intelligence community's estimate is similar: the early 2030s for when a CRQC would be operational. But the timeframe is relevant, said the White House advisor, because there is national security data that is collected today and, even if decrypted eight years from now, can still be damaging.

Britain's National Cyber Security Centre (NCSC) has warned that contemporary threat actors could be collecting and storing intelligence data today for decryption at some point in the future.

"Given the cost of storing vast amounts of old data for decades, such an attack is only likely to be worthwhile for very high-value information," stated the NCSC. "As such, the possibility of a CRQC existing at some point in the next decade is a very relevant threat right now."

Neuberger added: "Certainly there's some data that's time sensitive, you know, a ship that looks to be transporting weapons to a sanctioned country, probably in eight years we don't care about that anymore."

Publishing the new NIST algorithms is a protection against adversaries collecting the most sensitive kinds of data today, Neuberger added.

A spokesperson for NIST told Recorded Future News: "The plan is to release the algorithms this summer. We don't have anything more specific to offer at this time."

But publishing the algorithms is not the last step in moving to a quantum-resistant computing world. The NCSC has warned it is actually just the second step in what will be a very complicated undertaking.

Even if any one of the algorithms proposed by NIST achieves universal acceptance as something that is unbreakable by a quantum computer, it would not be a simple matter of just swapping those algorithms in for the old-fashioned ones.

Part of the challenge is that most systems that currently depend on public-key cryptography for their security are not necessarily capable of running the resource-heavy software used in post-quantum cryptography.

Ultimately, the security of public-key cryptographic systems relies on the mathematical difficulty of factoring very large numbers into their prime factors, something that traditional computers find exhaustingly difficult.

However, research by American mathematician Peter Shor, published in 1994, proposed an algorithm that could be run on a quantum computer to find these prime factors with far more ease, potentially undermining some of the key assumptions about what makes public-key cryptography secure.
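The number-theoretic core that Shor's algorithm accelerates can be shown classically with toy numbers (illustrative only: the quantum speed-up lies in finding the order r, which is intractable classically for real key sizes).

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force; only feasible for toy n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

N, a = 15, 7                      # toy modulus; real RSA moduli are 2048+ bits
r = order(a, N)                   # r = 4
p = gcd(pow(a, r // 2) - 1, N)    # gcd(48, 15) = 3
q = gcd(pow(a, r // 2) + 1, N)    # gcd(50, 15) = 5
print(f"order of {a} mod {N} is {r}; factors recovered: {p} x {q} = {p * q}")
```

Once the order r is known, the factors fall out from a pair of greatest-common-divisor computations; Shor's contribution was a quantum routine that finds r efficiently for numbers far too large for this brute-force search.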

The good news, according to the NCSC, is that while advances in quantum computing are continuing to be made, "the machines that exist today are still limited, and suffer from relatively high error rates in each operation they perform," the agency stated.

However, the NCSC warned that in the future "it is possible that error rates can be lowered such that a large, general-purpose quantum computer could exist, but it is impossible to predict when this may happen."


Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.
