Archive for the ‘Quantum Computer’ Category

ISC 2024 A Few Quantum Gems and Slides from a Packed QC Agenda – HPCwire

If you were looking for quantum computing content, ISC 2024 was a good place to be last week; there were around 20 quantum computing-related sessions. QC even earned a slide in Kathy Yelick's opening keynote, Beyond Exascale. Many of the quantum sessions (and, of course, others) were video-recorded, and ISC has now made them freely accessible.

Not all were recorded. For example, what sounded like a tantalizing BOF panel, Toward Hardware-Agnostic Standards in Hybrid HPC/Quantum Computing, featured Bill Gropp (NCSA, University of Illinois), Philippe Deniel (Commissariat à l'Energie Atomique (CEA)), Mitsuhisa Sato (RIKEN), Travis Humble (ORNL), Venkatesh Kannan (Ireland's High Performance Centre), and Kristel Michielsen (Jülich Supercomputing Center). I was sorry to miss that.

Regardless, there's a wealth of material online, and it's worth looking through the ISC 2024 inventory for subjects, speakers, and companies of interest (registration may be required). Compiled below are a few QC soundbites from ISC.

Yelick, vice chancellor for research at the University of California, Berkeley, covered a lot of ground in her keynote, examining the tension and opportunities emerging from the clash of traditional FP64 HPC and mixed-precision AI, and how the commercial supply line of advanced chips is changing. Quantum computing earned a much smaller slice.

"I really just have this one slide about quantum. There's been some really exciting progress, if you have been following this, in things like error correction over the last year, with really significant improvements in terms of the ability to build error-corrected quantum systems. On the other hand, I would say we don't yet have an integrated circuit kind of transistor model yet, right. We've got a bunch of transistors, [i.e.] we've got a whole bunch of different kinds of qubits that you can build, [and] there's still some debate [over them].

"In fact, one of the latest big error correction results was actually not for the superconducting qubits, which is what a lot of the early startups were in, but for AMO (atomic, molecular, optical) physics. So this is really looking at the fact that we're not yet at a place where we can rely on this for the next generation of computing, which is not to say that we should be ignoring it. I'm really interested to see how [quantum computing evolves and] also thinking about how much classical computing we're going to need with quantum, because that's also going to be a big challenge with quantum. [It's] very exciting, but it's not replacing the general-purpose kind of computing that we do for science and engineering."

Not sure if that's a glass half-full or half-empty perspective. Actually, many of the remaining sessions tackled the questions she posed, including the best way to implement hybrid HPC-quantum systems, error correction and error mitigation, and the jostling among competing qubit types.

It was easy to sympathize (sort of) with speakers presenting at the Quantum Computing Status of Technologies session, moderated by Valeria Bartsch of Fraunhofer CFL. The speakers came from companies developing different qubit modalities and, naturally, at least a small portion of their brief talks touted their company's technology.

She asked, "Here's another [submitted question]: What is the most promising quantum computing technology that your company is not developing yourself? I love that one. And everybody has to answer it now. You can think for a few seconds."

Very broadly speaking, neutral atom, trapped ion, and superconducting are perhaps the most advanced qubit modalities currently, and each speaker presented a bit of background on their company's technology and progress. Trapped ions boast long coherence times but somewhat slower switching speeds. Superconducting qubits are fast, and perhaps easier to scale, but error-prone. Neutral atoms also have long coherence times but have so far been mostly used for analog computing, though efforts are moving quickly to implement gate-based computing. To Hayes' point, Majorana (topological) qubits would be inherently resistant to error.

Not officially part of the ISC program, Hyperion delivered its mid-year HPC market update online just before the conference. The full HPCwire coverage is here, and Hyperion said it planned to make its recorded presentation and slides available on its website. Chief Quantum Analyst Bob Sorensen provided a brief QC snapshot during the update, predicting the worldwide QC market will surpass $1 billion in 2025.

Sorensen noted, "So this is a quick chart (above) that just shows the combination of the last four estimates that we made. You can see, starting in 2019, all the way up to this 2023 estimate that reaches that $1.5 billion in 2026 I talked about earlier. Now my concern here is always that it's dangerous to project out too far. So we do tend to limit the forecast to these kinds of short ranges, simply because a nascent sector like quantum, which has so much potential but at the same time has some significant technical hurdles to overcome, means that there can be an inflection point, most likely though in the upward direction."

He also pointed out that "a new use case, a new breakthrough in modality or algorithms, any kind of significant driver that brings more interest in, and performance to, quantum can significantly change the trajectory here on the upside."

Sorensen said, "Just to give you a sense of how these vendors that we spoke to looked at algorithms, we see the big three are still the big three in mod-sim, optimization, and AI, with some interest in cybersecurity aspects, post-quantum encryption kinds of research and such, as well as Monte Carlo processes taking advantage of quantum's ability to generate provable random numbers to support the Monte Carlo processing.

"Interesting here is that we're seeing a lot more 'other' (17%). This is the first time we've seen that. We think it is [not so much] about new algorithms, but perhaps hybrid mod-sim, optimization, or machine learning that feeds into the optimization process. So we think we're seeing more hybrid applications emerging as people take a look at the algorithms and decide what solves the use case that they have in hand," he said.

Satoshi Matsuoka, director of the RIKEN Center for Computational Science, provided a quick overview of Fugaku's plans for incorporating quantum computing, as well as touching on the status of the ABCI-Q project. He, of course, has been instrumental with both systems. Both efforts emphasize creating a hybrid HPC-AI-Quantum infrastructure.

The ABCI-Q infrastructure (slide below) will comprise a variety of quantum-inspired and actual quantum hardware. Fujitsu will supply the former systems. Currently, quantum computers based on neutral atoms, superconducting qubits, and photonics are planned. Matsuoka noted this is well-funded, a few hundred million dollars, with much of the work geared toward industry.

Rollout of the integrated quantum-HPC hybrid infrastructure at Fugaku is aimed at the 2024/25 timeframe. It's also an ambitious effort.

About the Fugaku effort, Matsuoka said, "[This] project is funded by a different ministry, in which we have several real quantum computers, IBM's Heron (superconducting QPU), a Quantinuum system (trapped-ion qubits), and quantum simulators. So real quantum computers and simulators to be coupled with Fugaku.

"The objective of the project [is to] come up with a comprehensive software stack, such that when the real quantum computers that are more useful come online, then we can move the entire infrastructure along with any of those quantum computers, along with their successors, to be deployed to solve real problems. This will be one of the largest hybrid supercomputers."

The aggressive quantum-HPC integration sounds a lot like what's going on in Europe. (See HPCwire coverage, "Europe's Race towards Quantum-HPC Integration and Quantum Advantage.")

The topic of benchmarking also came up during Q&A at one session. A single metric such as the Top500 is generally not preferred. But what then, even now during the so-called NISQ (noisy intermediate-scale quantum) computing era?

One questioner said, "Let's say interesting algorithms and problems. Is there anything, and I'm not talking about a Top500 list for quantum computers, like an algorithm where we can compare systems? For example, Shor's algorithm. So who did it, and what is the best performance, or the largest numbers you were able to factorize?"

Hayes (Quantinuum) said, "So we haven't attempted to run Shor's algorithm, and interesting implementations of Shor's algorithm are going to require fault tolerance to factor a number that a classical computer can't. But you know, that doesn't mean it can't be a nice benchmark to see which company can factor the largest one. I did show some data on the quantum Fourier transform. That's a primitive in Shor's algorithm. I would say that that'd be a great candidate for benchmarking the progress in fault tolerance.

"More interesting benchmarks for the NISQ era are things like quantum volume, and there are some other ones that can be standardized, and you can make fair comparisons. So we try to do that. You know, they're not widely or universally adopted, but there are organizations out there trying to standardize them. It's difficult getting everybody marching in the same direction."

Corcoles (IBM) added, "I think benchmarking in quantum has an entire community around it, and they have been working on it for more than a decade. I read your question as focusing on application-oriented benchmarks versus system-oriented benchmarks. There are layers of subtlety there as well. If we think about Shor's algorithm, for example, there were recent works last year suggesting there's more than one way to run Shor's. Depending on the architecture, you might choose one or another way.

"An architecture that is faster might choose to run many circuits in parallel that can capture Shor's algorithm and then do [additional] processing, or an architecture that might take more time might just want to run one single circuit and, with high probability, measure the right [answer]. You could compare run times, but there's probably going to be differences that add to the uncertainty of what technology you will use, meaning that there might be a regime of factoring where you might want to choose one aspect or another, but then [it depends on] your particular physical implementation," he said.

Macri (QuEra) said, "My point is we're not yet at the point where we can really [compare systems]. You know, we don't want to compete directly with our technologies. I would say that, especially for what concerns applications, we need to adopt a collaborative approach. So, for example, there are certain areas where these benchmarks that you mentioned are not really applicable. One of them is quantum simulation, and we have seen really a lot of fantastic results from our technology, as well as from ion traps and superconducting qubits.

"It doesn't really make sense to compare the basic features of the technologies so that, you know, we can a priori identify the specific application [or] result that you want to achieve. I would say let's focus on advancing the technology. We already know that there are certain types of devices that outperform others for specific applications, and then we will decide these perhaps at a later stage. But I agree for very complex tasks, such as the quantum Fourier transform, or perhaps Shor's algorithm, but I think, to be honest, it's still too preliminary [for effective system comparisons]."
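For reference, the quantum Fourier transform that Hayes and Macri both mention is, mathematically, just the N x N unitary with entries F[j,k] = omega^(jk)/sqrt(N), where N = 2^n and omega = exp(2*pi*i/N). Below is a minimal classical sketch in plain NumPy, not drawn from any vendor's toolchain and only practical for small n (the matrix is exponentially large), that builds this unitary for a three-qubit register and checks it.

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Matrix of the quantum Fourier transform on n_qubits qubits:
    F[j, k] = omega**(j*k) / sqrt(N), with N = 2**n_qubits and
    omega = exp(2*pi*i / N)."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)                                 # 8 x 8 unitary
print(np.allclose(F.conj().T @ F, np.eye(8)))     # True: F is unitary
```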

As noted, this was a break-out year for quantum at ISC, which has long had quantum sessions, just not as many. Europe's aggressive funding, procurements, and HPC-quantum integration efforts make it clear it does not intend to be left behind in the quantum computing land rush, with, hopefully, a gold rush to follow.

Stay tuned.

See the original post:
ISC 2024 A Few Quantum Gems and Slides from a Packed QC Agenda - HPCwire

The risks of quantum computers for electronic identity documents and how to counter them – Identity Week

Quantum computers will be a game changer in many areas where complex calculations are required. However, they also entail a risk that should not be underestimated: current cryptography algorithms, such as those used in electronic ID documents and smart cards, might be compromised in the future by quantum computers. Post-quantum cryptography is intended to mitigate this risk. But there is not much time left for the preparations.

In contrast to classical computers, quantum computers have the potential to perform complex calculations at unprecedented speeds. They use so-called qubits, which, unlike conventional bits, are not either 0 or 1 but can be in both states simultaneously. This allows quantum computers to perform several calculations in parallel, much faster, and thus solve problems that cannot be mastered with the computing power of today's systems. As a result, they enable significant advances in many fields of application, for example in searching large databases, simulating chemical and physical reactions, and in materials design. On the other hand, they also enable the fast prime factorisation of long integers and thereby have the disruptive potential to break various encryption algorithms currently in use. It is commonly assumed that quantum computer attacks on today's cryptography will become reality within the next 10 to 20 years.
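To make the "both states simultaneously" idea a little more concrete, here is a tiny classical simulation in Python/NumPy (purely illustrative; the qubit count and gate choice are arbitrary examples, not taken from the article): applying a Hadamard gate to each of n qubits puts the register into an equal superposition over all 2^n basis states, which is also why tracking a quantum state classically quickly becomes intractable.

```python
import numpy as np

n = 3                                    # a small, arbitrary number of qubits
state = np.zeros(2 ** n)                 # classical array of 2**n amplitudes
state[0] = 1.0                           # start in the basis state |00...0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard gate

# Build the Kronecker product H x H x ... x H and apply it:
# every qubit ends up in superposition.
op = np.array([[1.0]])
for _ in range(n):
    op = np.kron(op, H)
state = op @ state

print(state)        # 2**n equal amplitudes, each 1/sqrt(2**n)
print(state.size)   # 8 here; the count doubles with every extra qubit
```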

This will certainly have a game-changing effect on the cryptographic security of identity documents like eID cards, especially as they often have a regular lifetime of 10 years and more. The established and widely used encryption algorithms such as RSA (Rivest-Shamir-Adleman) and ECC (Elliptic Curve Cryptography) deployed in those electronic ID documents and smart cards will be heavily affected by future universal quantum computers. Equally, quantum computers have the potential to disruptively threaten algorithms like ECDSA (Elliptic Curve Digital Signature Algorithm) and protocols like ECDH (Elliptic Curve Diffie-Hellman).

Post-quantum cryptography (PQC) aims to repel cryptanalysis performed on both quantum and classical computers. PQC schemes are executed on conventional computers and security controllers and do not need a quantum computer to work. From the user's perspective, they behave similarly to currently available ciphers (e.g., RSA or ECC). PQC schemes rely on new and fundamentally different mathematical foundations. This leads to new challenges when implementing PQC on small chips with limited storage space.
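As a rough sketch of what "behaving like today's ciphers" can look like in practice, the snippet below performs a post-quantum key encapsulation using the open-source liboqs-python bindings. Note the assumptions, which are mine and not from the article: the `oqs` package must be installed, and the algorithm identifier ("Kyber512" here) depends on the installed library version.

```python
# Hedged sketch: assumes the liboqs-python bindings (imported as `oqs`) are
# installed and that this build exposes the "Kyber512" KEM; adjust the
# algorithm name to whatever your installed version supports.
import oqs

ALG = "Kyber512"

with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()            # receiver publishes this
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, secret_at_sender = sender.encap_secret(public_key)
    secret_at_receiver = receiver.decap_secret(ciphertext)

assert secret_at_sender == secret_at_receiver            # both sides share a key
```

From the application's point of view this looks much like an ECDH key agreement: generate a key pair, exchange a message, derive a shared secret.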

Standardisation and adoption are needed

In 2017, the US National Institute of Standards and Technology (NIST) started its post-quantum crypto project and asked for submissions of post-quantum key exchange, public-key encryption, and signature schemes to a competition-like standardisation effort. NIST plans to finalise the first standards for PQC algorithms in summer 2024.

Infineon experts have been working at the forefront of PQC algorithms for years. For example, Infineon contributed to two submissions to the NIST PQC standardisation process, the stateless hash-based signature scheme SPHINCS+ and the NewHope key-exchange protocol.
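SPHINCS+ is a hash-based scheme: its security rests only on the properties of a cryptographic hash function rather than on factoring or discrete logarithms. The sketch below is not SPHINCS+ itself (which adds Merkle trees, few-time signatures, and much more); it is a bare-bones Lamport one-time signature in standard-library Python, included only to illustrate the hash-based principle. The key and signature sizes (roughly 8-16 KB here) also hint at why fitting such schemes onto small chips is a challenge.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # One pair of random 32-byte secrets per bit of the message digest.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]     # public key = hashes of secrets
    return sk, pk

def digest_bits(message: bytes):
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, message: bytes):
    # Reveal exactly one secret from each pair, selected by the digest bits.
    return [sk[i][b] for i, b in enumerate(digest_bits(message))]

def verify(pk, message: bytes, signature) -> bool:
    return all(H(s) == pk[i][b]
               for i, (s, b) in enumerate(zip(signature, digest_bits(message))))

sk, pk = keygen()
sig = sign(sk, b"quantum-safe eID demo")           # keys/signature are kilobytes
assert verify(pk, b"quantum-safe eID demo", sig)   # each key pair must be used once
```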

Besides standardisation, adoption across the infrastructure is required. Communication protocols need to be adapted and standardised. Documents and infrastructure, including the background systems, need to be upgraded.

The transition from today's conventional algorithms to PQC will be gradual. The speed of migration depends not only on the availability of quantum computers, but also on the extent to which security is critical for the applications in question, the lifetime of devices in the field, and many other factors. How can device vendors navigate all these uncertainties?

One promising path to success lies in crypto agility: devices should be able to evolve to support different crypto algorithms. Adaptability in this dynamic space hinges on the ability to add and exchange crypto algorithms and the corresponding protocols.
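A minimal sketch of what crypto agility can look like in software, assuming nothing about Infineon's actual products: the application codes against an abstract signer interface, and concrete algorithms are registered by name, so a classical backend can later be swapped for a PQC one through configuration rather than code changes. The class and registry names below are purely illustrative.

```python
import hashlib
import hmac
from abc import ABC, abstractmethod

class Signer(ABC):
    """Abstract interface the application depends on; algorithms plug in below."""
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...
    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...

REGISTRY: dict = {}            # algorithm name -> Signer implementation

def register(name: str):
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@register("hmac-sha256-demo")            # stand-in "classical" backend for the demo
class HmacSigner(Signer):
    def __init__(self, key: bytes = b"demo-key"):
        self.key = key
    def sign(self, message: bytes) -> bytes:
        return hmac.new(self.key, message, hashlib.sha256).digest()
    def verify(self, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), signature)

# A future PQC backend would simply be added with, e.g., @register("ml-dsa")
# and selected via configuration; the calling code never changes.
signer = REGISTRY["hmac-sha256-demo"]()
sig = signer.sign(b"document data")
assert signer.verify(b"document data", sig)
```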

Infineon is involved in publicly funded projects and actively advises customers on secure migration to quantum-safe cryptography. In 2022, together with the German Federal Printing Office (Bundesdruckerei GmbH) and the Fraunhofer Institute for Applied and Integrated Security, Infineon demonstrated a quantum computer-resistant version of the Extended Access Control (EAC) protocol for an ePassport with the objective to showcase the feasibility of a quantum-secured ePassport. At the core of the demonstrator is a security controller from Infineon, which protects the data against both conventional and quantum computer attacks.

Early preparation is key

Although the first standardised algorithms are expected in 2024, the rapid development of quantum computing signals the importance of early preparation. Knowledge and expertise will be essential to put appropriate and commercially feasible solutions in place in a timely manner. A good way to familiarise yourself with PQC is to work on demonstrators and prepare a timely start with first, albeit limited, field trials. First pilot projects for national eID cards are expected to start shortly after 2025. First wide-scale rollouts of quantum-safe documents are expected to start before the end of this decade.

Governments and other ID document-issuing organisations should prepare so that they do not risk exposure to the threat of quantum computing. This starts with learning about PQC and developing strategic plans and migration strategies. They need to think about infrastructure, document upgrades, the impact of PQC on their software and hardware (key sizes, required memory), and so on. And all of this should be done as early as possible to overcome all challenges in good time, because moving to PQC affects the whole lifecycle of a document, from industrialisation, personalisation and issuance to operational usage and field updates.

Link:
The risks of quantum computers for electronic identity documents and how to counter them - Identity Week

NIST quantum-resistant algorithms to be published within weeks, top White House advisor says – The Record from Recorded Future News

The U.S. National Institute of Standards and Technology (NIST) will release four post-quantum cryptographic algorithms in the next few weeks, a senior White House official said on Monday.

Anne Neuberger, the White House's top cyber advisor, told an audience at the Royal United Services Institute (RUSI) in London that the release of the algorithms was a "momentous moment," as they marked a major step in the transition to the next generation of cryptography.

The transition is being made in apprehension of what is called a cryptographically relevant quantum computer (CRQC), a device theoretically capable of breaking the encryption that's at the root of protecting both corporate and national security secrets, said Neuberger. NIST made a preliminary announcement of the algorithms in 2022.

Conrad Prince, a former official at GCHQ and now a distinguished fellow at RUSI, told Neuberger that during his previous career there had consistently been a concern about hostile states having the capability to decrypt the plaintext of secure messages, although this capability was consistently estimated at being roughly a decade away and had been for the last 20 years.

Neuberger said the U.S. intelligence community's estimate is similar, the early 2030s, for when a CRQC would be operational. But the timeframe is relevant, said the White House advisor, because there is national security data that is collected today and, even if decrypted eight years from now, can still be damaging.

Britain's NCSC has warned that contemporary threat actors could be collecting and storing intelligence data today for decryption at some point in the future.

"Given the cost of storing vast amounts of old data for decades, such an attack is only likely to be worthwhile for very high-value information," stated the NCSC. "As such, the possibility of a CRQC existing at some point in the next decade is a very relevant threat right now."

Neuberger added: "Certainly there's some data that's time sensitive. You know, a ship that looks to be transporting weapons to a sanctioned country, probably in eight years we don't care about that anymore."

Publishing the new NIST algorithms is a protection against adversaries collecting the most sensitive kinds of data today, Neuberger added.

A spokesperson for NIST told Recorded Future News: "The plan is to release the algorithms this summer. We don't have anything more specific to offer at this time."

But publishing the algorithms is not the last step in moving to a quantum-resistant computing world. The NCSC has warned it is actually just the second step in what will be a very complicated undertaking.

Even if any one of the algorithms proposed by NIST achieves universal acceptance as something that is unbreakable by a quantum computer, it would not be a simple matter of just swapping those algorithms in for the old-fashioned ones.

Part of the challenge is that most systems that currently depend on public-key cryptography for their security are not necessarily capable of running the resource-heavy software used in post-quantum cryptography.

Ultimately, the security of public-key cryptographic systems relies on the mathematical difficulty of factoring numbers that are the product of very large primes, something that traditional computers find prohibitively difficult.

However, research by American mathematician Peter Shor, published in 1994, proposed an algorithm that could be run on a quantum computer to find these prime factors with far more ease, potentially undermining some of the key assumptions about what makes public-key cryptography secure.
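For readers who want to see where the quantum speedup actually enters, Shor's method reduces factoring N to finding the multiplicative order r of a random base a modulo N; once r is known (and is even, with a^(r/2) not congruent to -1 mod N), gcd(a^(r/2) ± 1, N) yields a nontrivial factor. The sketch below, in plain Python and only meant for toy-sized numbers, implements that classical wrapper with the order found by brute force; a quantum computer replaces only the order-finding step.

```python
from math import gcd
import random

def multiplicative_order(a: int, N: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod N), found by brute force here.
    This is the step a quantum computer performs efficiently."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_order_finding(N: int):
    """Classical post-processing used in Shor's algorithm (toy sizes only)."""
    while True:
        a = random.randrange(2, N)
        d = gcd(a, N)
        if d > 1:                       # lucky: a already shares a factor with N
            return d, N // d
        r = multiplicative_order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p

print(factor_via_order_finding(15))     # e.g. (3, 5)
```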

The good news, according to the NCSC, is that "while advances in quantum computing are continuing to be made, the machines that exist today are still limited, and suffer from relatively high error rates in each operation they perform."

But the NCSC warned that in the future "it is possible that error rates can be lowered such that a large, general-purpose quantum computer could exist, but it is impossible to predict when this may happen."

Read the original here:
NIST quantum-resistant algorithms to be published within weeks, top White House advisor says - The Record from Recorded Future News

How Nvidia co-founder plans to turn Hudson Valley into a tech powerhouse greater than Silicon Valley – New York Post

A co-founder of chip maker Nvidia is bankrolling a futuristic quantum computer system at Rensselaer Polytechnic Institute and wants to turn New York's Hudson Valley into a tech powerhouse.

Curtis Priem, 64, donated more than $75 million so that the Albany-area college could obtain the IBM-made computer, the first such device on a university campus anywhere in the world, the Wall Street Journal reported.

The former tech executive and RPI alum said his goal is to turn the area around the school, based in Troy, into a hub of talent and business as quantum computing becomes more mainstream in the years ahead.

"We've renamed Hudson Valley as Quantum Valley," Priem told the Journal. "It's up to New York whether they want to become Silicon State, not just a valley."

The burgeoning technology uses subatomic quantum bits, or qubits, to process data much faster than conventional binary computers. The devices are expected to play a key role in the development of advanced AI systems.

Priem will reportedly fund the whopping $15 million per year required to rent the computer, which is kept in a building that used to be a chapel on RPI's campus.

RPI President Martin Schmidt told the newspaper that the school will begin integrating the device into its curriculum and ensure it is accessible to the student body.

Representatives for IBM and RPI did not immediately return The Post's request for comment.

An electrical engineer by trade, Priem co-founded Nvidia alongside its current CEO Jensen Huang and Chris Malachowsky in 1993. He served as the company's chief technology officer until retiring in 2003.

Priem sold most of his stock in retirement and used the money to start a charitable foundation.

He serves as vice chair of the board at RPI and has reportedly donated hundreds of millions of dollars to the university.

Nvidia has surged in value as various tech firms rely on its computer chips to fuel the race to develop artificial intelligence.

The company's stock has surged 95% to nearly $942 per share since January alone. Nvidia's market cap exceeds $2.3 trillion, making it the world's third-most valuable company behind Microsoft and Apple.

In November 2023, Forbes estimated that Priem would be one of the world's richest people, with a personal fortune of $70 billion, if he hadn't sold off most of his Nvidia shares.

Go here to read the rest:
How Nvidia co-founder plans to turn Hudson Valley into a tech powerhouse greater than Silicon Valley - New York Post

Aramco signs agreement with Pasqal to deploy first quantum computer in the Kingdom of Saudi Arabia – Aramco

Aramco, one of the world's leading integrated energy and chemicals companies, has signed an agreement with Pasqal, a global leader in neutral atom quantum computing, to install the first quantum computer in the Kingdom of Saudi Arabia.

The agreement will see Pasqal install, maintain, and operate a 200-qubit quantum computer, which is scheduled for deployment in the second half of 2025.

Ahmad Al-Khowaiter, Aramco EVP of Technology & Innovation, said: "Aramco is delighted to partner with Pasqal to bring cutting-edge, high-performance quantum computing capabilities to the Kingdom. In a rapidly evolving digital landscape, we believe it is crucial to seize opportunities presented by new, impactful technologies, and we aim to pioneer the use of quantum computing in the energy sector. Our agreement with Pasqal allows us to harness the expertise of a leading player in this field, as we continue to build state-of-the-art solutions into our business. It is also further evidence of our contribution to the growth of the digital economy in Saudi Arabia."

Georges-Olivier Reymond, Pasqal CEO & Co-founder, said: "The era of quantum computing is here. No longer confined to theory, it's transitioning to real-world applications, empowering organisations to solve previously intractable problems at scale. Since launching Pasqal in 2019, we have directed our efforts towards concrete quantum computing algorithms immediately applicable to customer use cases. Through this agreement, we'll be at the forefront of accelerating commercial adoption of this transformative technology in Saudi Arabia. This isn't just any quantum computer; it will be the most powerful tool deployed for industrial usages, unlocking a new era of innovation for businesses and society."

The quantum computer will initially use an approach called analog mode. Within the following year, the system will be upgraded to a more advanced hybrid analog-digital mode, which is more powerful and able to solve even more complex problems.

Pasqal and Aramco intend to leverage the quantum computer to identify new use cases, and have an ambitious vision to establish a powerhouse for quantum research within Saudi Arabia. This would involve leading academic institutions, with the aim of fostering breakthroughs in quantum algorithm development, a crucial step for unlocking the true potential of quantum computing.

The agreement also accelerates Pasqal's activity in Saudi Arabia, where the company established an office in 2023, and follows the signing of a Memorandum of Understanding between the companies in 2022 to collaborate on quantum computing capabilities and applications in the energy sector. In 2023, Aramco's Wa'ed Ventures also participated in Pasqal's Series B fundraising round.

Link:
Aramco signs agreement with Pasqal to deploy first quantum computer in the Kingdom of Saudi Arabia - Aramco