Archive for the ‘Quantum Computer’ Category

Post-Quantum Cryptography Standards Officially Announced by NIST – a History and Explanation – SecurityWeek


Read the rest here:
Post-Quantum Cryptography Standards Officially Announced by NIST a History and Explanation - SecurityWeek


Study unveils limits on the extent to which quantum errors can be ‘undone’ in large systems – Phys.org


Read the original post:
Study unveils limits on the extent to which quantum errors can be 'undone' in large systems - Phys.org


IQM Quantum Computers Achieves Technological Milestones With 99.9% 2-Qubit Gate Fidelity And 1 Millisecond Coherence Time – The Quantum Insider

Insider Brief

IQM Quantum Computers, a global leader in building quantum computers, has reached significant milestones in superconducting quantum computing, demonstrating improvements in two key metrics characterising the quality of a quantum computer.

A record-low error rate for two-qubit operations was achieved by demonstrating a CZ gate between two qubits with (99.91 ± 0.02)% fidelity, validated by interleaved randomised benchmarking. High two-qubit gate fidelity is among the most fundamental and hardest-to-achieve characteristics of a quantum processor, essential for generating entangled states between qubits and executing quantum algorithms.
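Gate errors compound multiplicatively over a circuit, which is why small fidelity gains matter so much. A minimal sketch in Python (assuming independent gate errors and no error correction, a deliberate simplification) shows roughly what a 99.91% two-qubit fidelity buys:

```python
import math

def circuit_success_prob(gate_fidelity: float, n_gates: int) -> float:
    """Probability an n-gate circuit runs error-free, assuming
    independent errors per gate and no error correction."""
    return gate_fidelity ** n_gates

def gates_until(gate_fidelity: float, threshold: float = 0.5) -> int:
    """Rough gate count before success probability falls below threshold."""
    return int(math.log(threshold) / math.log(gate_fidelity))

f = 0.9991  # IQM's reported two-qubit CZ fidelity
print(circuit_success_prob(f, 100))  # ~0.914 after 100 gates
print(gates_until(f))                # ~770 gates before 50% success
```

Under this toy model, the jump from 99.9% to 99.91% fidelity extends the usable circuit depth by about 10 percent; real devices also suffer correlated errors, so these numbers are optimistic upper bounds.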

Furthermore, a qubit relaxation time T1 of 0.964 ± 0.092 milliseconds and a dephasing time T2 echo of 1.155 ± 0.188 milliseconds were demonstrated on a planar transmon qubit on a silicon chip fabricated in IQM's own fabrication facilities. The coherence times, characterised by the relaxation time T1 and the dephasing time T2 echo, are among the key metrics for assessing the performance of a single qubit, as they indicate how long quantum information can be stored in a physical qubit.
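To make the T1 figure concrete: under the standard exponential-relaxation model, the surviving excited-state population after an idle time t is exp(-t / T1). A short sketch (the exponential model is a textbook simplification):

```python
import math

def excited_population(t_us: float, t1_us: float) -> float:
    """Fraction of excited-state population surviving after t microseconds,
    under simple exponential relaxation: p(t) = exp(-t / T1)."""
    return math.exp(-t_us / t1_us)

T1 = 964.0  # IQM's reported T1, in microseconds
# After a 100 us idle period, roughly 90% of the population survives:
print(excited_population(100.0, T1))
```

With a millisecond-scale T1, a qubit can sit through thousands of microsecond-scale gate operations before relaxation becomes the dominant error source.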

These major results show that IQM's fabrication technology has matured and is ready to support the next generation of IQM's high-performance quantum processors. The results follow IQM's recent benchmark announcements and indicate significant potential for further advancements in gate fidelities, essential for fault-tolerant quantum computing and processors with higher qubit counts.

The improvements in these two characteristics, two-qubit gate fidelity and coherence time, allow the quantum computer to be developed for more complex use cases. The significance of these results stems from the fact that only a very few organisations have achieved comparable performance before.

The results were achieved through innovations in materials and fabrication technology and required top-notch performance across all components of the quantum computer, including QPU design, control optimisation, and system engineering.

"This achievement cements our tech leadership in the industry. Our quantum processor quality is world-class, and these results show that we have a good opportunity of going beyond that," said Dr. Juha Hassel, the Vice President of Engineering at IQM Quantum Computers.

Hassel explained that the company is on track with its technology roadmap and is actively exploring potential use cases in machine learning, cybersecurity, route optimisation, quantum sensor simulation, chemistry, and pharmaceutical research.

This announcement comes on the heels of the launch of Germany's first hybrid quantum computer at the Leibniz Supercomputing Centre in Munich, for which IQM led the integration with its 20-qubit quantum processing unit, and the opening of the IQM quantum data centre in Munich.

The rest is here:
IQM Quantum Computers Achieves Technological Milestones With 99.9% 2-Qubit Gate Fidelity And 1 Millisecond Coherence Time - The Quantum Insider

Think big: Computer the size of Suncorp Stadium to take shape near airport – Brisbane Times

According to the 2023 Queensland Quantum and Advanced Technologies Strategy, there will be 8700 jobs in quantum computing in Australia by 2030 in the fields of energy, decarbonisation, health and biotechnology, defence and aerospace.

The five universities are the University of Queensland, Griffith University, Queensland University of Technology, University of Southern Queensland and the University of the Sunshine Coast.

PsiQuantum, based in Palo Alto, California, was founded in 2016 by two UQ graduates, Jeremy O'Brien and Terry Rudolph, while they worked at the University of Bristol.

A quantum computer is designed to solve complex problems in chemistry, maths and physics beyond the scope of conventional computers.

Quantum computers could revolutionise the development of drugs, materials and sustainable energy solutions, unlocking innovations that would otherwise remain unreachable.


UQ vice-chancellor Professor Deborah Terry said the impact of quantum physics on education would stretch from high schools into research.

"Students starting high school this year will graduate into a world with utility-scale quantum computers," Terry said.

"We will work with PsiQuantum across the education spectrum, from schools, through TAFE, to universities, to prepare our students for future jobs in quantum and advanced technologies."

Griffith University's vice-chancellor, Professor Carolyn Evans, said the consortium would be a new frontier for students.

Read more from the original source:
Think big: Computer the size of Suncorp Stadium to take shape near airport - Brisbane Times

Why every quantum computer will need a powerful classical computer – Ars Technica

A single logical qubit is built from a large collection of hardware qubits.

One of the more striking things about quantum computing is that the field, despite not having proven itself especially useful, has already spawned a collection of startups that are focused on building something other than qubits. It might be easy to dismiss this as opportunism: trying to cash in on the hype surrounding quantum computing. But it can be useful to look at the things these startups are targeting, because they can be an indication of hard problems in quantum computing that haven't yet been solved by any one of the big companies involved in that space, companies like Amazon, Google, IBM, or Intel.

In the case of a UK-based company called Riverlane, the unsolved piece that is being addressed is the huge amount of classical computations that are going to be necessary to make the quantum hardware work. Specifically, it's targeting the huge amount of data processing that will be needed for a key part of quantum error correction: recognizing when an error has occurred.

All qubits are fragile, tending to lose their state during operations, or simply over time. No matter what the technology (cold atoms, superconducting transmons, whatever), these error rates put a hard limit on the amount of computation that can be done before an error is inevitable. That rules out doing almost any useful computation directly on existing hardware qubits.

The generally accepted solution to this is to work with what are called logical qubits. These involve linking multiple hardware qubits together and spreading the quantum information among them. Additional hardware qubits are linked in so that they can be measured to monitor errors affecting the data, allowing them to be corrected. It can take dozens of hardware qubits to make a single logical qubit, meaning even the largest existing systems can only support about 50 robust logical qubits.
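The scaling of hardware qubits per logical qubit can be sketched for the surface code, the most commonly discussed error-correction scheme (the article doesn't name a specific code, so this is an illustrative assumption): a distance-d surface code uses d×d data qubits plus d×d−1 measurement qubits.

```python
def surface_code_qubits(d: int) -> int:
    """Physical qubits for one distance-d surface-code logical qubit:
    d*d data qubits plus d*d - 1 ancilla (measurement) qubits."""
    return 2 * d * d - 1

# Even a small code distance needs dozens of physical qubits:
print(surface_code_qubits(5))   # 49
# A logical qubit of nearly 900 hardware qubits corresponds to d = 21:
print(surface_code_qubits(21))  # 881
```

Higher code distance suppresses logical errors more strongly but grows the qubit count quadratically, which is why logical-qubit counts lag so far behind raw hardware-qubit counts.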

Riverlane's founder and CEO, Steve Brierley, told Ars that error correction doesn't only stress the qubit hardware; it stresses the classical portion of the system as well. Each of the measurements of the qubits used for monitoring the system needs to be processed to detect and interpret any errors. We'll need roughly 100 logical qubits to do some of the simplest interesting calculations, meaning monitoring thousands of hardware qubits. Doing more sophisticated calculations may mean thousands of logical qubits.

That error-correction data (termed syndrome data in the field) needs to be read between each operation, which makes for a lot of data. "At scale, we're talking a hundred terabytes per second," said Brierley. "At a million physical qubits, we'll be processing about a hundred terabytes per second, which is Netflix global streaming."

It also has to be processed in real time; otherwise, computations will be held up waiting for error correction to happen. For transmon-based qubits, syndrome data is generated roughly every microsecond, so real time means completing the processing of the data, possibly terabytes of it, at a frequency of around a megahertz. And Riverlane was founded to provide hardware that's capable of handling it.
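The arithmetic behind those figures can be checked with a back-of-envelope sketch. A million physical qubits measured once per microsecond yields 10^12 measurements per second; at a single decoded bit per measurement that is about 125 GB/s, so the "hundred terabytes per second" Brierley quotes implies streaming substantially more raw data per measurement (for example, unprocessed digitized readout samples). The bytes-per-measurement figure below is an assumption, not a number from the article:

```python
def syndrome_rate_bytes(n_qubits: int, rounds_per_s: float,
                        bytes_per_measurement: float) -> float:
    """Back-of-envelope syndrome data rate in bytes/second.
    bytes_per_measurement is an assumption: 1/8 for a single decoded
    bit, far more if raw digitized readout traces are streamed."""
    return n_qubits * rounds_per_s * bytes_per_measurement

# One syndrome round per microsecond across a million physical qubits:
per_bit = syndrome_rate_bytes(1_000_000, 1e6, 1 / 8)
print(per_bit / 1e9)  # 125.0 GB/s if each measurement is one bit
```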

The system the company has developed is described in a paper that it has posted on the arXiv. It's designed to handle syndrome data after other hardware has already converted the analog signals into digital form. This allows Riverlane's hardware to sit outside any low-temperature hardware that's needed for some forms of physical qubits.

That data is run through an algorithm the paper terms a "Collision Clustering decoder," which handles the error detection. To demonstrate its effectiveness, the company implemented it on a typical field-programmable gate array (FPGA) from Xilinx, where it occupies only about 5 percent of the chip but can handle a logical qubit built from nearly 900 hardware qubits (simulated, in this case).
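To make "decoding syndrome data" concrete, here is a toy decoder for a classical repetition code; this is not Riverlane's Collision Clustering algorithm, just a minimal illustration of how a syndrome (parity checks between neighbouring bits) localizes an error without reading the data bits directly:

```python
def repetition_syndrome(bits):
    """Syndrome of a classical repetition code: the parity of each
    adjacent pair.  A 1 flags a boundary where neighbours disagree."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def decode_single_error(syndrome):
    """Locate a single flipped bit from the syndrome (toy decoder).
    Two adjacent 1s pinpoint an interior flip; a lone 1 at an end
    flags the edge bit; an all-zero syndrome means no error."""
    flags = [i for i, s in enumerate(syndrome) if s == 1]
    if not flags:
        return None
    if len(flags) == 2 and flags[1] == flags[0] + 1:
        return flags[0] + 1
    return 0 if flags[0] == 0 else len(syndrome)

# Encode a 0 as five zeros, flip bit 2, and recover its position:
received = [0, 0, 1, 0, 0]
print(decode_single_error(repetition_syndrome(received)))  # 2
```

A real-time quantum decoder faces the same task at vastly larger scale: millions of such parity checks arriving every microsecond, matched and clustered into likely error chains before the next round lands.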

The company also demonstrated a custom chip that handled an even larger logical qubit, while only occupying a tiny fraction of a square millimeter and consuming just 8 milliwatts of power.

Both of these versions are highly specialized; they simply feed the error information for other parts of the system to act on. So, it is a highly focused solution. But it's also quite flexible in that it works with various error-correction codes. Critically, it also integrates with systems designed to control a qubit based on very different physics, including cold atoms, trapped ions, and transmons.

"I think early on it was a bit of a puzzle," Brierley said. "You've got all these different types of physics; how are we going to do this?" It turned out not to be a major challenge. "One of our engineers was in Oxford working with the superconducting qubits, and in the afternoon he was working with the ion trap qubits. He came back to Cambridge and he was all excited. He was like, 'They're using the same control electronics.'" It turns out that, regardless of the physics involved in controlling the qubits, everybody had borrowed the same hardware from a different field. (Brierley said it was a Xilinx radiofrequency system-on-a-chip built for 5G base station prototyping.) That makes it relatively easy to integrate Riverlane's custom hardware with a variety of systems.

See the original post:
Why every quantum computer will need a powerful classical computer - Ars Technica