Archive for the ‘Quantum Computer’ Category

Quantum error correction used to actually correct errors – Ars Technica

Quantinuum's H2 "racetrack" quantum processor. Image credit: Quantinuum

Today's quantum computing hardware is severely limited in what it can do by errors that are difficult to avoid. There can be problems with everything from setting the initial state of a qubit to reading its output, and qubits will occasionally lose their state while doing nothing. Some of the quantum processors in existence today can't use all of their individual qubits for a single calculation without errors becoming inevitable.

The solution is to combine multiple hardware qubits to form what's termed a logical qubit. This allows a single bit of quantum information to be distributed among multiple hardware qubits, reducing the impact of individual errors. Additional qubits can be used as sensors to detect errors and allow interventions to correct them. Recently, there have been a number of demonstrations that logical qubits work in principle.
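To make the idea concrete, here is a minimal sketch in plain Python (not the codes used in this work; all names are hypothetical) of the simplest error-correcting scheme, a three-qubit bit-flip repetition code. One logical bit is spread across three physical bits, and majority voting stands in for the syndrome measurement performed on real hardware:

```python
import random

def encode(logical_bit):
    """Spread one logical bit across three physical bits (repetition code)."""
    return [logical_bit] * 3

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: corrects any single bit-flip error."""
    return int(sum(bits) >= 2)

# Compare raw vs. encoded error rates over many trials.
p, trials = 0.05, 100_000
raw_errors = sum(apply_noise([0], p)[0] for _ in range(trials))
log_errors = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
print(f"physical error rate ~{raw_errors / trials:.4f}")  # ~p = 0.05
print(f"logical error rate  ~{log_errors / trials:.4f}")  # ~3p^2 = 0.007
```

Real codes, including the ones in this work, must also use ancilla qubits to read out error syndromes without collapsing the encoded quantum state, but the basic intuition is the same: the logical error rate falls faster than the physical one as long as single errors dominate.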

On Wednesday, Microsoft and Quantinuum announced that logical qubits work in more than principle. "We've been able to demonstrate what's called active syndrome extraction, or sometimes it's also called repeated error correction," Microsoft's Krysta Svore told Ars. "And we've been able to do this such that it is better than the underlying physical error rate. So it actually works."

Microsoft has its own quantum computing efforts, and it also acts as a service provider for other companies' hardware. Its Azure Quantum service allows users to write instructions for quantum computers in a hardware-agnostic manner and then run them on the offerings of four different companies, many of them based on radically different hardware qubits. This work, however, was done on one specific hardware platform: a trapped-ion computer from a company called Quantinuum.

We covered the technology behind Quantinuum's computers when the company was an internal project at industrial giant Honeywell. Briefly, trapped ion qubits benefit from a consistent behavior (there's no device-to-device variation in atoms), ease of control, and relative stability. Because the ions can be moved around easily, it's possible to entangle any qubit with any other in the hardware and to perform measurements on them while calculations are in progress. "These are some of the key capabilities: the two-qubit gate fidelities, the fact that you can move and have all the connectivity through movement, and then mid-circuit measurement," Svore told Ars.

Quantinuum's hardware does lag in one dimension: the total number of qubits. While some of its competitors have pushed over 1,000 qubits, Quantinuum's latest hardware is limited to 32 qubits.

That said, a low error rate is valuable for this work. Logical qubits work by combining multiple hardware qubits. If each of those qubits has a high enough error rate, combining them increases the probability that errors will crop up faster than they can be corrected. So the error rate has to be below a critical threshold for error correction to work. And existing qubit technologies seem to be at that point, albeit barely. Initial work in this area had either barely detected the impact of error correction or had simply registered the errors without correcting them.
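The threshold behavior can be made concrete with the standard heuristic scaling law for code-based error correction; the constants below are illustrative, not measured values from this experiment:

```python
# Heuristic scaling: p_L ~ A * (p / p_th)^((d+1)/2), where d is the code
# distance and p_th the threshold. A, p_th, and d here are illustrative.
def logical_error_rate(p, p_th=0.01, d=3, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

for p in (0.005, 0.02):            # one point below, one above threshold
    for d in (3, 5, 7):
        print(f"p={p}, d={d}: p_L ~ {logical_error_rate(p, d=d):.2e}")
# Below threshold (p=0.005), increasing d suppresses the logical rate;
# above it (p=0.02), increasing d makes the logical rate worse.
```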

As the draft of a new manuscript describing this work puts it, "To the best of our knowledge, none of these experiments have demonstrated logical error rates better than the physical error rates."

Microsoft is also well-positioned to be doing this work. Its role requires it to translate generic quantum code into the commands that run on Quantinuum's hardware, which includes acting as a compiler provider. And in at least part of this work, it used this knowledge to optimize the code specifically to cut down on the time spent moving ions around.

The work involved three experiments. In the first, the researchers formed a logical qubit with seven information-holding hardware qubits and three ancillary qubits for error detection and correction. The 32 qubits in the hardware allowed two of these to be created; they were then entangled, which required two gate operations. Errors were checked for during the initialization of the qubits and after the entanglement. These operations were performed thousands of times to derive error rates.

On individual hardware qubits, the error rate was 0.50 percent. When error correction was included, this rate dropped to 0.05 percent. But the system could do even better if it identified readings that indicated difficult-to-interpret error states and discarded those calculations. Doing the discarding dropped the error rate to 0.001 percent. These instances were rare enough that the team didn't have to throw out a significant number of operations, but they still made a huge difference in the error rate.
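The discarding step is a form of post-selection: runs whose syndrome readings signal an ambiguous error state are thrown out rather than corrected. Here is a sketch of how such rates might be tallied from shot records; the counts and field names below are synthetic, not Microsoft's actual data format:

```python
# Synthetic shot records: each run succeeded, failed, or produced an
# ambiguous syndrome. Counts are invented for illustration only.
shots = ([{"ok": True, "ambiguous": False}] * 9940
         + [{"ok": False, "ambiguous": False}] * 5
         + [{"ok": False, "ambiguous": True}] * 55)

corrected_rate = sum(not s["ok"] for s in shots) / len(shots)
kept = [s for s in shots if not s["ambiguous"]]        # post-selection
postselected_rate = sum(not s["ok"] for s in kept) / len(kept)
discard_fraction = 1 - len(kept) / len(shots)

print(f"error rate, correction only: {corrected_rate:.4%}")
print(f"error rate, post-selected:   {postselected_rate:.4%}")
print(f"fraction of runs discarded:  {discard_fraction:.2%}")
```

The article's point survives even in these toy numbers: because ambiguous syndromes are rare, only a small fraction of runs is sacrificed for a large gain in the measured error rate.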

Next, the team switched to what they call a "Carbon code," which requires 30 physical qubits (24 data and six correction/detection), meaning the hardware could only host one. But the code was also optimized for the hardware. "Knowing the two-qubit gate fidelities, knowing how many interaction zones, how much parallelism you can have, we then optimize our error-correction codes for that," Svore said.

The Carbon code also allows the identification of errors that are difficult to correct properly, allowing those results to be discarded. With error correction and discarding of difficult-to-fix errors, the error rate dropped from 0.8 percent to 0.001 percent, a factor-of-800 difference.

Finally, the researchers performed repeated rounds of gate operations followed by error detection and correction on a logical qubit using the Carbon code. These again showed a major improvement thanks to error correction (about an order of magnitude) after one round. By the second round, however, error correction had only cut the error rate in half, and any effect was statistically insignificant by round three.
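A toy model shows why the advantage erodes: if each round of correction leaves a growing residual, the per-round logical error rate creeps up while the physical rate stays constant, and the compounded advantage shrinks. All numbers below are invented to mimic the qualitative trend reported, not the paper's data:

```python
# Survival after n rounds: failure compounds as 1 - prod(1 - p_round).
p_phys_round = 0.008                    # constant per-round physical rate
p_log_rounds = [0.0008, 0.007, 0.016]   # logical rate worsening per round

phys_fail = log_fail = 0.0
for n, p_log in enumerate(p_log_rounds, start=1):
    phys_fail = 1 - (1 - phys_fail) * (1 - p_phys_round)
    log_fail = 1 - (1 - log_fail) * (1 - p_log)
    print(f"round {n}: physical {phys_fail:.4f}, logical {log_fail:.4f}, "
          f"advantage x{phys_fail / log_fail:.1f}")
# Prints an advantage of roughly 10x, then ~2x, then ~1x, echoing the
# reported decay from round one to round three.
```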

So while the results tell us that error correction works, they also indicate that our current hardware isn't yet sufficient to allow for the extended operations that useful calculations will require. Still, Svore said, "I think this marks a critical milestone on the path to more elaborate computations that are fault tolerant and reliable" and emphasized that it was done on production commercial hardware rather than a one-of-a-kind academic machine.

Read this article:
Quantum error correction used to actually correct errors - Ars Technica

Redefining Quantum Communication: Researchers Have Solved a Foundational Problem in Transmitting Quantum … – SciTechDaily

Researchers from the Institute of Industrial Science, The University of Tokyo have solved a foundational problem in transmitting quantum information, which could dramatically enhance the utility of integrated circuits and quantum computing. Credit: Institute of Industrial Science, The University of Tokyo

Quantum electronics represents a significant departure from conventional electronics. In traditional systems, information is stored in binary digits (bits). In contrast, quantum electronics uses qubits for storage, which can take various forms, including electrons trapped in nanostructures known as quantum dots. Nonetheless, transmitting information beyond the adjacent quantum dot poses a substantial challenge, which limits the design possibilities for qubits.

Now, in a study recently published in Physical Review Letters, researchers from the Institute of Industrial Science at the University of Tokyo have tackled this problem: they developed a new technology for transmitting quantum information over distances of perhaps tens to a hundred micrometers. This advance could improve the functionality of upcoming quantum electronics.

How can researchers transmit quantum information from one quantum dot to another on the same quantum computer chip? One way might be to convert electron (matter) information into light (electromagnetic wave) information by generating light-matter hybrid states. Previous work, however, has been incompatible with the single-electron requirements of quantum information processing. The research team's goal was to improve high-speed quantum information transmission in a way that is more flexible in design and compatible with currently available semiconductor fabrication tools.

"In our work, we couple a few electrons in the quantum dot to an electrical circuit known as a terahertz split-ring resonator," explains Kazuyuki Kuroyama, lead author of the study. "The design is simple and suitable for large-scale integration."

Previous work has been based on coupling the resonator with an ensemble of thousands to tens of thousands of electrons; in fact, the coupling strength derives from the large size of this ensemble. In contrast, the present system confines only a few electrons, which is suitable for quantum information processing. Nevertheless, because both the electrons and the terahertz electromagnetic waves are confined to an ultra-small area, the coupling strength is comparable to that of many-electron systems.
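The scaling at work here is the textbook collective coupling relation from cavity quantum electrodynamics; this is standard background, not a formula quoted from the paper:

```latex
% Standard cavity-QED scaling (background, not quoted from the paper).
% N electrons couple collectively, while shrinking the resonator's mode
% volume V boosts the single-electron coupling g_1 itself:
\[
  g_N = g_1 \sqrt{N},
  \qquad
  g_1 \propto \sqrt{\frac{\hbar \omega}{2 \varepsilon_0 V}}
\]
% A terahertz split-ring resonator concentrates the field into a tiny V,
% so a few electrons can reach a coupling comparable to that of an
% ensemble of thousands relying on the sqrt(N) enhancement.
```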

"We're excited because we use structures that are widespread in advanced nanotechnology and are commonly integrated into semiconductor manufacturing to help solve a practical quantum information transmission problem," says Kazuhiko Hirakawa, senior author. "We also look forward to applying our findings to understanding the fundamental physics of light-electron coupled states."

This work is an important step toward solving a previously vexing problem in transmitting quantum information, one that has limited the applications of laboratory findings. In addition, such light-matter interconversion is regarded as one of the essential architectures for large-scale quantum computers based on semiconductor quantum dots. Because the researchers' results are based on materials and procedures that are common in semiconductor manufacturing, practical implementation should be straightforward.

Reference: Coherent Interaction of a Few-Electron Quantum Dot with a Terahertz Optical Resonator by Kazuyuki Kuroyama, Jinkwan Kwoen, Yasuhiko Arakawa and Kazuhiko Hirakawa, 9 February 2024, Physical Review Letters. DOI: 10.1103/PhysRevLett.132.066901

Follow this link:
Redefining Quantum Communication: Researchers Have Solved a Foundational Problem in Transmitting Quantum ... - SciTechDaily

Quantum Computing Leaps Forward with Groundbreaking Error Correction – yTech

In a significant advancement for quantum computing, Microsoft and Quantinuum have announced a major milestone that might represent the most stable quantum capabilities observed so far. Microsoft's approach allows a quantum computer to self-correct, achieving an unprecedented level of reliability with no errors across thousands of tests.

The essence of quantum computing comes from its basic unit, the qubit, which offers the potential to handle complex calculations at speeds incomprehensible to traditional computers. However, qubits are also prone to errors due to environmental factors. To address this, error-correction techniques are essential, and Microsoft and Quantinuum have made headway in this domain.

Microsoft has developed an innovative algorithm capable of correcting qubit-generated errors in Quantinuum's system, resulting in a dramatically reduced error rate. By encoding 30 physical qubits into four highly reliable logical qubits, the team not only demonstrated a notable decline in error occurrence; the logical qubits also proved resilient enough to correct arising issues without being compromised.

This advancement, while impressive, is only a stepping stone, as the real-world applications of quantum computing will require over a hundred logical qubits. The outcomes of this experiment are yet to be scrutinized by the larger scientific community, but they inject optimism into quantum research, indicating that practical quantum computing is drawing closer.

This collaboration between Microsoft and Quantinuum is pushing the boundaries of the quantum ecosystem and may soon revolutionize fields from scientific research to energy security, embodying a landmark in the evolution of computing technology.

Quantum Computing: Industry Insights and Market Forecasts

Quantum computing represents a transformative leap in computational capabilities, offering the promise of solving complex problems far beyond the reach of current supercomputers. This emerging industry is characterized by its potential to revolutionize various fields, including cryptography, materials science, pharmaceuticals, and finance, by performing calculations at unprecedented speeds.

Market forecasts suggest that the quantum computing industry is on a trajectory of rapid expansion. According to recent research, the global quantum computing market is expected to grow substantially over the next decade, attributed to increased investments from both private and public sectors, advancements in quantum algorithms and error correction, and a growing demand for solving complex computational problems. The financial investment in quantum computing research and development is significant, with tech giants and startups alike racing to achieve breakthroughs that could grant them an edge in this potentially lucrative market.

Overcoming Industry Challenges

Despite the significant advancements made by Microsoft and Quantinuum, the quantum computing industry faces multiple challenges. One of the most prominent is achieving scalable error correction, which is necessary to build practical and reliable quantum computers. The successful error-correcting algorithm developed by Microsoft addresses one part of this complex puzzle, yet scaling up to a large number of logical qubits without incurring prohibitive costs or excessive complexity remains a technical hurdle.

Temperature control is another issue, as quantum processors need to be kept at extremely low temperatures to minimize environmental disturbances. Additionally, the coherence time, or the duration for which qubits maintain their quantum state, is a key factor that needs to be extended to allow for more complex and extended computations.

Protecting quantum information against decoherence and maintaining robustness against errors are critical focus areas for researchers. As the technology matures, the industry will also have to tackle broader issues such as standardization, establishing quantum-safe security protocols, and developing a skilled workforce capable of pushing the boundaries of quantum computer science.

Revolutionizing Fields and Future Potential

The potential applications of quantum computing are vast, and the improvements in error correction shown by Microsoft and Quantinuum are significant steps towards unlocking this potential. In healthcare, for example, quantum computing could enable the design of more effective drugs by accurately simulating complex molecules. In finance, quantum algorithms could optimize portfolios by evaluating countless scenarios simultaneously. For climate and energy, quantum computers may model new materials for better solar cells or more efficient batteries, contributing to sustainable energy solutions.

With industry leaders like Microsoft and their partners demonstrating a more stable quantum future, practical application within these fields becomes increasingly feasible, ushering in a new era of innovation and discovery. The benefits of quantum computing will only be fully realized once the technology becomes widely accessible, leading to a paradigm shift in the way we approach and solve the world's most challenging problems.

For further reading and staying updated on the progress of the quantum computing industry, you may wish to visit the websites of leading tech companies and research institutions. Links to a few of them are provided below:

IBM, Google, Intel, Honeywell

Please keep in mind when exploring these resources that the quantum computing landscape is rapidly evolving, and new advancements or collaborations could emerge at any point.


Originally posted here:
Quantum Computing Leaps Forward with Groundbreaking Error Correction - yTech

It’s time for colos and wholesalers to start looking at quantum computing customers – DatacenterDynamics

We're not there yet, but we could be on the cusp of quantum computers being deployed in data centers alongside live customer hardware. With a step-change in computing on the horizon, data center operators should start looking at the reality of deploying quantum computers in live environments.

While still a nascent technology, quantum computers have the potential to revolutionize computing.

By exploiting quantum mechanics, these systems potentially offer a route to supercomputing power on a scale never seen before, in a much smaller and more efficient footprint than traditional silicon-based systems (known as classical computing).

Wholesale and colocation data center providers may be long used to hosting traditional supercomputers. Increased power and cooling requirements aside, even the most powerful supercomputing hardware is still designed and built to conform to today's data center environments when it comes to size, shape, operation, and maintenance.

Quantum computing, however, represents an entirely different paradigm, and the foibles of quantum technologies mean just asking about racks and density won't cut it.

Today, almost all quantum computers sit within dedicated science labs.

However, as quantum computers grow in power, interest in deploying these systems is growing. Enterprises are looking at using them for real-world applications in their operations, while quantum startups are looking to deploy systems in data centers close to potential customers' existing IT hardware. Neither yet has experience of what that means in the real world outside a controlled lab environment.

Creating the environments necessary for quantum effects to take place requires wildly different technologies and supporting infrastructure from classical computing, yet quantum systems still have to be located close to classical ones to allow data to flow between the two. This presents challenges for companies that have never had to accommodate such novel technology in live commercial spaces, often in halls occupied by multiple customers.

Due to the limitations of physics, the powerful quantum computers that might surpass today's classical supercomputers and offer what is known as quantum supremacy are much larger and are highly unlikely to fit within the traditional form factors data center operators are used to handling. The sensitive and precise nature of the technology means quantum computers must be isolated enough from existing IT hardware to avoid interference, yet still close enough to integrate with those systems.

Many quantum computers rely on supercooling via liquid nitrogen and helium-3, two difficult-to-handle and potentially hazardous substances that have never previously been used within data center environments. The housing for supercooled systems is also not what you'd call rack-friendly; the largest Kide system from supplier Bluefors can cool around 1,000 qubits: the system is just under three meters in height and 2.5 meters in diameter, and the floor beneath it needs to be able to take about 7,000 kilograms of weight.
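For a sense of what that means for facility planning, here is a quick back-of-the-envelope check using the figures quoted above (the uniform-floor-loading assumption is ours, and ratings vary by facility):

```python
import math

# Kide figures quoted above; uniform floor loading is an assumption.
mass_kg = 7000.0
diameter_m = 2.5
footprint_m2 = math.pi * (diameter_m / 2) ** 2      # ~4.9 m^2
print(f"floor load ~{mass_kg / footprint_m2:,.0f} kg/m^2")  # ~1,426 kg/m^2
# That is at or beyond the loading many raised floors are rated for,
# which is one reason such systems may need slab-floor space.
```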

Other quantum technologies rely on high-powered lasers and sensitive optical technologies. Though not as tall or heavy as supercooled systems, these optical table-based machines have much larger footprints and need to be mechanically isolated.

The change is coming. Oxford Quantum Circuits has already deployed two quantum computers in colocation data centers: one in the UK in a Cyxtera facility outside London, and one in an Equinix facility in Tokyo, Japan. The Cyxtera deployment, located in a former tape library, required modifications to the ceiling, safety measures, and operating procedures.

OQC has said it is looking at more colo deployments in the future, and now that the dam has broken, more quantum companies will no doubt look to follow suit.

GPU-heavy AI cloud providers like CoreWeave have been major leasers of data center capacity over the last 12 months as demand for AI-centric hardware spikes. Many operators are now offering combined liquid- and air-cooling facility designs to accommodate such deployments, while a few are going all-in on immersion cooling.

A similar opportunity for colocation/wholesale providers could well emerge once quantum computers become more readily available with higher qubit counts and production-ready use cases. But quantum computers will need their own accommodations on top of the requirements of high-density AI, often in the same building.

Quantum computing providers are all offering cloud access to quantum computers hosted in their on-premises labs. But as demand for access increases and these companies begin building more systems at greater scale, they could well need more dedicated space.

Some quantum providers told DCD they are already in discussions with colocation/wholesale providers and are exploring what dedicated space in a colo facility might look like. Others have told us they'd rather rely on the uptime expertise of established providers than try to learn how to build and operate SLA-bound facilities themselves on top of building physics-bending computers.

If they don't want to be left behind, data center operators should start looking at, and understanding, the unique demands these potentially disruptive machines will place on current operations; how they are currently accommodated; and the issues that will need consideration in the near future if these machines are adopted at any significant scale.

Read more from the original source:
It's time for colos and wholesalers to start looking at quantum computing customers - DatacenterDynamics

Microsoft and Quantinuum boast quantum computing breakthrough – DIGIT.FYI

Microsoft and Quantinuum, a quantum computing firm, claim to have reached a seminal step in quantum computing, in what could be the most reliable quantum capabilities yet seen.

The machine boasts the ability to correct itself using Microsoft's qubit-virtualisation system; Microsoft says it ran 14,000 individual experiments without a single error.

Quantum computers can solve, at unprecedented speeds, computational problems that would take millions of years on a traditional silicon-based computer.

But quantum computers rely on qubits as their fundamental component, and qubits, despite their speed, can produce many errors if the environment is not optimal. To combat this, quantum computers often have error-correction techniques built in so that more reliable results are produced.

"Breakthroughs in quantum error correction and fault tolerance are important for realising the long-term value of quantum computing for scientific discovery and energy security," said Dr Travis Humble, director of the Quantum Science Centre at Oak Ridge National Laboratory.

Microsoft researchers wrote an algorithm to correct the errors produced by Quantinuum's qubits, resulting in "the largest gap between physical and logical error rates reported to date," Microsoft announced.

From 30 physical qubits, researchers were able to derive four logical qubits whose errors could be detected and fixed without the qubits being destroyed.

The error rate of these four logical qubits was also 800 times lower than that of the physical qubits.

"Today's results mark a historic achievement and are a wonderful reflection of how this collaboration continues to push the boundaries for the quantum ecosystem," said Ilyas Khan, founder and chief product officer at Quantinuum.

"With Microsoft's state-of-the-art error correction aligned with the world's most powerful quantum computer and a fully integrated approach, we are so excited for the next evolution in quantum applications and can't wait to see how our customers and partners will benefit from our solutions, especially as we move towards quantum processors at scale."

The major step has yet to be scrutinised by the wider scientific community, however. Further, quantum computers will likely need 100 or more logical qubits to tackle the most relevant scientific problems currently facing us. Still, the results are promising for wider quantum computing research.

Read more here:
Microsoft and Quantinuum boast quantum computing breakthrough - DIGIT.FYI