Archive for the ‘Quantum Computer’ Category

Quantum supercomputing: IBM plots roadmap beyond Condor – ComputerWeekly.com

IBM has bolstered its supercomputing capabilities with the latest iteration of the company's quantum computer, Quantum System Two. It is the company's first modular quantum computer and represents the cornerstone of IBM's quantum-centric supercomputing architecture.

The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM senior vice-president and director of research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners, who will push the boundaries of more complex problems."

Following the company's quantum computing roadmap, IBM also unveiled Condor, a 1,121-qubit superconducting quantum processor based on what IBM calls cross-resonance gate technology.

According to IBM, Condor offers a 50% increase in qubit density and advances in qubit fabrication and laminate size, as well as over a mile of high-density cryogenic flex input/output wiring within a single dilution refrigerator. The new design is said to solve the challenge of scale and will inform IBM's future hardware designs.

Along with the new hardware, IBM unveiled an extension of its IBM Quantum Development Roadmap to 2033, where it plans to significantly advance the quality of gate operations. If it achieves its roadmap objectives, IBM said it will be able to increase the size of quantum circuits that can be run, which paves the way to realising the full potential of quantum computing at scale.

In a blog post giving an update on IBM's quantum computing plans, Jay Gambetta, vice-president of IBM Quantum, discussed experiments that demonstrate how quantum computers could run circuits beyond the reach of brute-force classical simulations. "Quantum is now a computational tool, and what makes me most excited is that we can start to advance science in fields beyond quantum computing itself," he said.

But in the computational architecture Gambetta described, quantum technology will not run standalone. "From these large-scale experiments, it has become clear that we must go beyond the traditional circuit model and take advantage of parallelism, concurrent classical computing and dynamic circuits," he said.

"We have ample evidence that, with tools such as circuit knitting, we can enhance the reach of quantum computation, and new quantum algorithms are emerging that make use of multiple quantum circuits, potentially in parallel and with concurrent classical operations," said Gambetta. "It's clear that a heterogeneous computing architecture consisting of scalable and parallel circuit execution and advanced classical computation is required."

This, he said, is IBMs vision for future high-performance systems, which he described as quantum-centric supercomputing.

Read more from the original source:
Quantum supercomputing: IBM plots roadmap beyond Condor - ComputerWeekly.com

New quantum chip, modular computer and SDK revealed by IBM – The Stack

IBM has revealed a new utility-scale quantum processor and a landmark modular quantum computer, and teased the coming release of Qiskit 1.0, a significantly improved open-source software development kit for building powerful quantum circuits with comparative ease.

Extending its quantum computing roadmap out to 2033, meanwhile, Big Blue pledged to release Blue Jay, a system capable of executing one billion gates across 2,000 qubits by 2033, which it described as "a nine order-of-magnitude increase in performed gates since we put our first device on the cloud in 2016".

The trio of releases, made at the annual IBM Quantum Summit in New York, came six months after the company said it had successfully worked around the quantum noise that introduces errors into calculations to get reliable results at a scale beyond brute-force classical computation, detailing that progress in a paper published in the journal Nature.

"The techniques that enabled this represent a foundational tool for the realization of near-term quantum applications," IBM said in June 2023.

Classical computing deploys bits that use the 0 and 1 vocabulary of binary code. Quantum computers use qubits that draw on two-state quantum-mechanical systems: the ability of quantum particles to be in superposition, two different states at the same time.

As IBM Research's Edwin Pednault puts it: "A qubit can represent both 0 and 1 simultaneously, in fact, in weighted combinations; for example, 37%-0, 63%-1. Three qubits can represent 2^3, or eight values simultaneously: 000, 001, 010, 011, 100, 101, 110, 111; 50 qubits can represent over one quadrillion values simultaneously."
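Pednault's arithmetic can be checked in a few lines of NumPy (an illustrative sketch, not IBM code): the squared magnitudes of a qubit's amplitudes give its measurement probabilities, and n qubits span 2^n basis states.

```python
import numpy as np

# A single qubit in the "37%-0, 63%-1" state: the amplitudes'
# squared magnitudes are the measurement probabilities.
psi = np.array([np.sqrt(0.37), np.sqrt(0.63)])
probs = np.abs(psi) ** 2                  # -> [0.37, 0.63]

# n qubits span 2**n basis states: 3 -> 8, 50 -> over one quadrillion.
for n in (3, 50):
    print(n, "qubits ->", 2 ** n, "basis states")
```

Note that the amplitudes, not the percentages themselves, are what a quantum state stores; the Born rule squares them to recover the probabilities.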

Whilst classical computing circuits use ANDs, ORs, NOTs and XORs (binary gates), on top of which users build higher-level instructions and then support for languages like Java and Python, quantum computers use different kinds of gates, such as CNOTs and Hadamards.
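These quantum gates are just small unitary matrices. A minimal NumPy sketch (illustrative, not tied to any vendor's SDK) shows the Hadamard putting a qubit into superposition and the CNOT acting on a two-qubit state:

```python
import numpy as np

# Hadamard: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# CNOT: flips the target qubit when the control qubit is 1
# (basis order |00>, |01>, |10>, |11>; control = first qubit).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1.0, 0.0])
superposed = H @ ket0                     # [0.707..., 0.707...]
print(superposed)
```

Applying CNOT twice returns the state to where it started, since the gate is its own inverse, a property shared by all the gates named above.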

For quantum computing to work effectively, calculations need to remain in superposition for the duration of the computational cycle. But they can easily be thrown off by noise (the central obstacle to building large-scale quantum computers), which can stem from diverse sources, including disturbances in Earth's magnetic field, local radiation, cosmic rays, or the influence that qubits exert on each other through proximity.

This is tackled in part physically: signals for configuring and programming a quantum computer come from outside the machine and travel down coaxial cables, where they are amplified and filtered, eventually reaching the quantum device, whose qubits sit at ~0.015K (-273.135 degrees C). Noise is tackled by minimising the exposure of the chips and cables to heat and electromagnetic radiation in all its forms, by minimising device defects, by constantly improving the performance of the electronics, and by using novel mathematical schemes to compensate for noise.

The Stack reviewed the three new releases and associated academic papers to distil for our readers precisely what IBM has achieved and aims to achieve, as Dario Gil, IBM SVP and director of research, pledged on December 4 to "further increase the quality of a utility-scale quantum technology stack".

At the heart of IBM Quantum System Two, a new modular quantum computer and the cornerstone of IBM's quantum-centric supercomputing architecture, is the new 133-qubit Quantum Heron processor. (This summer's quantum achievements highlighted above were made on IBM's previous generation of processor, the Quantum Eagle.)

The Quantum Heron offers a fivefold improvement over the previous error-rate records set by IBM Eagle, IBM said. It is making the new chips available to users today via the cloud, with more of the chips to join a utility-scale fleet of systems over the next year.

"Featuring 133 fixed-frequency qubits with tunable couplers, Heron yields a 3-5x improvement in device performance over its 127-qubit Eagle processors, and virtually eliminates cross-talk," IBM's Gil said, adding: "We have developed a qubit and the gate technology that we're confident will form the foundation of our hardware roadmap going forward."

(A coupler helps determine the performance of a superconducting quantum computer. Tunable couplers link qubits and perform quantum computations by turning on and off the coupling between them.)

The chip is built with whats known as a heavy-hex processor architecture in which each unit cell of the lattice consists of a hexagonal arrangement of qubits, with an additional qubit on each edge.

As analyst Paul Smith-Goodson notes: "The Heron's modular architecture is different from previous quantum processor architectures."

The new architecture connects quantum processors to a common control infrastructure so that data can flow classically and in real time between the QPU and other chips in a multi-chip environment.

It also uses a new multi-qubit gate scheme that is both faster and provides higher fidelity. "The Heron is the first IBM chip to use the new architecture that allows multiple processors to be linked using classical couplers to permit classical parallelization," he added.

The new modular IBM Quantum System Two meanwhile combines what Big Blue described as scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics. As the building block for IBM's quantum computing roadmap, it will house IBM's future generations of quantum processors and be accessible via the cloud.

The system gets updated middleware too: after six years of development, IBM is gearing up for the release of Qiskit 1.0 early in Q1 2024. (Qiskit is an open-source SDK, with extensive documentation for both the hardware and software layers, for working with quantum computers at the level of circuits, pulses, and algorithms; it ships with several domain-specific application APIs on top of its core module.)

IBM touted what it described as a stable Qiskit focused on programming with Patterns; a new set of AI-powered tools to help write and optimize Qiskit and QASM3 code; and the beta release of Quantum Serverless on the IBM Quantum Platform, which facilitates remote execution of Qiskit Patterns in a quantum-function style. Let's unpack this quantum verbiage.

A stable Qiskit is self-explanatory: after six years as a core SDK, Qiskit has become what IBM describes as the lingua franca of quantum computing, allowing programmers to write circuits and then execute them on hardware from more than eight different hardware manufacturers.

The 1.0 release adds stability and major improvements in the memory footprint of circuits (a claimed 55% decrease in memory usage compared with summer 2022's Qiskit 0.39, for example), among other improvements.

Qiskit Patterns, meanwhile, are a collection of tools to simply map classical problems, optimize them to quantum circuits using Qiskit, execute those circuits using Qiskit Runtime, and then postprocess the results. The release of a serverless execution option means users won't have to sit and wait over a stable network whilst a job is queued and executed; they can punt it out for managed execution, leave, and come back when the results are ready. Combined, IBM thinks these will democratise access to quantum computing and mean end-users do not need to be fluent in quantum circuits to utilize quantum computing.
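The four-stage Patterns workflow (map, optimize, execute, postprocess) can be sketched with a classical stand-in. The NumPy below simulates a tiny two-qubit circuit in place of real hardware and makes no calls to the actual Qiskit Runtime API:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Map: express the problem as a circuit -- here a 2-qubit Bell circuit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
I2 = np.eye(2)

# 2. Optimize: fold the gate list into a single unitary.
U = CNOT @ np.kron(H, I2)

# 3. Execute: sample 4,000 shots from the output distribution.
state = U @ np.array([1.0, 0.0, 0.0, 0.0])
probs = np.abs(state) ** 2                # [0.5, 0, 0, 0.5]
shots = rng.choice(4, size=4000, p=probs)

# 4. Postprocess: turn raw counts into an answer.
p11 = np.mean(shots == 3)                 # fraction of '11' outcomes
print(f"P(11) ~ {p11:.2f}")
```

On real hardware, stage 3 would be a queued Runtime job and stage 4 would typically include error mitigation; the division of labour, however, is the same.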

Quantum computing, of course, is not immune to the allure of LLMs, and IBM is also shipping a generative AI code assistant called Qiskit Code Assistant. It is based on the IBM Granite 20-billion-parameter code model for Qiskit, which was trained on about 370 million text tokens drawn from a large collection of Qiskit examples, and is designed to remove some of the heavy lifting for programmers as they explore the suite of new tools.

Qubits, meanwhile, remain some distance from being the go-to solution for traditional computational problems, but IBM has been, and continues to be, a genuine trailblazer in the quantum computing space and, as this summer's research showed, is making significant progress. A tipping point will arrive, and then the world will likely change. Those interested in exploring the shape of things to come could do worse than start with Qiskit.

Read the original:
New quantum chip, modular computer and SDK revealed by IBM - The Stack

Perceval Tech note: Introducing shots in a linear optic quantum computing framework – Medium

Article by Raksha Singla

The intrinsic characteristics of quantum computer hardware introduce a probabilistic element to their behavior. You've likely encountered this explanation repeatedly, and it essentially signifies that when an input state undergoes processing in a quantum circuit, the resulting output state is well defined by the state-evolution formula, but its measurement is probabilistic. Instead of getting direct access to the result of the calculation, as one would expect in classical computing, one obtains a single measurement of this output state, with the specific measurement following a probability distribution obtained from the quantum state through what is known as the Born rule. For a software developer, this feature can be seen as a major bottleneck compared to classical computation; on the other hand, it gives the developer indirect access to the huge computation space of quantum states, marking a revolutionary shift in the computing landscape and enabling the solution of problems that are hard for classical computing.

A shot represents a single execution of a quantum circuit and the corresponding data collected by the hardware at the output during that run. Given the probabilistic nature of the system, conducting multiple iterations (obtaining many shots) is necessary to gather data for statistical analysis of the algorithm's operation.

The concept of shots is universally embraced by most quantum providers and forms an essential element in any quantum algorithm. Its implementation and management can vary among different frameworks, depending on the specific characteristics of the hardware system and the compute space.

How do we define shots at Quandela?

Our computing architecture uses linear optical elements to form the quantum circuit, which processes states of photons (the input source) in the Fock space (for an explanation of the Fock space, see here). Detection of one or more photons by the detectors at the output defines a single execution of a quantum experiment, an act of computing corresponding to the processing of data encoded into the output photons.

Our source is a single-photon source emitting photons at a fixed rate into the chip implementing the circuit designed by a user. These photons may be absorbed at various points within the hardware. Nevertheless, whenever at least one photon is detected at the optical chip's output (termed a photon-coincidence event), it marks the end of a single execution, and the measured output constitutes a data sample. The count of such occurrences while running an algorithm is our shot count.

A user may not want to sample single-photon detections; they may specifically want samples with a certain number N (>1) of photon coincidences and request these as the output. In such cases, the system may need to be run for a number of shots exceeding the requested number of samples, since N-photon coincidences occur only in a fraction of executions. Recognizing this user preference, we have incorporated a tool in Perceval to estimate the necessary number of shots based on the user's desired sample count. This tool performs the estimation by considering the unitary of the implemented circuit, the input state, and the performance characteristics of the device.
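As a rough illustration of why more shots than samples are needed, here is a toy estimator that assumes each photon independently survives with a fixed transmission probability. This is a deliberate simplification: Perceval's actual tool accounts for the circuit unitary, the input state, and measured device characteristics.

```python
import math

def shots_needed(samples: int, transmission: float, n_photons: int) -> int:
    """Toy estimate: if each photon independently survives with
    probability `transmission`, an N-photon coincidence occurs with
    probability transmission**N, so the shot budget scales up by
    its inverse."""
    p_coincidence = transmission ** n_photons
    return math.ceil(samples / p_coincidence)

# 1,000 three-photon samples at 30% per-photon transmission:
print(shots_needed(1000, 0.30, 3))
```

Even this crude model shows the exponential cost of demanding higher coincidence orders: each extra required photon divides the coincidence rate by the transmission factor again.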

How will shots revolutionize our users' experience?

Access to Quantum State Distribution:

Given the definition of shots as the output detected during each circuit execution, they offer direct access to the probability distribution of a quantum state.

Predictable output rate:

In a photonic system characterized by instantaneous gate application and complex input-state preparation timing (see detail on demultiplexing here), the time at which a shot is captured, clearly marking the end of a single execution, varies with this input-state time sequence, the configured circuit, and the system's transmittance. Working with shots guarantees a predictable output rate independent of these fluctuations.

Simplified User Interactions:

The incorporation of shots not only seeks to standardize user interactions with running algorithms on our Quantum Processing Unit (QPU) through our cloud services but also provides them with a more standardized parameter for understanding their resource needs. This enhancement contributes to a clearer and more consistent measure.

Predictability for Time and Cost:

Shots, being highly predictable, offer the most reliable means to estimate the time and cost of running an algorithm.

This stability in parameter counting results in fixed pricing, ensuring fairness to users and independence from the variability of the performance of the physical QPU device.

The rest is here:
Perceval Tech note: Introducing shots in a linear optic quantum computing framework - Medium

Quantum computers: IBM reveals Condor and Heron quantum CPUs – Notebookcheck.net

The Heron QPU with 133 qubits (Source: IBM Research/Flickr)

IBM recently presented two new quantum processors (QPUs) at the Quantum Summit 2023: the Condor QPU with 1121 qubits and the smaller Heron QPU with 133 qubits. IBM also unveiled the modular System Two quantum supercomputer.

Just a year ago, technology giant IBM introduced the quantum processor Osprey, which has 433 qubits. This has now been succeeded by the IBM Condor, which boasts an impressive 1121 qubits. While the new Quantum Processing Unit (QPU) is forced to concede defeat to Atom Computing's 1180 qubits, it has nonetheless increased qubit density by more than 50% compared with its in-house predecessor.

At first glance, the second QPU, which IBM also presented on December 4 at the Quantum Summit 2023, appears less groundbreaking: the Heron quantum processor has 133 qubits and succeeds its Eagle predecessor, which had 127 qubits. While the IBM Condor QPU is primarily used to research how many qubits can fit on a quantum processor, the IBM Quantum Heron will be used in the modular quantum computer System Two.

IBM's first System Two, with three Heron QPUs, is reportedly already in operation at IBM's Yorktown Heights research site in the USA. Although the Heron quantum processor only offers 6 qubits more than its Eagle predecessor, the new 133-qubit QPU is said to deliver a three- to fivefold increase in performance over the Eagle QPU because quantum crosstalk has been virtually eliminated.

The IBM Quantum System Two in Yorktown Heights is 22 feet wide and 12 feet high, which is approximately 6.7 by 3.7 meters. This first System Two contains three Heron QPUs and computes in a near-perfect vacuum at temperatures lower than in space, namely below -270 degrees Celsius. This means that System Two presumably operates under conditions close to absolute zero at -273.15 degrees Celsius.



Read this article:
Quantum computers: IBM reveals Condor and Heron quantum CPUs - Notebookcheck.net

A Physicist Reveals the One Quantum Breakthrough That Could Disrupt Scientific Innovation – Inverse

Quantum advantage is the milestone the field of quantum computing is fervently working toward, where a quantum computer can solve problems that are beyond the reach of the most powerful non-quantum or classical computers.

Quantum refers to the scale of atoms and molecules, where the laws of physics as we experience them break down and a different, counterintuitive set of laws applies. Quantum computers take advantage of these strange behaviors to solve problems.

There are some types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms. Research in recent decades has shown that quantum computers have the potential to solve some of these problems. If a quantum computer can be built that actually does solve one of these problems, it will have demonstrated quantum advantage.

I am a physicist who studies quantum information processing and the control of quantum systems. I believe that this frontier of scientific and technological innovation not only promises groundbreaking advances in computation but also represents a broader surge in quantum technology, including significant advancements in quantum cryptography and quantum sensing.

Central to quantum computing is the quantum bit, or qubit. Unlike classical bits, which can only be in states of 0 or 1, a qubit can be in any state that is some combination of 0 and 1. This state of neither just 1 nor just 0 is known as a quantum superposition. With every additional qubit, the number of states that the qubits can represent doubles.

This property is often mistaken for the source of the power of quantum computing. Instead, it comes down to an intricate interplay of superposition, interference, and entanglement.

Interference involves manipulating qubits so that their states combine constructively during computations to amplify correct solutions and destructively to suppress wrong answers. Constructive interference is what happens when the peaks of two waves, like sound waves or ocean waves, combine to create a higher peak. Destructive interference is what happens when a wave peak and a wave trough combine and cancel each other out. Quantum algorithms, which are few and difficult to devise, set up a sequence of interference patterns that yield the correct answer to a problem.
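A concrete instance of interference (illustrative NumPy, not any vendor's SDK): two Hadamard gates in a row. The two amplitude paths leading to |1> arrive with opposite signs and cancel, so the qubit returns to |0> with certainty.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

# One Hadamard: equal superposition, 50/50 measurement odds.
mid = H @ ket0                     # [0.707..., 0.707...]

# A second Hadamard: the two paths to |1> carry opposite signs and
# cancel (destructive), while the paths to |0> add (constructive).
out = H @ mid                      # [1.0, 0.0] -- back to |0> with certainty
print(mid, out)
```

A classical coin flipped twice stays random; here the "randomness" of the first Hadamard is undone because amplitudes, unlike probabilities, can be negative.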

Entanglement establishes a uniquely quantum correlation between qubits: the state of one cannot be described independently of the others, no matter how far apart the qubits are. This is what Albert Einstein famously dismissed as "spooky action at a distance." Entanglement's collective behavior, orchestrated through a quantum computer, enables computational speed-ups that are beyond the reach of classical computers.
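Entanglement's perfect correlation can be seen in a small simulation (a classical sketch of the ideal Bell state, not real hardware): each qubit on its own looks like a fair coin flip, yet the two measured bits always agree.

```python
import numpy as np

rng = np.random.default_rng(7)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over |00>,|01>,|10>,|11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2

# Sample 1,000 joint measurements: the two bits always agree, even
# though each bit individually is 0 or 1 with probability 1/2.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
assert all(o in ("00", "11") for o in outcomes)
print(set(outcomes))
```

The correlation itself is classically describable after the fact; what has no classical counterpart is that it persists for every choice of measurement basis, which is what Bell-test experiments exploit.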

Quantum computing has a range of potential uses where it can outperform classical computers. In cryptography, quantum computers pose both an opportunity and a challenge. Most famously, they have the potential to decipher current encryption algorithms, such as the widely used RSA scheme.

One consequence of this is that todays encryption protocols need to be re-engineered to be resistant to future quantum attacks. This recognition has led to the burgeoning field of post-quantum cryptography. After a long process, the National Institute of Standards and Technology recently selected four quantum-resistant algorithms and has begun the process of readying them so that organizations around the world can use them in their encryption technology.

In addition, quantum computing can dramatically speed up quantum simulation: the ability to predict the outcome of experiments operating in the quantum realm. Famed physicist Richard Feynman envisioned this possibility more than 40 years ago. Quantum simulation offers the potential for considerable advancements in chemistry and materials science, aiding in areas such as the intricate modeling of molecular structures for drug discovery and enabling the discovery or creation of materials with novel properties.

Another use of quantum information technology is quantum sensing: detecting and measuring physical properties like electromagnetic energy, gravity, pressure, and temperature with greater sensitivity and precision than non-quantum instruments. Quantum sensing has myriad applications in fields such as environmental monitoring, geological exploration, medical imaging, and surveillance.

Initiatives such as the development of a quantum internet that interconnects quantum computers are crucial steps toward bridging the quantum and classical computing worlds. This network could be secured using quantum cryptographic protocols such as quantum key distribution, which enables ultra-secure communication channels that are protected against computational attacks including those using quantum computers.

Despite a growing application suite for quantum computing, developing new algorithms that make full use of the quantum advantage, in particular in machine learning, remains a critical area of ongoing research.

The quantum computing field faces significant hurdles in hardware and software development. Quantum computers are highly sensitive to any unintentional interactions with their environments. This leads to the phenomenon of decoherence, where qubits rapidly degrade to the 0 or 1 states of classical bits.

Building large-scale quantum computing systems capable of delivering on the promise of quantum speed-ups requires overcoming decoherence. The key is developing effective methods of suppressing and correcting quantum errors, an area my own research is focused on.

In navigating these challenges, numerous quantum hardware and software startups have emerged alongside well-established technology industry players like Google and IBM. This industry interest, combined with significant investment from governments worldwide, underscores a collective recognition of quantum technologys transformative potential. These initiatives foster a rich ecosystem where academia and industry collaborate, accelerating progress in the field.

Quantum computing may one day be as disruptive as the arrival of generative AI. Currently, the development of quantum computing technology is at a crucial juncture. On the one hand, the field has already shown early signs of having achieved a narrowly specialized quantum advantage. Researchers at Google and later a team of researchers in China demonstrated a quantum advantage for generating a list of random numbers with certain properties. My research team demonstrated a quantum speed-up for a random number guessing game.

On the other hand, there is a tangible risk of entering a quantum winter, a period of reduced investment if practical results fail to materialize in the near term.

While the technology industry is working to deliver quantum advantage in products and services in the near term, academic research remains focused on investigating the fundamental principles underpinning this new science and technology. This ongoing basic research, fueled by enthusiastic cadres of new and bright students of the type I encounter almost every day, ensures that the field will continue to progress.

This article was originally published on The Conversation by Daniel Lidar at the University of Southern California. Read the original article here.

See original here:
A Physicist Reveals the One Quantum Breakthrough That Could Disrupt Scientific Innovation - Inverse