Archive for the ‘Quantum Computer’ Category

QuantaMap raises 1.4 million to overcome hurdle in quantum – Innovation Origins

Dutch startup QuantaMap has secured 1.4 million in funding to improve the production of quantum computer chips, using a special microscope, the company said in a press release.

Quantum computers have the potential to solve problems that are impossible with current technologies, such as accelerating drug development and optimising logistics processes on an unprecedented scale.

But quantum chips are complex and difficult to produce. If they don't work as well as they should (and they often don't), there is no way to figure out why they failed, which part is at fault, or how to improve the production process. This is one of the biggest obstacles to scaling up quantum chips.

Quandela takes the quantum computer from lab to fab for first time

Quandela, the French pioneer of the photonic quantum computer, inaugurated the first factory for quantum computers in Europe.

QuantaMap has developed a new microscope that allows quantum researchers and chip manufacturers to accurately inspect each chip to ensure and improve quality. "Imagine if every quantum researcher and chip manufacturer had a finely tuned compass to navigate the unknown quantum landscape of their chips; that is what we are creating," explains QuantaMap's CEO Johannes Jobst.

Others in this sector offer traditional measurement solutions, but QuantaMap stands out for its combination of a cryogenic microscope technology (scanning a fine needle over the surface of the chip at extremely low temperatures) with quantum sensors, both of which are specifically tailored for quantum applications.

The technology is fine-tuned to the specific problems affecting the performance and production yield of quantum chips, particularly identifying the origin of electrical losses and impurities at the nanometre scale. It does this by imaging local temperature rises, electric currents and magnetic fields, all at low temperatures, so that the conditions under which the chip functions properly are maintained during inspection.

Competing technologies are all unsuitable for quantum chips, either because they interfere with the qubits while measuring, or because they have a high chance of damaging the chip during the imaging process.

"Our unique sensors, intellectual property and advanced, quantum-centric approach put us years ahead of emerging competing technologies," says Jobst.

QuantaMap was founded in November 2022 by Johannes Jobst, Kaveh Lahabi, Milan Allan and Jimi de Haan. At Leiden University, Lahabi and his research team invented the quantum sensor at the heart of QuantaMap's products. As there was a clear need for diagnostic tools for quantum computing, the team moved quickly to fill this gap in the market. QuantaMap aims to become the backbone of chip research, development and quality control in the quantum computing industry.

See the original post here:
QuantaMap raises 1.4 million to overcome hurdle in quantum - Innovation Origins

Quantum supercomputing: IBM plots roadmap beyond Condor – ComputerWeekly.com

IBM has bolstered its supercomputing capabilities with the latest iteration of the company's quantum computer, Quantum System Two. It's the company's first modular quantum computer, and represents the cornerstone of IBM's quantum-centric supercomputing architecture.

The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM senior vice-president and director of research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners, who will push the boundaries of more complex problems."

Following the company's quantum computing roadmap, IBM also unveiled Condor, a 1,121-qubit superconducting quantum processor based on what IBM calls cross-resonance gate technology.

According to IBM, Condor offers a 50% increase in qubit density and advances in qubit fabrication and laminate size, as well as over a mile of high-density cryogenic flex input/output wiring within a single dilution refrigerator. The new design is said to solve the challenge of scale, and will inform IBM's future hardware designs.

Along with the new hardware, IBM unveiled an extension of its IBM Quantum Development Roadmap to 2033, where it plans to significantly advance the quality of gate operations. If it achieves its roadmap objectives, IBM said it will be able to increase the size of quantum circuits that can be run, which paves the way to realising the full potential of quantum computing at scale.

In a blog post giving an update on IBM's quantum computing plans, Jay Gambetta, vice-president of IBM Quantum, discussed experiments that demonstrate how quantum computers could run circuits beyond the reach of brute-force classical simulations. "Quantum is now a computational tool, and what makes me most excited is that we can start to advance science in fields beyond quantum computing itself," he said.

But in the computational architecture Gambetta described, quantum technology will not run standalone. "From these large-scale experiments, it has become clear that we must go beyond the traditional circuit model and take advantage of parallelism, concurrent classical computing and dynamic circuits," he said.

"We have ample evidence that, with tools such as circuit knitting, we can enhance the reach of quantum computation, and new quantum algorithms are emerging that make use of multiple quantum circuits, potentially in parallel and with concurrent classical operations," said Gambetta. "It's clear that a heterogeneous computing architecture consisting of scalable and parallel circuit execution and advanced classical computation is required."

This, he said, is IBMs vision for future high-performance systems, which he described as quantum-centric supercomputing.

Read more from the original source:
Quantum supercomputing: IBM plots roadmap beyond Condor - ComputerWeekly.com

New quantum chip, modular computer and SDK revealed by IBM – The Stack

IBM has revealed a new utility-scale quantum processor, a landmark modular quantum computer, and teased the coming release of Qiskit 1.0, a significantly improved open-source software development kit for building powerful quantum circuits with comparative ease.

Extending its quantum computing roadmap out to 2033, meanwhile, Big Blue pledged to release Blue Jay, a system capable of executing 1 billion gates across 2,000 qubits by 2033: "a nine order-of-magnitude increase in performed gates since we put our first device on the cloud in 2016."

The trio of releases, made at the annual IBM Quantum Summit in New York, comes six months after the company said it had successfully worked around the quantum noise that introduces errors in calculations to get reliable results at a scale beyond brute-force classical computation, detailing that progress in a paper published in the journal Nature.

"The techniques that enabled this represent a foundational tool for the realization of near-term quantum applications," IBM said in June 2023.

Classical computing deploys bits that use the 0 and 1 vocabulary of binary code. Quantum computers use qubits that draw on two-state quantum-mechanical systems: the ability of quantum particles to be in superposition, i.e. in two different states at the same time.

As IBM Research's Edwin Pednault puts it: "A qubit can represent both 0 and 1 simultaneously, in fact, in weighted combinations; for example, 37%-0, 63%-1. Three qubits can represent 2^3, or eight values simultaneously: 000, 001, 010, 011, 100, 101, 110, 111; 50 qubits can represent over one quadrillion values simultaneously."
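The weighted combinations Pednault describes can be sketched in plain Python. The amplitudes and the three-qubit basis below are illustrative numbers chosen to match the quote, not output from any real device: measurement probabilities are the squared magnitudes of the state's amplitudes.

```python
import math

# Hypothetical single-qubit state weighted 37% toward 0 and 63% toward 1;
# probabilities are squared amplitude magnitudes (the Born rule),
# so they must sum to 1.
amp0 = math.sqrt(0.37)  # amplitude of |0>
amp1 = math.sqrt(0.63)  # amplitude of |1>
probabilities = [amp0 ** 2, amp1 ** 2]

# n qubits span 2**n basis states; for n = 3 that is the eight
# bit strings 000 through 111 listed in the quote.
n = 3
basis_states = [format(i, f"0{n}b") for i in range(2 ** n)]
```

For 50 qubits the same expression gives 2**50 basis states, slightly over one quadrillion, matching Pednault's figure.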

Whilst classical computing circuits use ANDs, ORs, NOTs and XORs (binary gates), on which users build up higher-level instructions and then support for languages like Java and Python, quantum computers use different kinds of gates, such as CNOTs and Hadamards.
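A minimal sketch of what those gates do, using plain Python matrices rather than any quantum SDK: applying a Hadamard to the first qubit of |00> and then a CNOT yields the textbook entangled Bell state.

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]          # Hadamard gate
I2 = [[1, 0], [0, 1]]          # single-qubit identity
# CNOT with qubit 0 as control (basis order |00>, |01>, |10>, |11>)
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

def kron(a, b):
    """Kronecker product, used to lift a 1-qubit gate onto 2 qubits."""
    return [[a[i][j] * b[k][l]
             for j in range(len(a[0])) for l in range(len(b[0]))]
            for i in range(len(a)) for k in range(len(b))]

def matvec(m, v):
    """Apply a gate matrix to a state vector."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

state = [1, 0, 0, 0]                 # start in |00>
state = matvec(kron(H, I2), state)   # Hadamard on qubit 0
state = matvec(CNOT, state)          # entangle the two qubits
# state is now (|00> + |11>) / sqrt(2), the Bell state
```

Measuring this state returns 00 or 11 with equal probability and never 01 or 10, which is the correlation that makes entanglement computationally useful.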

For quantum computing to work effectively, calculations need to remain in superposition for the duration of the computational cycle. But they can easily be thrown off by noise (the central obstacle to building large-scale quantum computers), which can stem from diverse sources, including disturbances in Earth's magnetic field, local radiation, cosmic rays, or the influence that qubits exert on each other through proximity.

This is in part tackled physically: signals for configuring and programming a quantum computer come from outside the machine and travel down coaxial cables, where they are amplified and filtered, eventually reaching the quantum device, with its qubits at ~0.015 K (-273.135 degrees C). Noise is tackled by minimising the exposure of the chips and cables to heat and electromagnetic radiation in all its forms, by minimising device defects, by constantly improving the performance of the electronics, and by using all sorts of novel mathematical schemes to compensate for it.

The Stack reviewed the three new releases and associated academic papers for our readers, to distil precisely what IBM has achieved and aims to achieve, as Dario Gil, IBM SVP and director of research, pledged on December 4 to "further increase the quality of a utility-scale quantum technology stack."

At the heart of the IBM Quantum System Two, a new modular quantum computer and cornerstone of IBM's quantum-centric supercomputing architecture, is the new 133-qubit Quantum Heron processor. (This summer's quantum achievements highlighted above were made on IBM's previous generation of processor, its Quantum Eagle.)

The Quantum Heron offers a five-fold improvement over the previous records set by IBM's Eagle when it comes to reducing errors, IBM said. It is making the new chips available to users today via the cloud, with more of the chips to join a utility-scale fleet of systems over the next year.

"Featuring 133 fixed-frequency qubits with tunable couplers, Heron yields a 3-5x improvement in device performance over its 127-qubit Eagle processors, and virtually eliminates cross-talk," IBM's Gil said, adding: "we have developed a qubit and the gate technology that we're confident will form the foundation of our hardware roadmap going forward."

(A coupler helps determine the performance of a superconducting quantum computer. Tunable couplers link qubits and perform quantum computations by turning on and off the coupling between them.)

The chip is built with what's known as a heavy-hex processor architecture, in which each unit cell of the lattice consists of a hexagonal arrangement of qubits, with an additional qubit on each edge.

As analyst Paul Smith-Goodson notes: "The Heron's modular architecture is different from previous quantum processor architectures."

The new architecture connects quantum processors to a common control infrastructure so that data can flow classically and in real time between the QPU and other chips in a multi-chip environment.

"It also uses a new multi-qubit gate scheme that is both faster and provides higher fidelity. The Heron is the first IBM chip to use the new architecture, which allows multiple processors to be linked using classical couplers to permit classical parallelization," he added.

The new modular IBM Quantum System Two meanwhile combines what Big Blue described as scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics. As the building block for IBM's quantum computing roadmap, it will house IBM's future generations of quantum processors and be accessible via the cloud.

The system gets updated middleware too, and after six years of development, IBM is gearing up for the release of Qiskit 1.0 early in Q1 2024. (Qiskit is an open-source SDK with extensive documentation for both the hardware and software layers, for working with quantum computers at the level of circuits, pulses, and algorithms; it ships with several domain-specific application APIs on top of its core module.)

IBM touted what it described as a stable Qiskit focused on programming with Patterns; a new set of AI-powered tools to help write and optimize Qiskit and QASM3 code; and the beta release of Quantum Serverless on the IBM Quantum Platform, to facilitate remote execution of Qiskit Patterns in a quantum-function style. Let's unpack this quantum verbiage!

A stable Qiskit is self-explanatory: after six years as a core SDK, Qiskit has become what IBM describes as the lingua franca of quantum computing, allowing programmers to write circuits and then execute them on hardware from more than eight different hardware manufacturers.

The 1.0 release adds stability and major improvements in the memory footprint of circuits (a claimed 55% decrease in memory usage compared to summer 2022's Qiskit 0.39, for example), among other improvements.

Qiskit Patterns, meanwhile, are a collection of tools to map classical problems, optimize them into quantum circuits using Qiskit, execute those circuits using Qiskit Runtime, and then postprocess the results. The release of a serverless execution option means users won't have to sit and wait over a stable network whilst a job is queued and executed; they can punt it out for managed execution, leave, and come back when the results are ready. Combined, IBM thinks these will democratise access to quantum computing and mean end-users do not "[need to] be fluent in quantum circuits to utilize quantum computing."
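The four Patterns steps (map, optimize, execute, postprocess) can be sketched schematically. Every function below is a hypothetical stand-in to show the shape of the pipeline, not part of the real Qiskit SDK, and the counts are made-up data.

```python
# Schematic of the four-step pattern with hypothetical stub functions.
def map_problem(problem: str) -> dict:
    """Step 1: map a classical problem onto a quantum circuit."""
    return {"circuit": f"circuit_for_{problem}"}

def optimize(job: dict) -> dict:
    """Step 2: transpile/optimize the circuit for target hardware."""
    job["optimized"] = True
    return job

def execute(job: dict) -> dict:
    """Step 3: run the circuit (here a stub returning fake shot counts)."""
    return {"counts": {"00": 512, "11": 512}}

def postprocess(result: dict) -> dict:
    """Step 4: turn raw shot counts into estimated probabilities."""
    total = sum(result["counts"].values())
    return {k: v / total for k, v in result["counts"].items()}

answer = postprocess(execute(optimize(map_problem("max_cut"))))
```

In the serverless model described above, step 3 is the part handed off for managed remote execution while the user's classical code handles steps 1, 2 and 4.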

Quantum computing, of course, is not immune to the allure of LLMs, and IBM is also shipping a generative AI code assistant called Qiskit Code Assistant. Based on IBM's 20-billion-parameter Granite code model and trained on about 370 million text tokens drawn from a large collection of Qiskit examples, it is designed to remove some of the heavy lifting for programmers as they explore the suite of new tools.

Qubits, meanwhile, remain some distance from being the go-to solution for traditional computational problems, but IBM has been, and continues to be, a genuine trail-blazer in the quantum computing space and, as this summer's research showed, is making significant progress. A tipping point will arrive, and then the world will likely change. Those interested in exploring the shape of things to come could do worse than start with Qiskit.

Read the original:
New quantum chip, modular computer and SDK revealed by IBM - The Stack

Perceval Tech note: Introducing shots in a linear optic quantum computing framework – Medium

Article by Raksha Singla

The intrinsic characteristics of quantum computer hardware introduce a probabilistic element to their behavior. You've likely encountered this explanation repeatedly; it essentially signifies that when an input state is processed by a quantum circuit, the resulting output state is well defined by the state-evolution formula, but its measurement is probabilistic. Instead of getting direct access to the result of the calculation, as one would expect in classical computing, one obtains a single measurement of this output state, with the specific outcome following a probability distribution derived from the quantum state through what is known as the Born rule. For a software developer, this feature can be seen as a major bottleneck compared to classical computation; on the other hand, it is precisely what gives the developer indirect access to the huge computational space of quantum states, marking a revolutionary shift in the computing landscape and enabling the solution of problems that are hard for classical computing.

A shot represents one execution of a quantum circuit and the corresponding data collected by the hardware at the output during that single run. Given the probabilistic nature of the system, conducting multiple iterations (obtaining many shots) is necessary to gather data for statistical analysis of the algorithm's operation.
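The idea can be sketched in plain Python: repeatedly sampling a hypothetical, hard-coded output distribution stands in for executing the circuit many times, and the accumulated counts approximate the underlying Born-rule probabilities.

```python
import random
from collections import Counter

# Hypothetical output distribution of some circuit; on real hardware these
# probabilities are unknown and are what the shots are used to estimate.
distribution = {"00": 0.5, "11": 0.5}

def run_shots(dist, shots, seed=0):
    """Simulate `shots` circuit executions by sampling the distribution."""
    rng = random.Random(seed)
    outcomes = rng.choices(list(dist), weights=list(dist.values()), k=shots)
    return Counter(outcomes)

counts = run_shots(distribution, 1000)
# counts["00"] / 1000 converges toward 0.5 as the shot count grows
```

The more shots collected, the tighter the statistical estimate of each outcome's probability, which is why shot count is the natural resource unit for a quantum job.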

The concept of shots is embraced by most quantum providers and forms an essential element of any quantum algorithm. Its implementation and management can vary among frameworks, depending on the specific characteristics of the hardware system and the compute space.

How do we define shots at Quandela?

Our computing architecture works with linear optical elements forming the quantum circuit, which performs the processing on states of photons (the input source) in Fock space (for an explanation of Fock space, see here). Detection of one or more photons by the detectors at the output defines a single execution of a quantum experiment: an act of computing corresponding to the processing of data encoded in the output photons.

Our source is a single-photon source emitting photons at a fixed rate into the chip implementing the circuit designed by the user. These photons may undergo absorption at various points within the hardware. Nevertheless, whenever at least one photon is detected at the optical chip's output (termed a photon-coincidence event), it marks the end of a single execution, and the measured output constitutes a data sample. The count of such occurrences while running an algorithm is our shot count.

A user may not necessarily want to sample single-photon detections; they may specifically want samples with a certain number N (>1) of photon coincidences and request these as the output. In such cases, the system may need to be run for a number of shots exceeding the requested number of samples, since not every shot yields the anticipated multi-photon coincidence. Recognizing this user preference, we have incorporated a tool in Perceval to estimate the necessary number of shots based on the user's desired sample count. This tool conducts the estimation by considering the unitary of the implemented circuit, the input state, and the performance characteristics of the device.
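A rough, first-order sketch of such an estimate (not Perceval's actual API): if each shot produces the desired N-photon coincidence with probability p, collecting a target number of samples takes roughly samples / p shots. The probability value below is a made-up illustration; Perceval's real tool derives it from the circuit unitary, the input state and device performance.

```python
import math

def estimate_shots(samples_wanted: int, p_coincidence: float) -> int:
    """Expected number of shots needed to collect `samples_wanted`
    N-photon coincidence events, given the per-shot probability of
    such an event. A hypothetical stand-in for Perceval's estimator."""
    if not 0 < p_coincidence <= 1:
        raise ValueError("coincidence probability must be in (0, 1]")
    return math.ceil(samples_wanted / p_coincidence)

# e.g. 10,000 two-photon samples at an assumed 2% coincidence rate
# requires on the order of 500,000 shots:
shots = estimate_shots(10_000, 0.02)
```

This is only an expectation; the actual number of shots consumed in a given run fluctuates around it.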

How Shots Will Revolutionize Our Users' Experience

Access to Quantum State Distribution:

Since shots are defined as the output detected during each circuit execution, they offer direct access to the probability distribution of a quantum state.

Predictable output rate:

In a photonic system characterized by instantaneous gate applications and complex input-state preparation timing (see detail on demultiplexing here), the time at which a shot is captured, marking the end of a single execution, varies with the input-state timing sequence, the configured circuit, and the system's transmittance. Working with shots guarantees a predictable output rate that is independent of these fluctuations.

Simplified User Interactions:

The incorporation of shots not only standardizes how users run algorithms on our Quantum Processing Unit (QPU) through our cloud services, but also gives them a more standardized parameter for understanding their resource needs, contributing to a clearer and more consistent measure of usage.

Predictability for Time and Cost:

Shots, being highly predictable, offer the most reliable means to estimate the time and cost of running an algorithm.

This stability in the counted parameter results in fixed pricing, ensuring fairness to users and independence from variability in the physical QPU device's performance.

The rest is here:
Perceval Tech note: Introducing shots in a linear optic quantum computing framework - Medium

Quantum computers: IBM reveals Condor and Heron quantum CPUs – Notebookcheck.net

The Heron QPU with 133 qubits (Source: IBM Research/Flickr)

IBM recently presented two new quantum processors (QPUs) at the Quantum Summit 2023: the Condor QPU with 1121 qubits and the smaller Heron QPU with 133 qubits. IBM also unveiled the modular System Two quantum supercomputer.

Just a year ago, technology giant IBM introduced the quantum processor Osprey, which runs at 433 qubits. This has now been succeeded by the IBM Condor, which clocks in at an impressive 1,121 qubits. While the new Quantum Processing Unit (QPU) from IBM is forced to concede defeat to Atom Computing with 1,180 qubits, it has nonetheless managed to increase qubit density by more than 50% compared to its in-house predecessor.

At first glance, the second QPU, which IBM presented on December 4 at the Quantum Summit 2023, appears less groundbreaking: the Heron quantum processor has 133 qubits and succeeds its Eagle predecessor, which had 127 qubits. While the IBM Condor QPU is primarily used to research how many qubits can fit on a quantum processor, the IBM Quantum Heron will be used in the modular quantum computer System Two.

IBM's first System Two, with three Heron QPUs, is reportedly already in operation at IBM's Yorktown Heights research site in the USA. Although the Heron quantum processor only offers 6 qubits more than its Eagle predecessor, the new 133-qubit QPU is said to deliver a three- to fivefold increase in performance over the Eagle QPU because quantum crosstalk has been virtually eliminated.

The IBM Quantum System Two in Yorktown Heights is 22 feet wide and 12 feet high, which is approximately 6.7 by 3.7 meters. This first System Two contains three Heron QPUs and computes in a near-perfect vacuum at temperatures lower than in space, namely below -270 degrees Celsius. This means that System Two presumably operates under conditions close to absolute zero at -273.15 degrees Celsius.



Read this article:
Quantum computers: IBM reveals Condor and Heron quantum CPUs - Notebookcheck.net