Archive for the ‘Quantum Computer’ Category

Perceval Tech note: Introducing shots in a linear optic quantum computing framework – Medium

Article by Raksha Singla

The intrinsic characteristics of quantum computer hardware introduce a probabilistic element to their behavior. You've likely encountered this explanation repeatedly, and it essentially means that when an input state is processed by a quantum circuit, the resulting output state is well defined by the state-evolution formula, but its measurement is probabilistic. Instead of getting direct access to the result of the calculation, as one would expect in classical computing, one obtains a single measurement of this output state, with the specific outcome following a probability distribution derived from the quantum state through what is known as the Born rule. For a software developer, this feature can be seen as a major bottleneck compared to classical computation; on the other hand, it gives the developer indirect access to the huge computation space of quantum states, marking a revolutionary shift in the computing landscape and enabling the solution of problems that are hard for classical computers.

A shot represents a single execution of a quantum circuit and the corresponding data collected by the hardware at the output during that run. Given the probabilistic nature of the system, many iterations (obtaining many shots) are necessary to gather enough data for a statistical analysis of the algorithm's operation.
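As a minimal illustrative sketch (not Quandela's implementation), the relationship between shots and the Born rule can be emulated classically: each shot is one draw from the distribution given by the squared magnitudes of the output-state amplitudes. The amplitudes below are made-up values for a hypothetical two-mode circuit.

```python
import random

# Hypothetical output-state amplitudes (illustrative values only).
amplitudes = {"|1,0>": complex(0.6, 0.0), "|0,1>": complex(0.0, 0.8)}

# Born rule: the probability of each outcome is |amplitude|^2.
probabilities = {state: abs(a) ** 2 for state, a in amplitudes.items()}

def run_shots(n_shots, seed=None):
    """Draw n_shots independent measurements from the Born-rule distribution."""
    rng = random.Random(seed)
    states = list(probabilities)
    weights = [probabilities[s] for s in states]
    return rng.choices(states, weights=weights, k=n_shots)

counts = {}
for outcome in run_shots(10_000, seed=0):
    counts[outcome] = counts.get(outcome, 0) + 1
# With many shots, empirical frequencies approach |amplitude|^2.
```

Any single shot reveals only one outcome; it is the accumulation of many shots that reconstructs the distribution, which is exactly why shot counts matter for statistical analysis.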

The concept of shots is universally embraced by most quantum providers and forms an essential element in any quantum algorithm. Its implementation and management can vary among different frameworks, depending on the specific characteristics of the hardware system and the compute space.

How do we define shots at Quandela?

Our computing architecture uses linear optical elements to form the quantum circuit, which processes states of photons (the input source) in the Fock space (for an explanation of the Fock space, see here). Detection of one or more photons by the detectors at the output defines a single execution of a quantum experiment, an act of computing corresponding to the processing of the data encoded into the output photons.

Our source is a single-photon source emitting photons at a fixed rate into the chip implementing the circuit designed by a user. These photons may be absorbed at various points within the hardware. Nevertheless, whenever at least one photon is detected at the optical chip's output (termed a photon-coincidence event), it marks the end of a single execution, and the measured output constitutes a data sample. The count of such occurrences while running an algorithm is our shot count.

A user may not necessarily want to sample single-photon detections; they may specifically want samples with a certain number N (>1) of photon coincidences and request these as the output. In such cases, the system may need to run for a number of shots exceeding the requested number of samples, since not every shot produces a multi-photon coincidence. Recognizing this user preference, we have incorporated a tool in Perceval to estimate the necessary number of shots from the user's desired sample count. The tool performs this estimation from the unitary of the implemented circuit, the input state, and the performance characteristics of the device.
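The idea behind such an estimator can be sketched in a deliberately simplified form (this is a hypothetical model, not Perceval's actual code): if each shot yields the requested N-photon coincidence with probability p, then collecting M samples takes roughly M/p shots on average.

```python
import math

def estimate_required_shots(requested_samples, p_coincidence):
    """Estimate the shots needed to collect `requested_samples` N-photon
    coincidence events, given that each shot produces such an event with
    probability `p_coincidence`.

    Simplified model: Perceval's actual estimator derives this probability
    from the circuit unitary, the input state, and device performance.
    """
    if not 0 < p_coincidence <= 1:
        raise ValueError("p_coincidence must be in (0, 1]")
    return math.ceil(requested_samples / p_coincidence)

# e.g. if a 2-photon coincidence occurs in 5% of shots,
# collecting 1,000 samples takes about 20,000 shots.
shots = estimate_required_shots(1_000, 0.05)
```

The real estimate folds circuit transmittance and source performance into p, but the inverse-probability scaling is the core of why requested samples and required shots can differ substantially.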

How Will Shots Revolutionize Our Users' Experience?

Access to Quantum State Distribution:

Since shots are defined as the output detected during each circuit execution, they offer direct access to the probability distribution of a quantum state.

Predictable output rate:

In a photonic system characterized by instantaneous gate applications and complex input-state preparation timing (see details on demultiplexing here), the time needed to capture a shot, which marks the end of a single execution, varies with this input-state timing sequence, the configured circuit, and the system's transmittance. Working with shots guarantees a predictable output rate independent of these fluctuations.

Simplified User Interactions:

The incorporation of shots not only standardizes user interactions when running algorithms on our Quantum Processing Unit (QPU) through our cloud services but also provides a more standardized parameter for understanding resource needs, contributing to a clearer and more consistent measure of usage.

Predictability for Time and Cost:

Shots, being highly predictable, offer the most reliable means to estimate the time and cost of running an algorithm.

This stability in parameter counting results in fixed pricing, ensuring fairness to users and independence from the variability of the performance of the physical QPU device.


Quantum computers: IBM reveals Condor and Heron quantum CPUs – Notebookcheck.net

The Heron QPU with 133 qubits (Source: IBM Research/Flickr)

IBM recently presented two new quantum processors (QPUs) at the Quantum Summit 2023: the Condor QPU with 1121 qubits and the smaller Heron QPU with 133 qubits. IBM also unveiled the modular System Two quantum supercomputer.

Just a year ago, technology giant IBM introduced the quantum processor Osprey, which runs at 433 qubits. This has now been succeeded by the IBM Condor, which clocks in at an impressive 1121 qubits. While the new Quantum Processing Unit (QPU) from IBM is forced to concede defeat to Atom Computing with 1180 qubits, it has nonetheless managed to increase qubit density by more than 50% compared to its in-house predecessor.

At first glance, the second QPU, which IBM presented on December 4 at the Quantum Summit 2023, appears less groundbreaking: the Heron quantum processor has 133 qubits and succeeds its Eagle predecessor, which had 127 qubits. While the IBM Condor QPU is primarily used to research how many qubits can fit on a quantum processor, the IBM Quantum Heron will be used in the modular quantum computer System Two.

IBM's first System Two, with three Heron QPUs, is reportedly already in operation at IBM's Yorktown Heights research site in the USA. Although the Heron quantum processor only offers 6 qubits more than its Eagle predecessor, the new 133-qubit QPU is said to deliver a three- to fivefold increase in performance over the Eagle QPU because quantum crosstalk has been virtually eliminated.

The IBM Quantum System Two in Yorktown Heights is 22 feet wide and 12 feet high, which is approximately 6.7 by 3.7 meters. This first System Two contains three Heron QPUs and computes in a near-perfect vacuum at temperatures lower than in space, namely below -270 degrees Celsius. This means that System Two presumably operates under conditions close to absolute zero at -273.15 degrees Celsius.




A Physicist Reveals the One Quantum Breakthrough That Could Disrupt Scientific Innovation – Inverse

Quantum advantage is the milestone the field of quantum computing is fervently working toward, where a quantum computer can solve problems that are beyond the reach of the most powerful non-quantum or classical computers.

Quantum refers to the scale of atoms and molecules where the laws of physics as we experience them break down, and a different, counterintuitive set of laws apply. Quantum computers take advantage of these strange behaviors to solve problems.

There are some types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms. Research in recent decades has shown that quantum computers have the potential to solve some of these problems. If a quantum computer can be built that actually does solve one of these problems, it will have demonstrated quantum advantage.

I am a physicist who studies quantum information processing and the control of quantum systems. I believe that this frontier of scientific and technological innovation not only promises groundbreaking advances in computation but also represents a broader surge in quantum technology, including significant advancements in quantum cryptography and quantum sensing.

Central to quantum computing is the quantum bit, or qubit. Unlike classical bits, which can only be in the states 0 or 1, a qubit can be in any state that is some combination of 0 and 1. This state of neither just 1 nor just 0 is known as a quantum superposition. With every additional qubit, the number of states the qubits can represent doubles.
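The doubling claim can be made concrete: describing the state of n qubits requires 2^n complex amplitudes, so each added qubit doubles the size of the state vector. A minimal sketch:

```python
# The state of n qubits is described by 2**n complex amplitudes:
# each added qubit doubles the size of the state vector.
def state_vector_size(n_qubits):
    return 2 ** n_qubits

sizes = [state_vector_size(n) for n in range(1, 6)]
# 1 qubit -> 2 amplitudes, 2 -> 4, 3 -> 8, 4 -> 16, 5 -> 32
```

At 50 qubits the vector already has over 10^15 entries, which is why classical simulation of even modest quantum computers becomes intractable.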

This property is often mistaken for the source of the power of quantum computing. Instead, it comes down to an intricate interplay of superposition, interference, and entanglement.

Interference involves manipulating qubits so that their states combine constructively during computations to amplify correct solutions and destructively suppress the wrong answers. Constructive interference is what happens when the peaks of two waves like sound waves or ocean waves combine to create a higher peak. Destructive interference is what happens when a wave peak and a wave trough combine and cancel each other out. Quantum algorithms, which are few and difficult to devise, set up a sequence of interference patterns that yield the correct answer to a problem.
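A worked single-qubit illustration of this cancellation (a sketch, not any particular quantum algorithm): applying the Hadamard operation twice to a qubit starting in |0> returns it exactly to |0>, because the two paths to |1> interfere destructively while the paths to |0> add constructively.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

state = [1.0, 0.0]       # start in |0>
state = hadamard(state)  # equal superposition: amplitudes [0.707..., 0.707...]
state = hadamard(state)  # the two contributions to |1> cancel (destructive
                         # interference); those to |0> reinforce
# state is back to [1.0, 0.0], up to floating-point rounding
```

Quantum algorithms engineer much larger versions of this effect, so that the amplitudes of wrong answers cancel and those of correct answers are amplified before measurement.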

Entanglement establishes a uniquely quantum correlation between qubits: the state of one cannot be described independently of the others, no matter how far apart the qubits are. This is what Albert Einstein famously dismissed as "spooky action at a distance." Entanglement's collective behavior, orchestrated through a quantum computer, enables computational speed-ups that are beyond the reach of classical computers.

Quantum computing has a range of potential uses where it can outperform classical computers. In cryptography, quantum computers pose both an opportunity and a challenge. Most famously, they have the potential to decipher current encryption algorithms, such as the widely used RSA scheme.

One consequence of this is that today's encryption protocols need to be re-engineered to be resistant to future quantum attacks. This recognition has led to the burgeoning field of post-quantum cryptography. After a long process, the National Institute of Standards and Technology recently selected four quantum-resistant algorithms and has begun the process of readying them so that organizations around the world can use them in their encryption technology.

In addition, quantum computing can dramatically speed up quantum simulation: the ability to predict the outcome of experiments operating in the quantum realm. Famed physicist Richard Feynman envisioned this possibility more than 40 years ago. Quantum simulation offers the potential for considerable advancements in chemistry and materials science, aiding in areas such as the intricate modeling of molecular structures for drug discovery and enabling the discovery or creation of materials with novel properties.

Another use of quantum information technology is quantum sensing: detecting and measuring physical properties like electromagnetic energy, gravity, pressure, and temperature with greater sensitivity and precision than non-quantum instruments. Quantum sensing has myriad applications in fields such as environmental monitoring, geological exploration, medical imaging, and surveillance.

Initiatives such as the development of a quantum internet that interconnects quantum computers are crucial steps toward bridging the quantum and classical computing worlds. This network could be secured using quantum cryptographic protocols such as quantum key distribution, which enables ultra-secure communication channels that are protected against computational attacks including those using quantum computers.

Despite a growing application suite for quantum computing, developing new algorithms that make full use of the quantum advantage, in particular in machine learning, remains a critical area of ongoing research.

The quantum computing field faces significant hurdles in hardware and software development. Quantum computers are highly sensitive to any unintentional interactions with their environments. This leads to the phenomenon of decoherence, where qubits rapidly degrade to the 0 or 1 states of classical bits.

Building large-scale quantum computing systems capable of delivering on the promise of quantum speed-ups requires overcoming decoherence. The key is developing effective methods of suppressing and correcting quantum errors, an area my own research is focused on.

In navigating these challenges, numerous quantum hardware and software startups have emerged alongside well-established technology industry players like Google and IBM. This industry interest, combined with significant investment from governments worldwide, underscores a collective recognition of quantum technologys transformative potential. These initiatives foster a rich ecosystem where academia and industry collaborate, accelerating progress in the field.

Quantum computing may one day be as disruptive as the arrival of generative AI. Currently, the development of quantum computing technology is at a crucial juncture. On the one hand, the field has already shown early signs of having achieved a narrowly specialized quantum advantage. Researchers at Google and later a team of researchers in China demonstrated a quantum advantage for generating a list of random numbers with certain properties. My research team demonstrated a quantum speed-up for a random number guessing game.

On the other hand, there is a tangible risk of entering a quantum winter, a period of reduced investment if practical results fail to materialize in the near term.

While the technology industry is working to deliver quantum advantage in products and services in the near term, academic research remains focused on investigating the fundamental principles underpinning this new science and technology. This ongoing basic research, fueled by enthusiastic cadres of new and bright students of the type I encounter almost every day, ensures that the field will continue to progress.

This article was originally published on The Conversation by Daniel Lidar at the University of Southern California. Read the original article here.


The Quantum Computing Revolution on 60 Minutes – Daily Kos

On CBS' 60 Minutes tonight, there was a fascinating look at the quantum computing revolution currently being developed at places like Google, IBM, Microsoft, and Honeywell. Unlike conventional computers using transistors, in which each bit of information is binary (0 or 1), quantum computers use superconducting qubits at a temperature near absolute zero (-460F), in which individual electrons can store information over an infinite range of values; linking these qubits permits an exponential increase in computing power, rather than just a linear increase with transistors. For example, while we can now produce computer chips containing over a trillion transistors, an equivalent quantum computing power can be obtained with just 40 qubits linked together (2^40).
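The 2^40 figure quoted above is easy to check directly:

```python
# 2**40 is roughly 1.1 trillion, comparable to the "over a trillion
# transistors" figure cited for classical chips.
n_states = 2 ** 40
# n_states == 1_099_511_627_776 (~1.1e12)
```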

The main technological challenge is maintaining quantum coherence between these qubits, but tomorrow IBM will be unveiling its Quantum System Two, which has three times the number of qubits as its first prototype. Some of the key exchanges from the 60 Minutes piece:

Mitigating those errors and extending coherence time while scaling up to larger machines are the challenges facing German-American scientist Hartmut Neven, who founded Google's lab, and its casual style, in 2012.

Scott Pelley:Can the problems that are in the way of quantum computing be solved?

Hartmut Neven: I should confess, my subtitle here is chief optimist. After having said this, I would say at this point, we don't need any more fundamental breakthroughs. We need little improvements here and there. If we have all the pieces together, we just need to integrate them well to build larger and larger systems.

Scott Pelley: And you think that all of this will be integrated into a system in what period of time?

Hartmut Neven: Yeah. We often say we wanna do it by the end of the decade so that we can use this Kennedy quote, "Get it done by the end of the decade."

Scott Pelley: The end of this decade?

Hartmut Neven: Yes.

IBM's Dario Gil told us System Two has the room to expand to thousands of qubits.

Scott Pelley: What are the chances that this is one of those things that's gonna be ready in five years and always will be?

Dario Gil: We don't see an obstacle right now that would prevent us from building systems that will have tens of thousands and even a 100 thousand qubits working with each other. So we are highly confident that we will get there.

This is a truly mind-blowing look at the future of computing, and the implications are absolutely staggering. For more background on this subject, I would recommend Michio Kaku's Quantum Supremacy.


The Quantum Computing Threat to Encryption and Cybersecurity – Medium

Photo by Fractal Hassan on Unsplash

Quantum computing is an incredibly promising innovation but it also jeopardizes current data protection methods. This emerging field requires an urgent collaborative response to safeguard privacy.

Quantum computers leverage quantum mechanics principles like superposition and entanglement to perform calculations exponentially faster than regular machines for certain tasks.

Through parallel computation on a massive scale, they hold huge promise for challenges from chemical simulations to machine learning.

Global tech giants like IBM and emerging startups have pioneered early but extremely powerful prototypes. However, unlocking the immense potential of these machines also necessitates upgrading crucial cybersecurity foundations built in a pre-quantum age.

Encryption protocols most of the digital world relies on remain dangerously exposed. As quantum hardware continues rapid advances, failure to future-proof security risks compromising privacy on an unprecedented scale.

A world with advanced quantum computers puts all current encrypted data at risk of interception and misuse. No existing encryption method would remain reliably secure.

Pretty much all sensitive data transmitted online - from financial records to government secrets and personal emails - depends on mathematical encryption techniques to prevent interception.

The most common public-key encryption schemes used today, like RSA, ECC, and Diffie-Hellman, base their security on problems that are extremely difficult for regular computers, such as factoring the product of two very large prime numbers. This makes encryption easy, multiplying two large primes, while making decryption essentially impossible through brute computational force.

However, quantum computers can run algorithms like Shor's to quickly factor these large numbers and break the encryption. Where even the most advanced supercomputer would take millennia, a powerful enough quantum computer could unravel the security on such data in minutes.
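A toy example with deliberately tiny primes (totally insecure, for illustration only) shows why RSA's security rests on factoring: once the modulus is factored, the private key follows immediately. Real RSA moduli are thousands of bits, far beyond the brute-force loop below for any classical machine, but that factoring step is exactly what Shor's algorithm performs efficiently on a large enough quantum computer.

```python
# Toy RSA with tiny primes (insecure; for illustration only).
p, q = 61, 53
n = p * q                   # public modulus: 3233
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent
d = pow(e, -1, phi)         # private exponent (modular inverse of e mod phi)

message = 65
ciphertext = pow(message, e, n)
decrypted = pow(ciphertext, d, n)   # recovers the original message

# An attacker who can factor n recovers the private key:
def factor(n):
    """Brute-force factoring; feasible only for tiny n. Shor's algorithm
    performs this step efficiently on a quantum computer."""
    for f in range(2, int(n ** 0.5) + 1):
        if n % f == 0:
            return f, n // f
    raise ValueError("n is prime")

pf, qf = factor(n)
d_recovered = pow(e, -1, (pf - 1) * (qf - 1))   # identical to d
```

With the private exponent recovered, every message encrypted under that key can be read, which is the scenario the article warns about for data harvested today and decrypted later.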
