Archive for the ‘Quantum Computer’ Category

IBM wants to build a 100,000-qubit quantum computer in 10 years – Fudzilla

IBM wants to build a 100,000-qubit quantum computing machine within the next 10 years, according to MIT Technology Review.

Biggish Blue has managed to build a 433-qubit machine so far, making it the world leader. IBM announced the move at the G7 summit in Hiroshima, Japan. The company will partner with the University of Tokyo and the University of Chicago in a 100-million-dollar initiative to push quantum computing into full-scale operation, where the technology could tackle pressing problems that no standard supercomputer can solve.

The 100,000-qubit machine will work alongside supercomputers to achieve breakthroughs in drug discovery, fertiliser production, battery performance, and other applications. It is unclear where potentially dead-or-alive cats will be involved.

IBM's VP of quantum, Jay Gambetta, told MIT Technology Review that IBM has already done proof-of-principle experiments showing that integrated circuits based on "complementary metal oxide semiconductor" (CMOS) technology can be installed next to the cold qubits to control them with just tens of milliwatts.

However, beyond that, he admits, the technology required for quantum-centric supercomputing does not yet exist: that is why academic research is a vital part of the project.

The qubits will exist on a modular chip that is only beginning to take shape in IBM labs.

Modularity, essential when it will be impossible to put enough qubits on a single chip, requires interconnects that transfer quantum information between modules.

IBM's "Kookaburra," a 1,386-qubit multichip processor with a quantum communication link, is under development and slated for release in 2025. Gambetta says that boffins at Tokyo and Chicago have already made significant strides in components and communication innovations that could be vital parts of the final product.

Gambetta thinks there will likely be many more industry-academic partnerships over the next decade.

Original post:
IBM wants to build a 100,000-qubit quantum computer in 10 years - Fudzilla

From self-driving cars to the military: quantum computing can help … – Cosmos

Muhammad Usman, CSIRO

Artificial intelligence algorithms are quickly becoming a part of everyday life. Many systems that require strong security are either already underpinned by machine learning or soon will be. These systems include facial recognition, banking, military targeting applications, and robots and autonomous vehicles, to name a few.

This raises an important question: how secure are these machine learning algorithms against malicious attacks?

In an article published in Nature Machine Intelligence, my colleagues at the University of Melbourne and I discuss a potential solution to the vulnerability of machine learning models.

We propose that the integration of quantum computing in these models could yield new algorithms with strong resilience against adversarial attacks.

Machine learning algorithms can be remarkably accurate and efficient for many tasks. They are particularly useful for classifying and identifying image features. However, they're also highly vulnerable to data manipulation attacks, which can pose serious security risks.

Data manipulation attacks, which involve very subtle manipulation of image data, can be launched in several ways. An attack may be launched by mixing corrupt data into a training dataset used to train an algorithm, leading it to learn things it shouldn't.

Manipulated data can also be injected during the testing phase (after training is complete), in cases where the AI system continues to train the underlying algorithms while in use.

People can even carry out such attacks from the physical world. Someone could put a sticker on a stop sign to fool a self-driving car's AI into identifying it as a speed-limit sign. Or, on the front lines, troops might wear uniforms that can fool AI-based drones into identifying them as landscape features.
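
The mechanics can be sketched with a toy model. The snippet below is illustrative only: a linear classifier with made-up weights stands in for a real image model, and shows how nudging every input feature slightly against the sign of the model's weights flips its prediction, which is the gradient-sign idea behind many adversarial attacks.

```python
import numpy as np

# Toy linear classifier standing in for an image model:
# score > 0 -> "stop sign", score <= 0 -> "speed-limit sign".
# Weights and input are hypothetical.
w = np.array([0.9, -1.2, 0.4, 0.7, -0.3])
x = np.array([1.0, -0.5, 0.2, 0.8, 0.1])   # correctly classified input

def predict(v):
    return "stop" if float(w @ v) > 0 else "speed-limit"

score = float(w @ x)            # 0.9 + 0.6 + 0.08 + 0.56 - 0.03 = 2.11
print(predict(x), round(score, 2))

# Gradient-sign perturbation: move every feature a little against the
# sign of the corresponding weight, i.e. in the direction that lowers
# the score fastest.
eps = 0.7
x_adv = x - eps * np.sign(w)
print(predict(x_adv))           # the prediction flips to "speed-limit"
```

Real attacks of this kind (such as FGSM) use the gradient of a deep network rather than raw weights, but the principle is the same: a small, structured perturbation pushes the input across the model's decision boundary.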

Either way, the consequences of data manipulation attacks can be severe. For example, if a self-driving car uses a machine learning algorithm that has been compromised, it may incorrectly predict there are no humans on the road when there are.

In our article, we describe how integrating quantum computing with machine learning could give rise to secure algorithms called quantum machine learning models.

These algorithms are carefully designed to exploit special quantum properties that would allow them to find specific patterns in image data that aren't easily manipulated. The result would be resilient algorithms that are safe against even powerful attacks. They also wouldn't require the expensive adversarial training currently used to teach algorithms how to resist such attacks.

Beyond this, quantum machine learning could allow for faster algorithmic training and more accuracy in learning features.

Today's classical computers work by storing and processing information as bits, or binary digits, the smallest unit of data a computer can process. In classical computers, which follow the laws of classical physics, bits are represented as binary numbers, specifically 0s and 1s.

Quantum computing, on the other hand, follows the principles of quantum physics. Information in quantum computers is stored and processed as qubits (quantum bits), which can exist as 0, 1, or a combination of both at once. A quantum system that exists in multiple states at once is said to be in a superposition state. Quantum computers can be used to design clever algorithms that exploit this property.
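
The amplitude picture behind superposition can be sketched in a few lines of linear algebra. This is a classical simulation of a single idealized qubit, not a real quantum device:

```python
import numpy as np

# A qubit's state is a 2-component complex vector: the amplitudes
# for the basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns a definite |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)   # [0.5 0.5] -- equally likely to read out 0 or 1
```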

However, while there are significant potential benefits in using quantum computing to secure machine learning models, it could also be a double-edged sword.

On one hand, quantum machine learning models will provide critical security for many sensitive applications. On the other, quantum computers could be used to generate powerful adversarial attacks, capable of easily deceiving even state-of-the-art conventional machine learning models.

Moving forward, we'll need to seriously consider the best ways to protect our systems; an adversary with access to early quantum computers would pose a significant security threat.

The current evidence suggests we're still some years away from quantum machine learning becoming a reality, due to limitations in the current generation of quantum processors.

Today's quantum computers are relatively small (with fewer than 500 qubits) and their error rates are high. Errors may arise for several reasons, including imperfect fabrication of qubits, errors in the control circuitry, or loss of information (called quantum decoherence) through interaction with the environment.

Still, we've seen enormous progress in quantum hardware and software over the past few years. According to recent quantum hardware roadmaps, it's anticipated that quantum devices made in the coming years will have hundreds to thousands of qubits.

These devices should be able to run powerful quantum machine learning models to help protect a large range of industries that rely on machine learning and AI tools.

Worldwide, governments and private sectors alike are increasing their investment in quantum technologies.

This month the Australian government launched the National Quantum Strategy, aimed at growing the nation's quantum industry and commercialising quantum technologies. According to the CSIRO, Australia's quantum industry could be worth about A$2.2 billion by 2030.

Muhammad Usman, Principal Research Scientist and Team Leader, CSIRO

This article is republished from The Conversation under a Creative Commons license. Read the original article.

View original post here:
From self-driving cars to the military: quantum computing can help ... - Cosmos

Reducing CNOT count in quantum Fourier transform for the linear … – Nature.com

Quantum algorithms are becoming important because of their accelerated processing speed over classical algorithms for solving complex problems [1,2,3,4,5]. However, using quantum algorithms to solve practical problems is difficult because quantum states are very susceptible to noise, which can cause critical errors in the execution of quantum algorithms. In other words, quantum errors caused by noise pose a major obstacle to the realization of quantum algorithms.

The quantum circuit model is a well-known model for quantum computation. In this model, quantum algorithms are represented by quantum circuits composed of qubits and gates. Since noise arises from the evolution of quantum states, gate operations are the major cause of noise. Therefore, quantum circuits should be designed with a minimal number of gates, especially in the noisy intermediate-scale quantum (NISQ) era [6,7].

Within the realm of quantum logic synthesis, quantum circuits are broken down into gates derived from a universal gate library. The basic gate library consists of CNOT and single-qubit gates [8,9]. Since CNOT gates are considered the main generators of quantum errors and have a longer execution time compared to single-qubit gates [10], CNOT gates are expected to dominate the cost of quantum circuits when using the basic gate library.

When considering the cost of a quantum circuit, connectivity between qubits should also be taken into account. This is because physical limitations in quantum hardware may force quantum circuits to adopt the nearest-neighbor (NN) architecture [10,11]. In the NN architecture, a qubit in the circuit interacts only with its adjacent qubits.

The quantum Fourier transform (QFT) is an essential tool for many quantum algorithms, such as quantum addition [12], quantum phase estimation (QPE) [13], quantum amplitude estimation (QAE) [3], the algorithm for solving linear systems of equations [4], and Shor's factoring algorithm [1], to name a few. Therefore, optimizing the cost of the QFT would improve the efficiency of all these algorithms.

There have been studies aimed at reducing circuit costs of the QFT [8,14,15,16,17,18,19,20,21,22]. Among them are studies related to the number of CNOT gates in the QFT, including the following:

When constructing an n-qubit QFT circuit using the basic gate library, n(n-1) CNOT gates are required, provided that qubit reordering is allowed [8]. Qubit reordering implies that the sequence of qubits can be altered before and after the execution of the circuit.

In Ref. [14], the authors incorporated n(n-1)/2 extra SWAP gates to develop an n-qubit linear nearest-neighbor (LNN) QFT circuit, which accommodates qubit reordering.

To synthesize a single SWAP gate using the basic gate library, three CNOT gates are required [8].

Consequently, the total number of CNOT gates required for the n-qubit LNN QFT circuit presented in Ref. [14] is 5n(n-1)/2.

By employing SWAP gates in the construction of LNN QFT circuits, the leading term of the CNOT-gate count increases by a factor of 2.5.
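
The three-CNOT cost of a SWAP quoted above can be checked directly by multiplying the gate matrices. A small numpy sketch, using basis ordering |00>, |01>, |10>, |11>:

```python
import numpy as np

# Two-qubit gates as 4x4 matrices (basis order |00>, |01>, |10>, |11>).
CNOT_01 = np.array([[1, 0, 0, 0],   # control = qubit 0, target = qubit 1
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
CNOT_10 = np.array([[1, 0, 0, 0],   # control = qubit 1, target = qubit 0
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])
SWAP = np.array([[1, 0, 0, 0],      # exchanges |01> and |10>
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# Three alternating CNOTs compose to a SWAP, which is why every SWAP
# in an LNN circuit costs three CNOT gates.
print(np.array_equal(CNOT_01 @ CNOT_10 @ CNOT_01, SWAP))  # True
```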

Previous research efforts have investigated techniques to minimize the number of SWAP gates required in the LNN architecture when assembling n-qubit LNN QFT circuits [15,16,17,18]. These studies aimed to optimize the circuit design and improve overall efficiency.

In this paper, we propose a new n-qubit LNN QFT circuit design that directly utilizes CNOT gates, unlike previous studies [14,15,16,17,18] that relied on SWAP gates. Our approach offers a significant advantage by synthesizing a more compact QFT circuit using CNOT gates instead of SWAP gates, since each SWAP gate requires three CNOT gates to implement. With qubit reordering, our n-qubit LNN QFT circuit requires n^2 + n - 4 CNOT gates, which is asymptotically 40% of the count in Ref. [14]. Furthermore, we demonstrate that our circuit design significantly reduces the number of CNOT gates compared to the best-known results for 5- to 10-qubit LNN QFT circuits [17,18].
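
As a quick sanity check of the counts quoted above, the two formulas can be compared numerically (the function names below are ours, not the paper's):

```python
def cnots_swap_based(n):
    # SWAP-based LNN QFT of Ref. [14], expressed in CNOT gates.
    return 5 * n * (n - 1) // 2

def cnots_proposed(n):
    # The proposed LNN QFT circuit.
    return n * n + n - 4

for n in (5, 10, 100, 1000):
    ratio = cnots_proposed(n) / cnots_swap_based(n)
    print(n, cnots_swap_based(n), cnots_proposed(n), round(ratio, 3))
# The ratio approaches 2/5 = 40% as n grows, matching the asymptotic claim.
```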

In the following analysis, we compare our QFT circuit with the conventional QFT circuit [8] when both are used as inputs for the Qiskit transpiler [23], which is required for implementation on IBM quantum computers that necessitate the NN architecture [10]. Our findings confirm that using our QFT circuit as input requires fewer CNOT gates than using the conventional QFT circuit. This indicates that our design could serve as a foundation for synthesizing QFT circuits compatible with the NN architecture, potentially leading to more efficient implementations.

Furthermore, we present experimental results from implementing QPE using 3-qubit QFTs on actual quantum hardware, specifically the IBM_Nairobi [10] and Rigetti Aspen-11 [11] systems. We also illustrate the decomposition of controlled-R_y gates that share a target qubit using our proposed method. This particular circuit is often found in QAE, which is anticipated to supplant classical Monte Carlo integration methods [24,25]. By providing these results, we aim to highlight the practicality and effectiveness of our approach in real-world quantum computing applications.

The remainder of this paper is organized as follows: in the Background section, we provide a brief overview of quantum circuits, QFT, QPE, and QAE. The Proposed approach section outlines our method for constructing LNN QFT circuits. In the Results and discussion section, we present the outcomes of transpilation on IBM quantum computers, display the experimental results of QPE executions on quantum hardware, and illustrate how to convert a circuit of controlled-R_y gates sharing the target qubit into an LNN circuit using our proposed method. We also address the limitations of our study and suggest potential future research directions. Finally, we conclude the paper with a summary of our findings and their implications for the field of quantum computing.

See the original post:
Reducing CNOT count in quantum Fourier transform for the linear ... - Nature.com

Australia-US compact to help keep up with the bad guys – Yahoo News Australia

A recently signed pact with the United States is expected to deliver benefits beyond clean energy or new mines by also boosting national security.

Company bosses and policymakers are still coming to grips with what the deal signed last weekend will mean.

But it's clear the Australia-US Climate, Critical Minerals, and Clean Energy Transformation Compact will pull the two industrial bases together against China's might.

"Nothing but goodness can come from that agreement," cyber security expert Tony Burnside told AAP.

"It makes us more secure."

The pact signalled an intent to deepen collaboration on the materials and know-how that are vital to defence supply chains and clean energy.

Mr Burnside, vice-president at Netskope, said the US was likely to experience certain attacks before anyone else, and information sharing would be crucial.

"In our industry, we collaborate with vendors for the same reason - they may see something, an attack on a computer before we do, or vice versa," he said.

Netskope is a global leader in cloud, data and network security and in managing the safe use of artificial intelligence and robotics.

"Many of us have been in the space for years," Mr Burnside said.

But with the spread of chatbot ChatGPT and Google's conversational version called Bard, change is accelerating and Australia must get on board.

"We've got to really keep up with the bad guys out there and, from a competitive standpoint, the other countries that are leveraging it," Mr Burnside said.

He said quantum computing will be key. It's so powerful that it will outmatch existing encryption standards now protecting systems and data.

"We can't rule out some states using quantum computing for cyber warfare in the upcoming years," he said.

"And it won't be too long before hackers also get access to quantum computing."

Quantum-resistant encryption standards are in the works, and the US government has already asked all of its agencies and their suppliers to use quantum-resistant encryption by 2035.

President Joe Biden will ask the US Congress to add Australia as a "domestic source" under the US Defense Production Act.

The designation means Australia will be the second country after Canada to be considered for the special status.

It will enable US industry to increase production and investment in Australia, including critical minerals and defence technologies, and allow local firms to apply for US funding.

"It's important to note, this would not give the US government any authority to direct Australian industry," a spokesperson for the Department of Prime Minister and Cabinet told AAP.

Minerals Council chief executive Tania Constable also wants Australia to make national security part of the critical minerals boom.

Competing with trillion-dollar tax incentives and funding in North America and Europe, Australia's $15 billion National Reconstruction Fund includes $1 billion for critical technologies and $1 billion for advanced manufacturing.

Australia's new critical technologies list is already out and has the stamp of approval from Netskope and other industry leaders for backing quantum computing and AI.

"Bringing in defence procurement is really important and the links that are made with other countries like the United States on those," Ms Constable told AAP.

"We need to think more broadly than just direct funding," she said.

"But there will be more federal budgets to come and hopefully the critical minerals strategy (due out soon) will set out a pathway and framework for those additional incentives."

Read more:
Australia-US compact to help keep up with the bad guys - Yahoo News Australia

Quantum computers: what are they good for? – Nature.com

Bavarian science minister Markus Blume views part of a quantum computer with Dieter Kranzlmüller (left) at the Leibniz Supercomputing Center. Credit: Sven Hoppe/dpa/Alamy

Most researchers have never seen a quantum computer. Winfried Hensinger has five. "They're all terrible," he says. "They can't do anything useful."

In fact, all quantum computers could be described as terrible. Decades of research have yet to yield a machine that can kick off the promised revolution in computing. But enthusiasts aren't concerned, and development is proceeding better than expected, researchers say.

"I'm not trying to take away from how much work there is to do, but we're surprising ourselves about how much we've done," says Jeannette Garcia, senior research manager for quantum applications and software at technology giant IBM in San Jose, California.

Hensinger, a physicist at the University of Sussex in Brighton, UK, published a proof of principle in February for a large-scale, modular quantum computer [1]. His start-up company, Universal Quantum in Haywards Heath, UK, is now working with engineering firm Rolls-Royce in London and others to begin the long and arduous process of building it.

If you believe the hype, computers that exploit the strange behaviours of the atomic realm could accelerate drug discovery, crack encryption, speed up decision-making in financial transactions, improve machine learning, develop revolutionary materials and even address climate change. The surprise is that those claims are now starting to seem a lot more plausible and perhaps even too conservative.

According to computational mathematician Steve Brierley, whatever the quantum sweet spot turns out to be, it could be more spectacular than anything we can imagine today if the field is given the time it needs. "The short-term hype is a bit high," says Brierley, who is founder and chief executive of quantum-computing firm Riverlane in Cambridge, UK. "But the long-term hype is nowhere near enough."

Until now, there has been good reason to be sceptical. Researchers have obtained only mathematical proofs that quantum computers will offer large gains over current, classical computers in simulating quantum physics and chemistry, and in breaking the public-key cryptosystems used to protect sensitive communications such as online financial transactions. "All of the other use cases that people talk about are either more marginal, more speculative, or both," says Scott Aaronson, a computer scientist at the University of Texas at Austin. Quantum specialists have yet to achieve anything truly useful that could not be done using classical computers.

The problem is compounded by the difficulty of building the hardware itself. Quantum computers store data in quantum binary digits called quantum bits, or qubits, which can be made using various technologies, including superconducting rings, optical traps and photons of light. Some technologies require cooling to near absolute zero; others operate at room temperature. Hensinger's blueprint is for a machine the size of a football pitch, but others could end up installed in cars. Researchers cannot even agree on how the performance of quantum computers should be measured.

Whatever the design, the clever stuff happens when qubits are carefully coaxed into superposition states of indefinite character: essentially a mix of digital ones and zeroes, rather than definitely being one or the other. Running algorithms on a quantum computer involves directing the evolution of these superposition states. The quantum rules of this evolution allow the qubits to interact to perform computations that are, in practical terms, impossible using classical computers.

That said, useful computations are possible only on quantum machines with a huge number of qubits, and those do not yet exist. What's more, qubits and their interactions must be robust against errors introduced through the effects of thermal vibrations, cosmic rays, electromagnetic interference and other sources of noise. These disturbances can cause some of the information necessary for the computation to leak out of the processor, a situation known as decoherence. That can mean dedicating a large proportion of the qubits to error-correction routines that keep a computation on track.
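
The redundancy idea behind error correction can be illustrated with its classical ancestor, the three-bit repetition code. This is a sketch with an assumed 10% flip probability; real quantum codes must also correct phase errors, and they measure parities rather than reading qubits directly:

```python
import random

# Encode one logical bit into three physical bits and correct by
# majority vote: a single flipped bit is outvoted by the other two.
random.seed(1)

def encode(bit):
    return [bit, bit, bit]

def noisy_channel(bits, p_flip=0.1):
    # Each bit independently flips with probability p_flip.
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)   # majority vote

trials = 10_000
raw_rate = sum(noisy_channel([0])[0] for _ in range(trials)) / trials
coded_rate = sum(decode(noisy_channel(encode(0))) for _ in range(trials)) / trials
print(raw_rate, coded_rate)
# The coded error rate is about 3p^2 - 2p^3 = 0.028, well below the raw 0.1:
# the price is three physical bits per logical bit, which is why error
# correction consumes a large share of a quantum processor's qubits.
```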

A circuit design for IBM's five-qubit superconducting quantum computer. Credit: IBM Research/SPL

This is where the scepticism about quantum computing begins. The world's largest quantum computer in terms of qubits is IBM's Osprey, which has 433. But even with 2 million qubits, some quantum chemistry calculations might take a century, according to a 2022 preprint [2] by researchers at Microsoft Quantum in Redmond, Washington, and ETH Zurich in Switzerland. Research published in 2021 by Craig Gidney at Google in Santa Barbara, California, and Martin Ekerå at the KTH Royal Institute of Technology in Stockholm estimates that breaking state-of-the-art cryptography in 8 hours would require 20 million qubits [3].

Yet such calculations also offer a source of optimism. Although 20 million qubits looks out of reach, it's a lot less than the one billion qubits of previous estimates [4]. And researcher Michael Beverland at Microsoft Quantum, who was first author of the 2022 preprint [2], thinks that some of the obstacles facing quantum chemistry calculations can be overcome through hardware breakthroughs.

For instance, Nicole Holzmann, who leads the applications and algorithms team at Riverlane, and her colleagues have shown that quantum algorithms to calculate the ground-state energies of around 50 orbital electrons can be made radically more efficient [5]. Previous estimates of the runtime of such algorithms had come in at more than 1,000 years. But Holzmann and her colleagues found that tweaks to the routines (altering how the algorithmic tasks are distributed around the various quantum logic gates, for example) cut the theoretical runtime to just a few days. That's a gain in speed of around five orders of magnitude. "Different options give you different results," Holzmann says, "and we haven't thought about many of these options yet."
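
The "five orders of magnitude" figure checks out as rough arithmetic, assuming "a few days" means about three:

```python
import math

# Rough sanity check of the quoted speed-up: ~1,000 years down to ~3 days.
before_days = 1000 * 365
after_days = 3
speedup = before_days / after_days
print(f"{speedup:.0f}x (~{math.log10(speedup):.1f} orders of magnitude)")
```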

At IBM, Garcia is starting to exploit these gains. In many ways, it's easy pickings: the potential quantum advantage isn't limited to calculations involving vast arrays of molecules.

One example of a small-scale but classically intractable computation that might be possible on a quantum machine is finding the energies of ground and excited states of small photoactive molecules, which could improve lithography techniques for semiconductor manufacturing and revolutionize drug design. Another is simulating the singlet and triplet states of a single oxygen molecule, which is of interest to battery researchers.

In February, Garcia's team published [6] quantum simulations of the sulfonium ion (H3S+). That molecule is related to triphenyl sulfonium (C18H15S), a photo-acid generator used in lithography that reacts to light of certain wavelengths. Understanding its molecular and photochemical properties could make the manufacturing technique more efficient, for instance. When the team began the work, the computations looked impossible, but advances in quantum computing over the past three years have allowed the researchers to perform the simulations using relatively modest resources: the H3S+ computation ran on IBM's Falcon processor, which has just 27 qubits.

Part of the IBM team's gains are the result of measures that reduce errors in the quantum computers. These include error mitigation, in which noise is cancelled out using algorithms similar to those in noise-cancelling headphones, and entanglement forging, which identifies parts of the quantum circuit that can be separated out and simulated on a classical computer without losing quantum information. The latter technique, which effectively doubles the available quantum resources, was invented only last year [7].
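
One widely used error-mitigation idea, zero-noise extrapolation, can be sketched as follows (a generic toy model with made-up numbers, not IBM's specific implementation): the noise is deliberately amplified by known factors, the observable is measured at each level, and a fit is extrapolated back to zero noise.

```python
import numpy as np

# Hypothetical numbers: an ideal expectation value of 0.8 that decays
# linearly as the noise is scaled up (e.g. by inserting gate pairs that
# cancel logically but add noise).
true_value = 0.80
decay = 0.07

scales = np.array([1.0, 2.0, 3.0])      # noise amplification factors
measured = true_value - decay * scales  # toy linear noise model

# Fit a line to the measured values and read off its value at scale 0.
slope, intercept = np.polyfit(scales, measured, 1)
print(round(intercept, 3))              # recovers the zero-noise value, 0.8
```

In practice the measured values carry shot noise and the decay is not exactly linear, so richer extrapolation models are used, but the principle is the same.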

Michael Biercuk, a quantum physicist at the University of Sydney in Australia, who is chief executive and founder of Sydney-based start-up firm Q-CTRL, says such operational tweaks are ripe for exploration. Biercuk's work aims to dig deeper into the interfaces between the quantum circuits and the classical computers used to control them, as well as understand the details of other components that make up a quantum computer. "There is a lot of space left on the table," he says; early reports of errors and limitations have been naive and simplistic. "We are seeing that we can unlock extra performance in the hardware, and make it do things that people didn't expect."

Similarly, Riverlane is making the daunting requirements for a useful quantum computer more manageable. Brierley notes that drug discovery and materials-science applications might require quantum computers that can perform a trillion decoherence-free operations, by current estimates, and that's good news. "Five years ago, that was a million trillion," he says.

Some firms are so optimistic that they are even promising useful commercial applications in the near future. Helsinki-based start-up Algorithmiq, for instance, says it will be able to demonstrate practical quantum advances in drug development and discovery in five years' time. "We're confident about that," says Sabrina Maniscalco, Algorithmiq's co-founder and chief executive, and a physicist at the University of Helsinki.

Maniscalco is just one of many who think that the first commercial applications of quantum computing will be in speeding up or gaining better control over molecular reactions. "If anything is going to give something useful in the next five years, it will be chemistry calculations," says Ronald de Wolf, senior researcher at CWI, a research institute for mathematics and computer science in Amsterdam. That's because of the relatively low resource requirements, adds Shintaro Sato, head of the Quantum Laboratory at Fujitsu Research in Tokyo. "This would be possible using quantum computers with a relatively small number of qubits," he says.

Financial applications, such as risk management, as well as materials science and logistics optimization, also have a high chance of benefiting from quantum computation in the near term, says Biercuk. Still, no one is taking their eyes off the longer-term, more speculative applications, including quantum versions of machine learning.

Machine-learning algorithms perform tasks such as image recognition by finding hidden structures and patterns in data, then creating mathematical models that allow the algorithm to recognize the same patterns in other data sets. Success typically involves vast numbers of parameters and voluminous amounts of training data. But with quantum versions of machine learning, the huge range of different states open to quantum particles means that the routines could require fewer parameters and much less training data.

In exploratory work with South Korean car manufacturer Hyundai, Jungsang Kim at Duke University in Durham, North Carolina, and researchers at the firm IonQ in College Park, Maryland, developed quantum machine-learning algorithms that can tell the difference between ten road signs in laboratory tests (see go.nature.com/42tt7nr). Their quantum-based model used just 60 parameters to achieve the same accuracy as a classical neural network using 59,000 parameters. "We also need far fewer training iterations," Kim says. "A model with 59,000 parameters requires at least 100,000 training data sets to train it. With quantum, your number of parameters is very small, so your training becomes extremely efficient as well."

Quantum machine learning is nowhere near being able to outperform classical algorithms, but there is room to explore, Kim says.

In the meantime, this era of quantum inferiority represents an opportunity to validate the performance of quantum algorithms and machines against classical computers, so that researchers can be sure about what they are delivering in the future, Garcia says. "That is what will give us confidence when we start pushing past what is classically possible."

For most applications, that won't be any time soon. Silicon Quantum Computing, a Sydney-based start-up, has been working closely with finance and communications firms and anticipates many years to go before payday, says director Michelle Simmons, who is also a physicist at the University of New South Wales in Sydney.

That's not a problem, Simmons adds: Silicon Quantum Computing has patient investors. So, too, does Riverlane, says Brierley. "People do understand that this is a long-term play."

And despite all the hype, it's a slow-moving one as well, Hensinger adds. "There's not going to be this one point when suddenly we have a rainbow coming out of our lab and all problems can be solved," he says. Instead, it will be a slow process of improvement, spurred on by fresh ideas for what to do with the machines and by clever coders developing new algorithms. "What's really important right now is to build a quantum-skilled workforce," he says.

Go here to see the original:
Quantum computers: what are they good for? - Nature.com