Archive for the ‘Quantum Computer’ Category

Taking Flight with Heron and Condor: The Latest Advancements in Quantum Computers – Securities.io

IBM has just announced the latest breakthrough in its mission to make commercialized and practical quantum computers a reality: a 1,000+ qubit processor dubbed 'Condor' and an error-correction-focused processor dubbed 'Heron'.

Quantum computers represent a new approach to machine-based computation. Through the use of qubits capable of superposition and entanglement, quantum computers have the potential to perform faster and more complex calculations than traditional computers built on classical bits. Unlike traditional computing, where bits represent either 0 or 1, qubits in quantum computing can represent both states simultaneously. Importantly, this makes quantum computing complementary to classical computing rather than a replacement; it excels in tasks like molecular simulations and system optimizations, while classical computing is better suited for everyday tasks.
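To make the bit/qubit contrast concrete, here is a minimal simulation sketch in Python with NumPy. The state vector and measurement rule are textbook quantum mechanics, not any vendor's API, and the numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# A classical bit is definitively 0 or 1. A qubit's state is a complex vector
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1; measurement yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
state = np.array([1, 1]) / np.sqrt(2)  # equal superposition of |0> and |1>
probs = np.abs(state) ** 2             # [0.5, 0.5]

# Simulate 1,000 measurements: each one collapses the superposition to 0 or 1.
outcomes = rng.choice([0, 1], size=1000, p=probs)
print(np.bincount(outcomes))  # roughly 500 zeros and 500 ones
```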

It is because of the types of tasks that quantum computing should excel at that the technology is so vaunted. A computer capable of performing complex calculations orders of magnitude faster than its traditional counterparts is worth developing, as its use cases have the potential to change the world and our understanding of it.

With its announcement, IBM has made significant strides in quantum computing by launching two advanced quantum processors: Heron and Condor.

The Heron processor, featured on the ibm_torino quantum system, represents a leap forward with its 133 fixed-frequency qubits and tunable couplers, delivering a 3-5x improvement in performance compared to its predecessor, the 127-qubit Eagle processor. This advancement virtually eliminates 'cross-talk' (undesired interaction or interference between qubits) and lays the groundwork for future hardware development. Notably, IBM is already utilizing these chips in its modular-architecture Quantum System Two computing platform.

On the other hand, the Condor processor, a 1,121 superconducting qubit quantum processor, is an equally notable innovation. It increases qubit density by 50%, incorporates advancements in qubit fabrication, and integrates over a mile of high-density cryogenic wiring within a single dilution refrigerator (a tool used to achieve extremely low temperatures, typically close to absolute zero). Condor's performance is comparable to the company's earlier 433-qubit Osprey processor, marking a significant milestone in scaling and informing future hardware design in quantum computing.

These developments by IBM are pivotal in pushing the boundaries of quantum utility and advancing toward quantum-centric supercomputing.

As previously mentioned, quantum computers are so vaunted due to their potential to greatly advance our understanding of just about every field of science. The following are just a few examples.

Medicine: In medicine, quantum computing could revolutionize drug discovery by simulating the behavior of molecules at a quantum level. This allows for more accurate predictions of how potential drugs might interact with the human body, speeding up the development of new medications and reducing costs.

Meteorology: For meteorology, quantum computers could analyze vast amounts of weather data more efficiently than classical computers. This would lead to more accurate weather predictions and better understanding of climate change, helping to mitigate natural disasters and plan agricultural strategies.

Complex Problem Solving: Quantum computing could tackle problems that are currently unsolvable by classical computers, such as optimizing large systems for logistics and supply chains, or solving intricate mathematical problems. This has broad implications for various sectors, including transportation, energy, and finance.

It is also important to recognize that we cannot know what we cannot yet imagine; there will likely be scores of unexpected advancements made possible by the capabilities this technology one day provides.

"Quantum computing is the future of computing. It will open up new possibilities for scientific discovery and technological advancement that we can't even imagine today," said Arvind Krishna, Chairman and CEO of IBM, in an interview with CNBC.

With quantum computers representing such a monumental technological achievement, it should come as no surprise that there have been, and remain, significant hurdles and limitations that must be overcome in time. For example, quantum computing currently faces challenges in error correction, scalability, and developing practical algorithms.

In time, there are bound to be other, previously unexpected hurdles, owing to our rudimentary but growing understanding of quantum mechanics. The complexity and potential of quantum physics were emphasized in the following quote:

"If you think you understand quantum mechanics, you don't understand quantum mechanics." Richard Feynman, Nobel laureate in Physics

As it stands, these limitations mean quantum computers are not yet ready for widespread use. With recent advancements, optimistic timelines point to another decade before this is the case.

In past decades, quantum computing seemed to be in such a distant future that courses teaching it were few and far between. Now that a future in which they are actually in use is beginning to come into focus, the need to train the next generation of scientists and engineers who will be responsible for continuing this advancement is only increasing. As a result, many universities are now offering specialized courses and programs in quantum computing to prepare a skilled workforce for this emerging field.

While universities may be training the next generation of quantum computing specialists, the following few companies are currently paving the road to this future.

IBM has long been a leader in the development of quantum computers. The company aims to democratize quantum computing development through initiatives like Qiskit Patterns. IBM has also expanded its roadmap for achieving large-scale, practical quantum computing, focusing on new modular architectures and networking that could enable quantum systems with hundreds of thousands of qubits, essential for practical quantum applications.
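For a flavor of what programming with Qiskit looks like, here is a minimal sketch. It assumes the open-source qiskit and qiskit-aer packages are installed, and it illustrates a basic entangling circuit, not the Qiskit Patterns workflow itself:

```python
# Minimal Qiskit sketch: build, transpile, and simulate a Bell-state circuit.
# Assumes the qiskit and qiskit-aer packages; illustrative only.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Hadamard then CNOT entangles the two qubits.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Run 1,000 shots on a local simulator standing in for real hardware.
backend = AerSimulator()
counts = backend.run(transpile(qc, backend), shots=1000).result().get_counts()
print(counts)  # roughly half '00' and half '11': the entangled outcomes
```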

Microsoft's efforts in quantum computing are centered around cloud integration and collaboration. The company has introduced quantum machines with the highest quantum volumes in the industry to Azure Quantum, including partnerships with IonQ, Pasqal, Quantinuum, QCI, and Rigetti. This integration facilitates experimentation and is a step towards scaled quantum computing. Microsoft emphasizes the importance of a global ecosystem to realize the full potential of quantum computing and plans to deliver its quantum machine as a cloud service through Azure, ensuring secure and responsible use of this emerging technology.

Alphabet, through its Google Quantum AI lab, has made significant strides in quantum computing. In 2023, Google scientists announced a major milestone in reducing the rate of errors in quantum computing, a long-standing challenge in the field. Its research, published in the journal Nature, describes a system capable of significantly decreasing the error rate and implementing error-correcting codes that can detect and fix errors without compromising the information. Previously, in 2019, Google claimed to have achieved quantum supremacy with its Sycamore machine, performing a calculation in 200 seconds that would have taken a conventional supercomputer 10,000 years, demonstrating the potential of quantum computing in solving complex problems far beyond the capabilities of traditional computing.

Quantum computing represents a groundbreaking leap in the world of computing, offering the potential to revolutionize a plethora of fields. While IBM's recent advancements with the Heron and Condor quantum processors signify significant progress toward practical quantum computing, the technology continues to face major challenges in error correction, scalability, and algorithm development, highlighting the need for continued research and innovation.

While these challenges remain, quantum computing holds the promise of unlocking possibilities we can't even imagine today, ushering in a new era of scientific discovery and technological advancement. Its full potential is still unfolding, and its impact on various industries and society promises to be profound.

Read the original here:
Taking Flight with Heron and Condor: The Latest Advancements in Quantum Computers - Securities.io

IBM Reveals a Quantum Computing Breakthrough That Could Revolutionize Technology – The Messenger

IBM has revealed two new quantum computers that together represent a breakthrough in next-generation computing, and could ultimately change technology as we know it.

The company announced the computers today at the IBM Quantum Summit, which is ongoing in New York.

The IBM Quantum Heron is a series of high-performance processors with the lowest error rate of any IBM quantum processor to date, while the IBM Quantum System Two is a modular supercomputing architecture. Both represent a significant milestone toward achieving the tech giant's ambition to develop next-generation quantum supercomputers in the next decade.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," Dario Gil, IBM SVP and Director of Research, said in a statement.

Ultimately, IBM says the technology will be put "into the hands of our users and partners who will push the boundaries of more complex problems."

Quantum computing is a tech holy grail: the idea is that being able to perform multitudes of computations all at once will radically transform scientific research and lead to rapid breakthroughs in drug discovery, climate forecasting, material science and more.

But first, IBM has to prove that quantum computers can match the high expectations both scientists and industry stakeholders have for them.

"It's going to take a while before we go from scientific value to, let's say, business value," Jay Gambetta, IBM's vice president of quantum, told the Financial Times on Monday.

"But in my opinion the difference between research and commercialization is getting tighter."

See the article here:
IBM Reveals a Quantum Computing Breakthrough That Could Revolutionize Technology - The Messenger

Rigetti Launches the Novera QPU, the Company’s 1st Commercially Available QPU – HPCwire

BERKELEY, Calif., Dec. 7, 2023 – Rigetti Computing, Inc., a pioneer in full-stack quantum-classical computing, announced today the launch of its Novera QPU, a 9-qubit quantum processing unit (QPU) based on the Company's fourth-generation Ankaa-class architecture featuring tunable couplers and a square lattice for denser connectivity and fast 2-qubit operations. The Novera QPU is manufactured in Rigetti's Fab-1, the industry's first dedicated and integrated quantum device manufacturing facility.

The Novera QPU includes all of the hardware below the mixing chamber plate (MXC) of a dilution refrigerator. In addition to a 9-qubit chip with a 3×3 array of tunable transmons, the Novera QPU also includes a 5-qubit chip with no tunable couplers or qubit-qubit coupling, which can be used for developing and characterizing single-qubit operations on a simpler circuit. Beyond the 9-qubit and 5-qubit chips, the Novera QPU ships with additional supporting components.

"Our new Novera QPU enables hands-on access to our most innovative quantum computing technology. With the same architecture as our 84-qubit Ankaa systems, researchers working with the Novera QPU can have a head start in pursuing their quantum computing work and drive the industry forward," says Dr. Subodh Kulkarni, Rigetti CEO. "Our Ankaa-class 9-qubit QPUs have already been commissioned by premier national labs, and now the same technology is available to those seeking to accelerate their own quantum computing work."

Key focus areas for building higher-quality quantum computers include fundamental research into how qubits operate, how to optimize control systems, how to design and characterize gates, ways to mitigate decoherence, and how to develop more efficient quantum algorithms.

With the launch of the Novera QPU, quantum computing professionals and students can now have on-premise access to years of Rigetti's internal R&D within a matter of weeks. "Rigetti has been pioneering full-stack quantum computing technology for 10 years. This is an exciting moment for us to equip the quantum computing ecosystem with the same caliber of hardware and engineering that we use on our most powerful QPUs," says David Rivas, Rigetti CTO.

The Novera QPU implements universal, gate-based quantum computing and can be used by quantum software and algorithm experts to prototype and test: (1) hybrid quantum algorithms, (2) characterization, calibration, and error mitigation, and (3) quantum error correction (QEC) experiments.

Additionally, organizations looking to develop components of their quantum computing stack can leverage the Novera QPU to accelerate areas such as: (1) control electronics and software, (2) QEC decoders, (3) control optimization algorithms, (4) native gate architectures, and (5) measurement and calibration, and accompanying software.
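By way of illustration, error mitigation of the kind named above often takes the form of zero-noise extrapolation. Here is a generic sketch in plain Python; the noise scales and expectation values are invented illustrative numbers, not Rigetti data:

```python
import numpy as np

# Zero-noise extrapolation (ZNE): run the same circuit at deliberately
# amplified noise levels, then extrapolate the measured expectation value
# back to the zero-noise limit. All values below are illustrative only.
noise_scale = np.array([1.0, 2.0, 3.0])  # 1x, 2x, 3x amplified noise
measured = np.array([0.81, 0.66, 0.54])  # hypothetical <Z> estimates

# Fit a line (Richardson-style extrapolation) and evaluate it at scale = 0.
coeffs = np.polyfit(noise_scale, measured, deg=1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"mitigated estimate: {zero_noise_estimate:.3f}")  # ~0.94
```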

The Novera QPU is designed to be integrated with commercially available dilution refrigerators and control systems.

The Novera QPU is available to order at rigetti.com/novera starting at $900,000 and ships within 4-6 weeks after the order is confirmed and shipping and logistics are finalized.

About Rigetti

Rigetti is a pioneer in full-stack quantum computing. The Company has operated quantum computers over the cloud since 2017 and serves global enterprise, government, and research clients through its Rigetti Quantum Cloud Services platform. The Company's proprietary quantum-classical infrastructure provides high-performance integration with public and private clouds for practical quantum computing. Rigetti has developed the industry's first multi-chip quantum processor for scalable quantum computing systems. The Company designs and manufactures its chips in-house at Fab-1, the industry's first dedicated and integrated quantum device manufacturing facility. Learn more at rigetti.com.

Source: Rigetti

Continued here:
Rigetti Launches the Novera QPU, the Company's 1st Commercially Available QPU - HPCwire

IBM quantum computing updates: System Two and Heron – The Verge

Today, I'm talking with Jerry Chow. He's the director of quantum systems at IBM, meaning he's trying to build the future one qubit at a time.

IBM made some announcements this week about its plans for the next 10 years of quantum computing: there are new chips, new computers, and new APIs. You'll hear us get into more of the details as we go, but the important thing to know upfront is that quantum computers could have theoretically incredible amounts of processing power and could entirely revolutionize the way we think of computers if, that is, someone can build one that's actually useful.

Here's Jerry, explaining the basics of what a quantum computer is:

A quantum computer is basically a fundamentally different way of computing. It relies on the laws of quantum mechanics, but it just changes how information is handled. So instead of using bits, we have quantum bits or qubits.

A regular computer (the quantum folks call them classical computers) like an iPhone, a laptop, or even a fancy Nvidia GPU works by encoding data in bits. Bits basically have two states, which we call zero and one. They're on or they're off.

But the laws of quantum mechanics that Jerry just mentioned mean that qubits behave very, very differently. They can be zero or one, but they might also be a whole lot of things in between.

You still have two states: a zero and a one. But they can also be in superpositions of zero and one, which means that there's a probability that when you measure it, it will be zero or one with particular probability. In terms of how we physically build these, they're not switches anymore, they're not transistors, but they're actually elements that have quantum mechanical behavior.

One of my favorite things about all this is that in order to make these new quantum computers work, you have to cool them to within fractions of a degree of absolute zero, which means a lot of companies have had to work very hard on cryogenic cooling systems just so other people could work on quantum chips. Jerry calls early quantum computers "science projects," but his goal is to engineer actual products people can use.

You'll hear Jerry talk about making a useful quantum computer in terms of utility, which is when quantum computers start to push against the limits of what regular computers can simulate. IBM has been chasing after utility for a while now. It first made quantum computers available on the cloud in 2016, it's shipped System One quantum computers to partners around the world, and now, this week, it's announcing System Two along with a roadmap for the future. It's Decoder, so I asked Jerry exactly how he and his team sit down and build a roadmap for the next 10 years of applied research in a field that requires major breakthroughs at every level of the product. Oh, and we talked about Ant-Man.

It's a fun one; very few people sit at the bleeding edge all day like Jerry.

Okay. Jerry Chow, director of quantum systems at IBM. Here we go.

This transcript has been lightly edited for length and clarity.

Jerry Chow, you are an IBM fellow and director of quantum systems. Welcome to Decoder.

I'm really excited to talk to you. There's quite a lot to talk about quantum computing in general, where it is. But you've got some news to announce today, so I want to make sure we talk about the news right off the bat. What is going on in IBM Quantum?

Yeah, so we have our annual Quantum Summit coming up, where we basically invite our network of members and users to come, and we talk about some of the really exciting news. What we're announcing this year is actually we have a really exciting upgraded quantum processor that we're talking about. It's called the IBM Quantum Heron. It has 133 qubits. It's the highest performance processor that we've ever built, and it's going to be available for users to access via our cloud services.

We're also launching IBM Quantum System Two and introducing this as a new architecture for scaling our quantum computers into the future. We're also talking about a 10-year roadmap looking ahead. We, at IBM Quantum, like to sort of call our shots, tell everyone what we're doing because that keeps us honest, keeps everyone in the industry on the same benchmark of seeing what progress is. And we're expanding that roadmap, which we actually first introduced a couple of years ago and have hit all our milestones thus far. But we are extending it out to 2033, pushing forward into this next realm where we really want to drive toward pushing quantum computing at scale.

So you've got a new processor, you've got a new computing architecture in System Two, you've got a longer roadmap. Put that in context for me: we've been hearing about quantum computing for quite a long time. I have stared at a number of quantum computers and been told, "This is the coldest piece of the universe that has ever existed." It's been very entertaining, at the very least. We're only now at the point where we're actually solving real problems with quantum computers.

We're not even at the point of solving real problems.

Not yet. But we are, really excitingly, just this past year, at the point where we're calling this utility-scale quantum computing. We're using 100-plus qubits. We used a processor earlier in the year called Eagle, where we were able to look at a particular problem that you couldn't really solve with brute-force methods using a classical computer, but also it challenged the best classical approximation methods that are used on high-performance computing. So what's interesting there is that now the quantum computer becomes like the benchmark. You almost need it to verify whether your approximate classical methods are working properly. And that just happens when you go over 100 qubits.

At 100 qubits, things all change so that you just can't use, say, GPUs or any kind of classical computers to simulate what's going on accurately. This is why we're in this phase where we call it utility scale, because there's going to be this back and forth between using a quantum as a tool compared with what you can still potentially do in classical. But then there's a long road there where we're going to try to drive value using the quantum to get toward quantum advantage.

I think the word utility there threw me off. This is the branch point where the problems you solve with a quantum computer start to become meaningfully different than the problems you could solve with a regular computer.

That's right. We see this really as an inflection point. There are a lot of industries that use high-performance computation already, and they are looking at very, very hard problems that use the Oak Ridge supercomputers and whatnot. And now quantum becomes an additional tool that opens up a new lens for them to look at a different area of compute space that they weren't able to look at before.

So IBM has a huge program in quantum. The other big companies do, too: Microsoft, Google, what have you, they're all investing in this space. Does this feel like a classical capitalist competition, "We're all racing forward to get the first product to market"? Is it a bunch of researchers who know that there's probably a pot of gold at the end of this rainbow, but we're nowhere close to it yet, so we're all kind of friendly? What's the vibe?

I'd say that it's a very exciting time to be in this field. How often do you get to say you're building from the ground floor of a completely new computational architecture? Something that is just fundamentally different from traditional classical computing. And so yeah, I'd say that there's certainly a lot of groundswell, there's a lot of buzz. Sometimes a little too much buzz, maybe. But also I think from the perspective of competition, it helps drive the industry forward.

We, at IBM, have been at the forefront of computation for decades. And so it's in our blood. The ideas of roadmaps and pushing the next big development, the next big innovations in computation, have always been something that is just native to IBM, and quantum is no different. We've been in the game with quantum since the early theoretical foundings for probably 30 years, 30-plus years. But now we're really starting to bear a lot of that fruit in terms of building the architectures, building the systems, putting out the hardware, developing the framework for how to make it usable and accessible.

Let me give you just a much dumber comparison. We had the CEO of AWS on the show, Adam Selipsky. AWS is furiously competitive with Microsoft Azure and Google Cloud. They are trying to take market share from each other, and they do a lot of innovative things to make better products, but the end goal of that is taking one customer away from Google. You're not there yet, right? There's not market share to be moved around yet?

Certainly not at that scale.

But are there quantum customers that you compete for?

There's certainly a growing quantum community.

[Laughs] It's not a customer; there are people who are interested.

There are people that are interested across the board, from developers, to students, to Fortune 500 companies. We have a lot of interest. So just as an example, we first put systems on the cloud in 2016. We put a very simple five-qubit computer, five-qubit quantum computer, on the cloud. But it reflected a real fundamental shift in how quantum could be approached. Before, you had to be sort of a physicist. You had to be in a laboratory turning knobs. You're taking data, you're running physicist code; you're not programming a computer.

Wow. [Laughs] Shout out to physicists.

Well, I'm a physicist, and you don't want to see my code. [Laughs] But the whole point is that we developed a whole framework around it to actually deploy it and to make it programmable. And think about the early days of computers and all the infrastructure you needed to build in terms of the right assembly language and compilers and the application layers all above that. We've been building that for the last seven years since that first launched. And in that time, we've had over 500,000 users of our platform and of our services.

I'm always curious how things are structured and how decisions are made. That's really what we talk about on the show. And there's a forcing function that comes when it's a business, and there's a growth path. Quantum seems very much like one day it will be a huge business because it will solve problems that regular computers can't. But right now, it's on the very early part of the curve where you're investing a lot into R&D, on an aggressive roadmap, but you're nowhere close to the business yet.

I would say that we're knocking on the door of business value and looking for that business value, because especially when we're in this realm where we know that it can be used as a tool pitted against the best classical computers, there's something there to be explored. A lot of times, even with traditional computers, there are very few proven algorithms that are where we drive all the value. A lot of the value that gets driven is done through heuristics, through just trial and error, through having the tool and using it on a particular problem. That's why we see this fundamental game-changer of this inflection point going toward utility-scale systems of over 100 qubits as now this is the tool that we want users to actually go and find business advantage, find the problems that map appropriately onto these systems for exploration.

So put that in the context of IBM. IBM's a huge company, it's over 100 years old, it does a lot of things. This is probably the most cutting-edge thing IBM is doing, I imagine. I'm guessing you're not going to disagree with me. But it feels like the most cutting-edge thing that most of the Big Tech companies are doing.

How is that structured inside of IBM? How does that work?

So we're IBM Quantum within IBM Research. IBM Research has always been the organic growth engine for all of IBM. It's where a lot of the innovative ideas come in, but overall, a particular strategy within IBM and IBM Research is that we're not just doing research and then we're going to do development and then it's going to go on this very linearized product journey. It's all integrated together as we are moving forward. And so therefore, we have the opportunity within IBM Quantum that we're developing products, we're putting it on the cloud, we're integrating with IBM Cloud. We're actually pushing these things forward to build that user base, build that groundswell, before all the various different technology elements are finished. That's sort of this agile methodology of building this from the ground up, but also getting it out early and often to drive excitement and to really build up the other parts of the ecosystem.

So how is IBM Quantum structured? How many people is it? How is it organized?

So we don't speak explicit numbers, but we have several hundred people. And then we have parts of the team which are focused on the actual hardware elements, all the way down to the actual quantum processor and the system around it in terms of making those processors function by cooling it down in the cryogenic system, talking to it with control electronics, talking to it with classical computing. So it all needs to tie together.

Then you have software development teams. We also have a cloud and services team that helps to deliver our offerings as a service. And then we have applications teams looking at the next algorithms, the next novel ways of making use of our quantum services. We also have teams that are more outward-looking for business development trying to drive adoption, working with various clients to engage in the problems of their interests. We also have a part of our team which runs an offering called the Quantum Accelerator. It's like a consulting arm, working with the clients to get quantum-ready, start understanding how their problems can be impacted by quantum computing and start using our systems.

Is that all flat? Every one of those teams reports to you, or is there structure in between?

No, so all those different ones report to our vice president of quantum computing, which is Jay Gambetta. I take care of the systems part. Basically, the wrapping of the processor and how it runs, executing problems for the users, that's the piece that I own.

There's a tension there. It sounds like IBM is designed to attack this tension head-on, which is: We're doing a bunch of pure research in cryogenics to make sure that quantum computing can run because it has to be really cold to run. Then there's a business development team that's just off and running, doing sales stuff, and at some point they're going to come back and say, "We sold this thing." And the cryogenics team is going to say, "Not yet." Every business has a problem like that. When you're in pure research mode, the "not yet" is a real problem.

How often do you run into that?

We have a very good strategy across the team. We know our core services and what the core product we have is. And also we have a roadmap. The concept of the roadmap is both great for the R&D side but also great for the client perspective, business development angle view of seeing what's coming next. From the internal side, we know we've got to continue to drive toward this, and these are our deliverables and these are the new innovations that we need to do. In fact, in our new roadmap that we're releasing, we have that separated. Both a development roadmap, which is more product focused and more like what the end user's going to get and clients going to get. And we have an innovation roadmap to show those things which we're still going to need to turn the crank on and figure out what feeds in.

I often say the roadmap is our mantra, and it really is our calling card both internally and externally. Not many people really show a lot of detail in their roadmap, but it serves as a guiding tool for us all.

I was looking at that roadmap, and it is very aggressive. We're at Heron, there are many birds to come from what I understand. And the goal is that a truly functional quantum computer needs thousands or millions of qubits, right?

We have a transition toward what we are calling quantum at scale, which I think is what you're referring to: when you get to the point where you can run quantum error correction, correct for all the errors that are underlying within these qubits, which are noisy. People throw around that number, millions of qubits, in a way that almost drives fear into the hearts of people. One actually really exciting thing that we've done this past year is we've developed a set of novel error correction codes that brings down that resource count a lot.

So actually, you'll need potentially hundreds of thousands of qubits, 100,000 qubits or so, to build a fault-tolerant quantum error-correction-based quantum computer of a particular size to do some of those problems that we're talking about at scale. And that's part of the roadmap, too. So that's what we're looking at further to the Blue Jay system in 2033. So there's certainly a number of birds to get there, but we have concrete ideas for the technological hurdles to overcome to get there.
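For a rough sense of where figures like 100,000 qubits come from, here is a back-of-the-envelope sketch using the textbook surface-code scaling. The constants and error rates are illustrative assumptions, and the surface code itself is not IBM's new family of codes (which, as Chow notes, are more resource-efficient):

```python
# Textbook surface-code heuristic: logical error per cycle scales roughly as
# p_L ~ 0.1 * (p / p_th) ** ((d + 1) / 2) for physical error rate p, threshold
# p_th, and code distance d; a distance-d patch uses about 2 * d**2 qubits.
# All constants here are illustrative assumptions for a rough estimate.
p, p_th = 1e-3, 1e-2          # assumed physical error rate and threshold
target_logical_error = 1e-12  # good enough for very deep circuits

d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > target_logical_error:
    d += 2  # surface-code distances are odd

physical_per_logical = 2 * d ** 2
print(d, physical_per_logical)     # d = 21, ~882 physical qubits per logical
print(100 * physical_per_logical)  # ~88,200 physical qubits for 100 logical
```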

That's the goal. You're going to get to some massively larger scale than you are today. Orders of magnitude. Today the chip has 133 qubits, you need to get to thousands. Some people, terrifyingly, are saying millions.

Part of your strategy is linking the chips together into these more modular systems and then putting control circuitry around them. I'm a person who came up in what you might call the classical computing environment, that's very familiar. That's a very familiar strategy; we're just going to do more cores. That's what that looks like to me. Lots of companies have run up against a lot of problems here. In that part of the world, there's just Moore's law, and we sit around talking about it all day long. And Nvidia and maybe TSMC have gotten over it this time, and Intel has struggled to get the next process node and increase the transistor density. Is there an equivalent to Moore's law in quantum that you were thinking about?

Our roadmap is showing that type of progression.

I look at that roadmap, and you are definitely assuming a number of breakthroughs along the way, in the way that Intel just assumed them for years and years and achieved them, and then kind of hit the end of the road.

Even where we are today with Heron, and actually complementary to Heron this year, we also already built a 1,000-qubit processor, Condor. Its explicit goal was to push the limits of how many qubits could we put on a single chip, push the limits of how much architecture could we put in an entire system. How much could we actually cool down in the dilution refrigerators that we know today, the cryogenic refrigerators that we have today? Push the boundaries of everything to understand where things break. And if you look at the early part of our roadmap, the birds are there with various technological hurdles that we've already overcome to get toward this thousand-qubit level. And now those next birds that you see in the rest of the innovation roadmap are different types of couplers, different types of technologies, that are those technological hurdles, like in semiconductors, that allow us to bridge the gap.

Are they the same? Is it the same kind of, "We need to double transistor density," or is it a different set of challenges?

They're different, because with this sort of modular approach, there's some that are like, how many can we place into a single chip? How many can we place into a single package? How many can we package together within the system? So they all require slightly different technological innovations within the whole value chain. But we don't see them as not doable; we see them certainly as things that we will handle over the next few years. We're already starting to test linking between two packages via a cryogenic cable. This is toward our Flamingo demonstration, which we're planning for next year.

Do you get to leverage any of the things that are happening on the process side with classical computers?

Like TSMC hits three nanometers and you get to pull that forward, or is that different?

Not so explicitly to the newest stuff that's happening today in semiconductors. But IBM has been in the semiconductors game for many, many decades. And a lot of the work that we've achieved with even achieving 100 qubits with Eagle a couple of years ago was because we had that deep-rooted semiconductor background. So just to give you an example, at 100 qubits, the challenge is how do you actually wire to 100 qubits in a chip? The standard thing you do in semiconductors is you go to more layers, but it's not so easy to do that just in these superconducting quantum circuits because they might mess up the qubits. It might cause them to decohere.

But because of our know-how with packaging, we found the right materials, we found the right way of using our fabrication techniques to implement that type of multilayer wiring and still talk to these 100 qubits. We evolved that further this past year to actually get to 1,000. And so that type of semiconductor know-how is just ingrained and something where, I'd say, the decades of experience matter.

So you're going to build the next-generation quantum computing chip, Heron. It's got 133 qubits. How is that chip manufactured?

Okay. Well, to build the next-generation quantum computing chip, we rely on advanced packaging techniques that involve multiple layers of superconducting metal to package and to wire up various superconducting qubits. With Heron, we're also using a novel tunable coupler architecture, which allows us to have world-record-performing two-qubit gate qualities. And all this is done in a standard fabrication facility that we have at IBM, and we package up this chip, and we have to cool it down into a cryogenic environment.

So silicon goes in one side of the building, Heron comes out the other?

I mean, certainly more steps than that. [Laughs] And there's this know-how of how to do it properly to have high-performing qubits, which we've just built up.

Explain to me what a high-performing qubit is.

Yeah, so the tricky thing with these qubits... There are different ways of building qubits. There are people who use ions and atoms and electrons and things like that, but ours are actually just metal on a substrate; they're circuits. They're much like the circuits that you might see when you look inside of a standard chip. But the thing with these circuits is that you can basically arrange them in a certain way and use the right materials, and you have a qubit that, in this case, for superconducting qubits, resonates at five gigahertz.

If you choose the wrong materials, the lifetimes of these qubits can be extremely short. So when we first started in the field of building superconducting qubits in 1999, superconducting qubits lasted for maybe like two nanoseconds, five nanoseconds. Today, we've gotten up to close to a millisecond, hundreds of microseconds to a millisecond. Already in numbers orders of magnitude longer. But that took many years of development. And at the point of a few hundred microseconds, we're able to do all these complex operations that we've been talking about to push this utility scale that we discussed earlier. So that know-how to increase that lifetime comes down to engineering, comes down to understanding the core pieces that generate loss in the materials, and that's something that we certainly have expertise at.
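A quick back-of-the-envelope calculation shows why those lifetime gains matter. The 100-nanosecond gate time below is an assumed, typical superconducting-qubit figure, not an IBM specification; the lifetimes echo the numbers in the interview:

```python
# Rough operations budget: how many gate times fit inside a qubit's lifetime?
gate_time_s = 100e-9  # ~100 ns per two-qubit gate (assumed, typical figure)

for label, t1_s in [("1999-era (~5 ns lifetime)", 5e-9),
                    ("today (~300 us lifetime)", 300e-6)]:
    print(f"{label}: ~{t1_s / gate_time_s:,.0f} gate times per lifetime")
# 1999-era: ~0 gate times (qubit dies before one gate finishes);
# today: ~3,000 gate times, enough room for deep circuits.
```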

Tell me about the industry at large. So IBM has one approach: you said you're using metals on a substrate. You're leveraging all of the semiconductor know-how that IBM has. When you're out in the market and you're looking at all your competitors, Microsoft is doing something else, Google something else. Go through the list for me. What are the approaches, and how do you think they're going?

When we think about competitors, you can think about the platform competitors of who's building the services, but I think what you're pointing to more is the hardware side.

When it comes down to it, there's a simple set of metrics for you to compare the performance of the quantum processors. It's scale: what number of qubits can you get to and build reliably? Quality: how long do those qubits live for you to perform operations and calculations on? And speed: how quickly can you actually run executions and problems through these quantum processors? And that speed part is something where it's an interplay between your quantum processor and your classical computing infrastructure because they talk to one another. You don't control a quantum computer without a classical computer. And so you need to be able to get your data in, data out and process it on the classical side.

So scale, quality, speed. Our approach with superconducting qubits, to the best of our knowledge, can hit all three of those in a very strong way. Scale: pushed up to over 1,000 qubits. We know that we can build up to 1,000 qubits already with the technologies that we've built. On quality, Heron, which we're releasing, has the best gate quality: the gates, the operations, the gate qualities that have been shown across a large device. And then speed, in terms of just the execution times: we're on the order of microseconds for some of the clock rates, whereas other approaches can be orders of magnitude slower, a thousand times or more.

What are the other approaches in the industry that you see, and where are they beating you and where are you ahead?

So there are trapped ions: basically they're using atomic ions like caesium and things that you might use for clocks, atomic clocks. They can have very good quality. In fact, there are some results that have tremendous performance across a number of those types of trapped-ion qubits in terms of their two-qubit gate qualities. But they're slow. In terms of the clock rates of getting your operations in, getting your operations out, you do operations to recycle the ion sometimes. And that's where it, I'd say, has a downside.

I'd say, right now, superconducting qubits and trapped ions are the approaches that have the most prominence at the moment that have been put out in terms of usable services. Atoms have also emerged; it's very similar to the trapped ions. There, they use these fun little things called optical tweezers to hold atoms into little arrays. And there are some exciting results that have been coming out from various atom groups there. But again, it comes down to that speed. Anytime you have these actual atomic items, either an ion or an atom, your clock rates end up hurting you.

Alright, let me make a comparison to semiconductors again. So in semiconductors there was multiple-patterning lithography that everyone chased for a minute, and it hit an end state. And then TSMC had bet really big on EUV, and that let them push ahead. And Intel had to make a big shift over there. You're looking at your roadmap, you're doing superconductors, cryogenics, metals on substrates, and over here some guys are doing optical tweezers on atoms. Is there a thought in your head like, "We better keep an eye on this because that might be the process innovation that we actually need"?

I think overall, in the landscape, we're always keeping track of what's going on. You're always seeing what are the latest innovations in the various different technologies.

Is that even a good comparison to semiconductors in that way?

The whole systems are completely different. The architectures are not so compatible. At some level, with your nodes of your semiconductors, there might be certain kinds of know-how that translate how you route and layout, maybe. And here, above a certain layer, there's also going to be commonality in terms of the compute platform, how the quantum circuits are generated. The software layers might be similar, but the actual physical hardware is very different.

It feels like the thing we're talking about is how do you make a qubit? And it's not settled yet. You have an approach that you're very confident in, but there's not a winner in the market.

I mean, we're pretty confident. We're pretty confident in superconducting qubits.

Fair enough. [Laughs] I was just wondering.

It's why we're able to prognosticate 10 years forward, that we see the direction we're going. And to me it's more that there are going to be innovations within that are going to continue to compound over those 10 years, that might make it even more attractive as time goes on. And that's just the nature of technology.

You've got to make decisions on maybe the longest timeline of anyone I've had on the show. It's always the chip people who have the longest timelines. I talk to social media CEOs, and it's like their timeline is like five minutes from now, like, "What are we going to ban today?" That's the timeline. I talk to chip folks, and your timelines are decades. You just casually mentioned a chip you're going to ship in 2033. That's a long time from now. How do you make decisions on that kind of timeline?

There's the near-term stuff, obviously, and the roadmap serves as that guide. That roadmap is constructed so that all these various things do impact that long-term delivery.

Just walk me through: What does the quantum computing roadmap meeting look like? You're all in a conference room, are you at the whiteboard? Paint the picture for me.

Yeah, that is a great question. I mean, we have a number of us who are sitting there. We certainly know that we have certain types of technical foundations that we know that we need to include into these next-generation chips and systems.

For this roadmap, we said, "We know at some point we need to get quantum error correction into our roadmap." And with that technical lead, we know what the requirements are. So first we said, "Okay, let's put it here. Now let's work backward." It says that we need to do this innovation and this innovation by this date, and this other innovation in the software stack or whatever by this date. And then we say, "Oh shoot, we ran out of time. Let's move back a little bit." And so we do a little bit of that planning, because we also want to do it so that we lay out this roadmap with what we often call no-regrets decisions. We don't want to do things that are just for the near term. We want to really pick strategies that give us this long-term path.

It's why we talk about utility scale so much in terms of what we can do with Herons and soon Flamingos. But everything that we want to build on top of what we can do there will translate to what we can do when we get those systems at scale, including error correction. And in terms of the roadmap planning... We're not done, by the way. We have this overall framework for the 10-year roadmap, and then we need to refine. We've got a lot of details still to work on in terms of what are those things that need to be worked on across the software layer, the compiler layer, the control electronics layer, and certainly at the processor layer.

Is there commercial pressure on this? Again, this is a lot of cost at a big public company. Is the CEO of IBM in that room saying, "When's this going to make money? Move it up"?

I think the point is, our mission is to bring useful quantum computing to the world. I've been working in this area for 20 years now. We've never been this close to being able to build something that is driving real value. And so I think when you look at our team, we are all aligned along that mission. We started with just getting it out there in the cloud in terms of building the community. Now, we fundamentally see this as a tool that will alter how users are going to perform computation. And so there has to be, and I expect there to be, value there. And we've seen how the HPC community has progressed, and we've seen how supercomputing has... You can see what's happening with the uptake of AI and everything. We build it, we will build the community around it, we'll drive value.

Let's talk about AI for a second. This is a really good example of this. AI demand is through the roof. The industry is hot. We'll see if the products are long-lasting, but there seems to be real consumer demand for them. And that has all translated into a lot of people wanting a lot of Nvidia H100 chips. It's very narrowly focused on one kind of processor. Do you see quantum systems coming into that zone where we're going to run a lot of AI workloads on them? Like future AI workloads.

What's happened in AI is phenomenal, but we're not at the point where the quantum computer is this commodity item where we're just buying tons of chips. You're not fabricating millions of these chips. But we are going to build this supercomputer based off of quantum computing, which is going to be exquisitely good at certain types of tasks. And so the framework that I actually see is: already you're going to have your AI compute clusters. The way that people run workloads today, I'm sure they are running some parts on their regular computers, on their own laptops, but parts of the job get fed out to the cloud, to their hyperscalers, and some of them are going to use the AI compute nodes.

We see that also for how quantum will feed in. It'll be another part of that overall cloud access landscape where you're going to take a problem, you're going to break it down. You're going to have parts of it that run on classical computing, parts of it that might run on AI, parts of it that will leverage what we call quantum-centric supercomputing. That's the best place to solve that one part of the problem. Then it comes back, and you've got to stitch all that together. So from the IBM perspective, where we often talk about hybrid cloud, that's the hybrid cloud that connects all these pieces together. And differentiation is there in terms of building this quantum-centric supercomputer within there.
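As a sketch of what that stitching can look like in practice, here is a toy hybrid loop in plain Python: a classical optimizer proposes parameters and a "quantum" subroutine estimates a cost. The cost function here is a purely illustrative stand-in for a circuit execution, not IBM's workflow:

```python
import numpy as np

# Toy hybrid quantum-classical loop: the classical side runs gradient descent,
# while quantum_cost stands in for an expectation value a quantum processor
# would estimate. The closed-form cos(theta) is illustrative only.
def quantum_cost(theta: float) -> float:
    return np.cos(theta)  # stand-in for a hardware-estimated <H>(theta)

theta, lr = 0.1, 0.4
for _ in range(50):
    # Finite-difference gradient, as one would do with shot-based estimates.
    grad = (quantum_cost(theta + 0.01) - quantum_cost(theta - 0.01)) / 0.02
    theta -= lr * grad  # classical update step

print(theta, quantum_cost(theta))  # converges near pi, with cost near -1
```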

So your quantum-centric supercomputer is in the cloud. We've talked a lot about superconducting now. You need a data center that's very cold. This does not seem like a thing that's going to happen locally, for me, ever, unless LK-99 is real. This isn't going to happen for anyone in their home outside of an IBM data center for quite some time.

I would say this. So when I was first working in this area and did my PhD in this area (I worked on superconducting qubits), we required these large canisters, these refrigerators, where we needed to wheel up these huge jugs of liquid helium and fill them every three days to keep them cold. Now, that's a physics experiment. I mean, there have already been innovations in cryogenics such that they're turnkey: you plug them in, they stay running, they can run for years and maintain your payloads at the right temperatures. You're paying electricity, obviously, to keep them cold. But we're seeing innovations there, too, in terms of driving infrastructure-scale cryogenics. Honestly, we're going to evolve the data center of the future, just like data centers today have evolved to handle increased compute resources needed. We will work hand in hand with how to build these quantum data centers, and we're already doing that. So we have a quantum data center up in Poughkeepsie, which hosts the majority of our systems, and we're planning on expanding that further.

More here:
IBM quantum computing updates: System Two and Heron - The Verge

Has IBM cracked the code of quantum computing by solving data errors? – Euronews

One of the main issues in developing the machines is they often struggle with data errors. But IBM says its chips could make a difference.

Technology giant IBM has reached a major milestone in its quantum ambitions and has unveiled a new chip and machine that it hopes can help solve problems beyond the scope of traditional computers.

The unveiling at an IBM event in New York on Monday comes as companies and countries race to develop quantum machines, which can carry out large numbers of calculations simultaneously and at incredible speeds.

The new chip has more than 1,000 qubits, the quantum equivalent of the digital bits in an ordinary computer.

One of the main issues in developing the machines is they often struggle with data errors. However, IBM said it has a new method to connect chips inside machines and then connect machines to one another, which, combined with a new error-correction code, could produce even more capable quantum machines in 10 years.

The first machine to use them is called Quantum System Two, which uses three so-called "Heron" chips.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBMs senior vice president and director of research.

"As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems".

IBM did not predict when it could go commercial with quantum machines.

At the annual IBM Quantum Summit, the company also unveiled 10 projects that showed off the potential power of quantum computing, such as for drug discovery.

The scale-up Algorithmiq, which is developing quantum algorithms to solve problems in life sciences, was one of them, and it successfully ran one of the largest-scale error mitigation experiments to date on IBM's hardware. It said the achievement positions it alongside IBM as a front runner in the race to reach quantum utility: a quantum computer's ability to perform reliable computations beyond the capabilities of regular computing methods, for real-world use cases.

"Today represents further validation that Algorithmiq's core error mitigation techniques are powerful and will enable large-scale experiments on specific use cases, leading us well into the quantum utility era for real commercial applications," said Sabrina Maniscalco, co-founder and CEO of Algorithmiq.

"I've dedicated over 20 years of my life to the study of noisy quantum systems, as a professor, and I never thought this type of experiment would be possible so soon," she said in comments to Euronews Next.

Additionally, IBM is pioneering the use of generative AI for quantum code programming through IBM's enterprise AI platform, watsonx.

"Generative AI and quantum computing are both reaching an inflection point, presenting us with the opportunity to use the trusted foundation model framework of watsonx to simplify how quantum algorithms can be built for utility-scale exploration," said Jay Gambetta, Vice President and IBM Fellow at IBM.

"This is a significant step towards broadening how quantum computing can be accessed and put in the hands of users as an instrument for scientific exploration".

Read more here:
Has IBM cracked the code of quantum computing by solving data errors? - Euronews