The Current State of Quantum Computing

Quantum Computing Is Different

Quantum computing uses quantum physics to perform calculations, a fundamentally different approach from conventional semiconductor-based computing. Instead of encoding information as 0 or 1 (current off or current on), it uses quantum bits, called qubits, which can hold 0, 1, or a superposition of both at once.
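As a loose illustration (plain linear algebra, not real quantum hardware), a qubit can be modeled as a normalized two-component complex vector, and a superposition is simply a weighted mix of both basis states:

```python
import numpy as np

# A qubit modeled as a normalized 2-component complex vector.
ket0 = np.array([1, 0], dtype=complex)   # the classical-like state |0>
ket1 = np.array([0, 1], dtype=complex)   # the classical-like state |1>

# An equal superposition: the qubit holds 0 AND 1 at once.
superposition = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes: 50/50 here.
print(np.abs(superposition) ** 2)        # -> [0.5 0.5]
```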

Because of this fundamental difference in the way calculations are performed, quantum computing is not so much an alternative to conventional computing as a complement to it.

Standard computing works sequentially and struggles with certain very complex calculations, such as climate modeling, cryptography, or determining the 3D configuration of complex molecules like proteins. This is precisely the type of calculation at which quantum computing is expected to excel.

So, while our laptops and smartphones are unlikely ever to be quantum computers, quantum machines could revolutionize scientific research.

With the promise that quantum supercomputers could outperform existing machines a thousandfold on such problems, it is no surprise that plenty of research has gone into making them a reality.

But the problem is that creating even a single qubit is technically very difficult. The first difficulty is that quantum computing only works at ultra-low temperatures, often just a fraction of a degree above absolute zero. Only in these conditions do certain materials become superconductors (materials with no electrical resistance). Reaching such temperatures is energy-intensive, expensive, and difficult.

Then, controlling, manipulating, and reading the data in a qubit is also complex, usually involving ultra-precise lasers, atomic microscopes, and sensors. Lastly, any interference can render the qubit useless, so a near-perfect vacuum is needed as well.

While semiconductor chips manipulate matter at scales of only a few atoms, quantum computing looks to handle matter at the scale of individual particles. Notably, a practical quantum computer will require thousands of qubits that stay stable and interact with each other.

A team headed by Professor Gerhard Birkl from the Atoms Photons Quanta research group in the Department of Physics at TU Darmstadt in Germany has just created the largest quantum computer yet.

They have created a quantum computer with 1,000 individually controllable atomic qubits, winning a race in the field against many other scientific teams.

The 1,000 mark is partially symbolic, but it is also around the number expected to be required for meaningful applications of quantum computers. Below that, they are mostly a scientific curiosity and a promising idea, but not much more.

The technique uses optical tweezers, highly focused laser beams able to trap and manipulate atoms individually. Thanks to progress in micro-optics, this is the most promising technique in quantum computing for scaling up to much bigger systems.

As the number of lenslets per square centimeter readily reaches 100,000 and MLA wafers with areas of several 100 square centimeters can be produced, they have enormous potential in terms of scalability, only limited by the available laser power.

Source: Optica

By perfecting the use of such optical tweezers, Prof. Birkl has demonstrated that large quantum computers, with thousands of qubits, can be engineered. This, in turn, will provide an essential tool for other researchers to perform quantum computations.

Many problems physicists struggle with today are linked to particle behavior at the quantum scale, which becomes intractable as soon as more than about 30 particles must be simulated. This is because ordinary computers struggle with the probabilistic behavior of particles and with quantum physics in general: the memory needed to track a quantum state grows exponentially with the number of particles.
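A quick back-of-the-envelope sketch shows why: the full state of n quantum particles takes 2^n complex amplitudes to store, which rapidly outgrows any classical memory (the numbers below assume 16 bytes per amplitude):

```python
# Memory needed to store the full state vector of n spin-1/2 particles
# on a classical machine: 2**n complex amplitudes at 16 bytes each.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} particles: {amplitudes:,} amplitudes ~ {gib:,.2f} GiB")

# 30 particles already needs ~16 GiB; 40 particles needs ~16 TiB.
```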

To solve this issue, the ideal solution would be a quantum simulator in which qubits simulate the behavior of quantum particles. Qubits themselves exhibit the quantum properties of entanglement and superposition, exactly the parts that are so hard to simulate on a normal computer.

While quantum simulators are essentially a special type of quantum computer, the issue so far has been making them versatile enough to simulate many different systems, instead of having to custom-design a simulator for each specific physical question.

Natalia Chepiga, an assistant professor at Delft University of Technology in the Netherlands, and her research group might have found a solution.

In a scientific paper published in Physical Review Letters, she proposes a protocol that creates a fully controllable quantum simulator. It works by using two lasers with different frequencies, or colors, which adds an extra dimension to the calculation. Theoretically, the method could be extended to add more than two dimensions to the simulator's calculations.

This type of quantum simulator could be a major boost to plenty of research efforts at the very edge of our current knowledge, spanning ultra-cold physics (including superconductors), semiconductors, material science, telecommunications, and energy technologies (especially batteries).

Most quantum computing designs focus on qubits: making them easier to manipulate and program, and adding more of them. An alternative is to use quantum digits, or qudits.

A quantum computer with x qubits can perform 2^x calculations. However, a machine with x qudits, with D representing the number of states per qudit, can perform D^x calculations.

This means you can encode the same information in fewer quantum particles when using qudits.

Martin Ringbauer, a quantum physicist at the University of Innsbruck in Austria, in IEEE Spectrum

In simpler terms, the more dimensions D a quantum computing system has, the more exponentially powerful it becomes. In addition to calculating more efficiently than qubits, qudits are also expected to be more reliable and less prone to calculation errors.
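A short illustration of that arithmetic (the numbers are purely for comparison, not tied to any specific machine):

```python
import math

x, D = 10, 16          # 10 quantum particles; 16 states per qudit
print(f"{2 ** x:,}")   # 10 qubits:          2^10  -> 1,024 states
print(f"{D ** x:,}")   # 10 qudits (D = 16): 16^10 -> 1,099,511,627,776 states

# Equivalently, one D=16 qudit carries log2(16) = 4 qubits' worth of state.
print(math.log2(D))    # -> 4.0
```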

So it is big news that a team of researchers led by Andrea Morello at UNSW in Australia has created a 16-dimensional, highly controllable qudit computing system. With D = 16, every qudit added to the system multiplies its computing capacity by a factor of 16.

To achieve this, they used a 123Sb (antimony) donor atom, which was ion-implanted in a silicon nanoelectronic device.

The combined Hilbert space of the atom spans 16 dimensions, and can be accessed using both electric and magnetic control fields. Andrea Morello
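Those 16 dimensions follow directly from the spins involved (a sketch assuming the published values: nuclear spin 7/2 for 123Sb, spin 1/2 for the donor electron):

```python
from fractions import Fraction

nuclear_spin = Fraction(7, 2)            # nuclear spin I of 123Sb
nuclear_levels = 2 * nuclear_spin + 1    # 2I + 1 = 8 nuclear spin levels
electron_levels = 2                      # a spin-1/2 electron has 2 levels

# Combined Hilbert space: 8 x 2 = 16 dimensions.
print(int(nuclear_levels * electron_levels))  # -> 16
```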

This system achieved remarkable results; notably, the nuclear spin already shows gate fidelities exceeding 99% regardless of the drive mechanism. The antimony atom is also an improvement over the previously used 31P (phosphorus), as antimony is a heavier atom and is easier to manipulate.

The team is also improving this technical and scientific achievement further, notably by using isotopically purified 28Si (silicon) to remove the residual 29Si concentration, which should improve the system's reliability (coherence times and gate fidelities).

The field is still very much in its infancy, with whole new concepts still emerging, like usable qudits or programmable quantum simulators.

Combined with the progress in creating 1,000+ qubit systems, this shows that quantum computing will likely be a very important scientific field in the upcoming decades, with tremendous untapped potential.

Currently, research in material science or biochemistry is being boosted by AI, something we discussed in our article Disruptive Industries Coalescing Around a Core Technology: Artificial Intelligence (AI).

But soon, in the next 5-10 years, we might start seeing practical results of quantum computing calculations. The hardware is now moving from thought experiments and lab demonstrators to prototypes of commercial research computers.

The next step will be developing software that can maximize the potential of quantum computing, and starting to produce quantum computers at scale to decrease costs and provide some standardization.

So, in many ways, quantum computing is at the stage where the first commercial computer mainframes were coming out in the 1950s and 1960s before becoming a common business and research tool in the following decades.

While hard to fully predict, we already know a few companies that stand to benefit greatly from quantum computing becoming more widely available:

IBM

International Business Machines Corporation (IBM) was the leading force behind the commercialization of the first mainframe computers. However, it has since fallen behind other tech giants like Apple, TSMC, and NVIDIA.

It is, however, at the forefront of the development of quantum computers. For example, it developed its 127-qubit Eagle quantum computer, which was followed by a 433-qubit system known as Osprey.

And this is now followed by Condor, a 1,121-superconducting-qubit quantum processor based on cross-resonance gate technology, together with Heron, a quantum processor at the very edge of the field.

Finally, IBM released Qiskit 1.0 in February 2024, the most popular quantum computing SDK, with improvements in circuit construction, compilation times, and memory consumption compared to earlier releases.
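To give a flavor of what working with the SDK looks like, here is a minimal sketch using Qiskit's standard circuit API to build and inspect a two-qubit entangled state:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into an equal superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

# Inspect the result: the Bell state (|00> + |11>) / sqrt(2).
print(Statevector.from_instruction(qc))
```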

Looking forward, IBM has already announced its next major goal in anticipation of its current quantum chips 'outgrowing' the currently used infrastructure. This goal, known as 'IBM Quantum System Two', is a modular system that has the potential to support up to 16,632 qubits.

Since its inception, IBM's strength has been in developing ultra-powerful computers, a segment of the market overshadowed by the rise of consumer electronics and standardized chips. The emergence of quantum computing is an occasion for IBM to shine again and become a leader in this important upcoming segment of computing for scientific research and large corporations' computing needs.

Microsoft

Already a leader in conventional cloud services, Microsoft is a pioneer in offering quantum computing cloud services with Azure Quantum. It is entirely possible that most quantum computing in the future will be done by researchers remotely, relying on cloud services like Microsoft's, instead of direct access to their own quantum computer.

This is especially likely as, ultimately, most quantum computing applications will be researched by biochemists, material science experts, climate scientists, and other specialists with no specific background in quantum computing. So relying on dedicated professionals at firms like IBM, Microsoft, or Google to handle the computing part makes more sense than hiring or training people who are strangers to the field.

The service can also offer hybrid computing, mixing quantum computing with traditional cloud-based supercomputing services.

Instead of vertical integration, Microsoft's approach to quantum computing has been to establish partnerships with leaders in the field covering virtually all the candidate technologies for achieving quantum computing, like IonQ (IONQ), Pasqal, Quantinuum, QCI (QUBT), and Rigetti (RGTI).

Quantum computing is not central to Microsoft's business, at least for now. It is nevertheless a central actor in the sector, and it might make for a safer stock pick than directly acquiring shares of its publicly traded quantum computing partners, like QCI or Rigetti.

Google

Google is very active in quantum computing, mostly through its Google Quantum AI lab and Quantum AI campus in Santa Barbara.

Google's quantum computer made history in 2019 when the company claimed to have achieved quantum supremacy with its Sycamore machine, performing a calculation in 200 seconds that would have taken a conventional supercomputer 10,000 years.

But maybe Google's greatest contribution will be in software, an area where it has a much better track record than in hardware (Search, G Suite, Android, etc.). Already, Google's Quantum AI makes available a suite of software designed to assist scientists in developing quantum algorithms.
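That suite includes Cirq, Google's open-source Python framework for writing quantum circuits. A minimal sketch of the same kind of entangling circuit looks like this:

```python
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                     # superposition on the first qubit
    cirq.CNOT(q0, q1),              # entangle the second qubit with it
    cirq.measure(q0, q1, key="m"),  # read out both qubits
)

# Simulate 100 runs: outcomes split roughly 50/50 between 00 and 11.
result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key="m"))
```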

Google may well become one of the companies setting the standards for quantum computing software and programming, giving it a privileged position from which to direct how the field evolves.

Quantinuum (Honeywell)

Quantinuum is the result of the merger of Honeywell Quantum Solutions and Cambridge Quantum (and, as mentioned, a partner in Microsoft's quantum cloud offering).

Quantinuum seems, for now, to focus on segments less explored by other quantum computing systems, notably financial and supply chain-related analyses, through its Quantum Monte Carlo Integration (QMCI) engine, launched in September 2023.

QMCI applies to problems that have no analytic solution, such as pricing financial derivatives or simulating the results of high-energy particle physics experiments, and promises computational advances across business, energy, supply chain logistics, and other sectors.
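For intuition, the classical version of that workload looks like the sketch below: a plain Monte Carlo estimate of a European call option's price under textbook assumptions (all parameters are made up for illustration; QMCI aims to accelerate this class of estimation, not this exact code):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
s0, strike, r, sigma, t = 100.0, 105.0, 0.03, 0.2, 1.0  # illustrative market data
n = 1_000_000                                           # number of random scenarios

# Simulate terminal prices under geometric Brownian motion.
z = rng.standard_normal(n)
s_t = s0 * np.exp((r - 0.5 * sigma ** 2) * t + sigma * np.sqrt(t) * z)

# Price = discounted average payoff of the call option.
payoff = np.maximum(s_t - strike, 0.0)
price = np.exp(-r * t) * payoff.mean()
print(f"Monte Carlo estimate of the option price: {price:.2f}")
```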

As with Microsoft, quantum computing is not the central part of Honeywell's business, which is more centered on products in aerospace, automation, and specialty chemicals & materials.

However, considering every single one of these business segments could benefit from quantum computing, it is not hard to see the business case for Honeywell to get involved.

This makes Honeywell both a provider of quantum computing services and one of the companies that could benefit from applying quantum computers to real-life business cases, something the integration of Quantinuum into the group should help it do at a quicker pace than its industrial competitors.

Intel

Intel is a major chip producer and seems intent on leveraging this strength in the quantum computing arena.

It recently released Tunnel Falls, the most advanced silicon spin qubit chip to date. What is remarkable is that it is not a one-off prototype but a chip built at scale, with a 95% yield rate across the wafer and good voltage uniformity. This opens the way to mass production of quantum computing chips, something so far elusive in a nascent and quickly changing industry.

Faithful to its roots, Intel is also developing the software needed to utilize its chips, with the release of the Intel Quantum SDK. This gives programmers guidelines for developing quantum software compatible with Intel's quantum chip design; this kind of hardware-software pairing has historically been a very strong and profitable business moat for Intel's conventional chip business.

The arrival of scalable quantum chip manufacturing could be as revolutionary for the industry as any other more technical scientific breakthrough, bringing down costs, and setting common programming standards and chip architectures.

Intel is a company that knows from experience how strong of a force this can be in the computing industry, still riding on the tail of its innovations and associated patents from the 1960s onward.

The quantum computing sector is still very young. It has so far been dominated by large tech corporations with pockets deep enough to pour billions of dollars into this sort of fundamental research.

However, many other smaller companies are also active in the field, some partnering with said giants to deploy their technology.

It can be a rather difficult task for non-specialist investors to understand the intricacies of the different quantum computing technologies, and harder still to guess which will be commercially successful.

So, while direct investment in small quantum computing startups is an option, another is to rely on an ETF to get exposure to the sector while diversifying at a lower cost.

Defiance Quantum ETF

The Defiance Quantum ETF (QTUM) contains 69 different stocks related to quantum computing in its holdings, including quantum computer & chip developers, as well as suppliers of cooling systems, lasers, software, and other technology used in quantum computers or quantum chip production.

In this quickly evolving field, most investors, even those familiar with the semiconductor industry, will probably benefit from a degree of diversification. This can be achieved either by betting on individual tech giants making the right partnership choices or by holding a wide array of stocks, something often done more efficiently through a dedicated ETF.


