Archive for the ‘Quantum Computer’ Category

PsiQuantum’s Path to 1 Million Qubits by the Middle of the Decade – HPCwire

PsiQuantum, founded in 2016 by four researchers with roots at Bristol University, Stanford University, and York University, is one of a few quantum computing startups that has kept a moderately low PR profile. (That is, if you disregard the roughly $700 million in funding it has attracted.) The main reason is that PsiQuantum has eschewed the clamorous public chase for NISQ (noisy intermediate-scale quantum) computers and set out to develop a million-qubit system the company says will deliver big gains on big problems as soon as it arrives.

When will that be?

PsiQuantum says it will have all the manufacturing processes in place by the middle of the decade, and it is working closely with GlobalFoundries (GF) to turn its vision into reality. The generous size of its funding suggests many think it will succeed. PsiQuantum is betting on a photonics-based approach called fusion-based quantum computing (described in the papers listed at the end of the article) that relies mostly on well-understood optical technology but requires extremely precise manufacturing tolerances to scale up. It also relies on managing individual photons, something that has proven difficult for others.

Here's the company's basic contention:

Success in quantum computing will require large, fault-tolerant systems, and the current preoccupation with NISQ computers is an interesting but ultimately mistaken path. The most effective and fastest route to practical quantum computing will require leveraging (and innovating on) existing semiconductor manufacturing processes and networking thousands of quantum chips together to reach the million-qubit threshold that's widely regarded as necessary to run game-changing applications in chemistry, banking, and other sectors.

It's not that incrementalism is bad; in fact, it's necessary. But it's not well served when focused on delivering NISQ systems, argues Peter Shadbolt, one of PsiQuantum's founders and its chief scientific officer.

"Conventional supercomputers are already really good. You've got to do some kind of step change; you can't increment your way [forward], and especially you can't increment with five qubits, 10 qubits, 20 qubits, 50 qubits to a million. That is not a good strategy. But it's also not true to say that we're planning to leap from zero to a million," said Shadbolt. "We have a whole chain of incrementally larger and larger systems that we're building along the way. Those allow us to validate the control electronics, the systems integration, the cryogenics, the networking, etc. But we're not spending time and energy trying to dress those up as something that they're not. We're not having to take those things and try to desperately extract computational value from something that doesn't have any computational value. We're able to use those intermediate systems for our own learnings and for our own development."

That's a much different approach from that of the majority of quantum computing hopefuls. Shadbolt suggests the broad message about the need to push beyond NISQ dogma is starting to take hold.

"There is a change that is happening now, which is that people are starting to program for error-corrected quantum computers, as opposed to programming for NISQ computers. That's a welcome change, and that's happening across the whole space. If you're programming for NISQ computers, you very rapidly get deeply entangled, if you'll forgive the pun, with the hardware. You start looking under the hood, and you start trying to find shortcuts to deal with the fact that you have so few gates at your disposal. So, programming NISQ computers is a fascinating, intellectually stimulating activity, I've done it myself, but it rapidly becomes sort of siloed and you have to pick a winner," said Shadbolt.

"With fault tolerance, once you start to accept that you're going to need error correction, then you can start programming in a fault-tolerant gate set, which is hardware agnostic, and it's much more straightforward to deal with. There are also some surprising characteristics, which mean that the optimizations that you make to algorithms in a fault-tolerant regime are, in many cases, the diametric opposite of the optimizations that you would make in the NISQ regime. It really takes a different approach, but it's very welcome that the whole industry is moving in that direction and spending less time on these kinds of myopic, narrow efforts," he said.

That sounds a bit harsh. PsiQuantum is no doubt benefiting from the manifold efforts by the young quantum computing ecosystem to tout advances and build traction by promoting NISQ use cases. There's an old business axiom that a little hype is often a necessary lubricant to accelerate development of young industries; quantum computing certainly has its share. A bigger question is whether PsiQuantum will beat rivals to the end game. IBM has laid out a detailed roadmap and said 2023 is when it will start delivering quantum advantage, using a 1,000-qubit system, with plans for eventual million-qubit systems. Intel has trumpeted its CMOS strength to scale up manufacturing of its quantum dot qubits. D-Wave has been selling its quantum annealing systems to commercial and government customers for years.

It's really not yet clear which of the qubit technologies (semiconductor-based superconducting, trapped ions, neutral atoms, photonics, or something else) will prevail, and for which applications. What's not ambiguous is PsiQuantum's go-big-or-go-home strategy. Its photonics approach, the company argues, has distinct advantages in manufacturability and scalability, operating environment (less frigid), ease of networking, and error correction. Shadbolt recently talked with HPCwire about the company's approach, technology, and progress.

What is fusion-based quantum computing?

Broadly, PsiQuantum uses a form of linear optical quantum computing in which individual photons are used as qubits. Over the past year and a half, the previously stealthy PsiQuantum has issued several papers describing the approach while keeping many details close to the vest (the papers are listed at the end of this article). The computation flow is to generate single photons and entangle them. PsiQuantum uses dual-rail entangling/encoding for photons. The entangled photons are the qubits and are grouped into what PsiQuantum calls resource states (a group of qubits, if you will). Fusion measurements (more below) act as gates. Shadbolt says the operations can be mapped to a standard gate set to achieve universal, error-corrected quantum computing.
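To make the dual-rail idea concrete, here is a minimal sketch using nothing beyond standard linear-optics bookkeeping (it is not PsiQuantum code): a single photon occupying one of two waveguide modes encodes the qubit, and a 50:50 beam splitter rotates it into an equal superposition of the two rails.

    import numpy as np

    # Minimal sketch of dual-rail encoding (illustrative only, not PsiQuantum code):
    # one qubit is a single photon shared between two waveguide modes.
    # |0> = photon in mode A, |1> = photon in mode B.
    ket0 = np.array([1.0, 0.0])   # photon in mode A
    ket1 = np.array([0.0, 1.0])   # photon in mode B

    # A 50:50 beam splitter acting on the two modes behaves like a Hadamard-style
    # rotation, turning |0> into an equal superposition of the two rails.
    beam_splitter = (1 / np.sqrt(2)) * np.array([[1, 1],
                                                 [1, -1]])

    state = beam_splitter @ ket0
    probs = np.abs(state) ** 2    # detection probability in each rail
    print(state)                  # [0.707..., 0.707...]
    print(probs)                  # [0.5, 0.5] -> photon equally likely in either mode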

On-chip components carry out the process. It all sounds quite exotic, in part because it differs from the more widely used matter-based qubit technologies. The figure below, taken from the PsiQuantum paper "Fusion-based quantum computation" issued about a year ago, roughly describes the process.

Digging into the details is best served by reading the papers and the company has archived videos exploring its approach on its website. The video below is a good brief summation by Mercedes Gimeno-Segovia, vice president of quantum architecture at PsiQuantum.

Shadbolt also briefly described fusion-based quantum computation (FBQC).

"Once you've got single photons, you need to build what we refer to as seed states. Those are pretty small entangled states and can be constructed, again, using linear optics. So, you take some single photons and send them into an interferometer and, together with single-photon detection, you can probabilistically generate small entangled states. You can then multiplex those, and basically the task is to get as fast as possible to a large enough, complex enough, appropriately structured resource state which is ready to then be acted upon by a fusion network. That's it. You want to kill the photon as fast as possible. You don't want photons living for a long time if you can avoid it. That's pretty much it," said Shadbolt.

"The fusion operators are the smallest, simplest piece of the machine. The multiplexed single-photon sources are the biggest, most expensive piece. Everything in the middle is kind of the secret sauce of our architecture; some of that we've put out in that paper and you can see kind of how that works," he said. (At the risk of overkill, another brief description of the system from PsiQuantum is presented at the end of the article.)
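The multiplexing Shadbolt describes can be illustrated with a toy calculation. If each heralded attempt to produce a seed state succeeds with probability p, then multiplexing N attempts and switching out the first success gives a per-cycle success probability of 1 - (1 - p)^N. The numbers below are purely illustrative, not PsiQuantum's.

    # Toy illustration of multiplexing probabilistic seed-state generation
    # (assumed numbers, not PsiQuantum's): run many attempts in parallel and
    # switch out the first one that succeeds.
    def multiplexed_success(p_single: float, n_attempts: int) -> float:
        """Probability that at least one of n independent attempts succeeds."""
        return 1.0 - (1.0 - p_single) ** n_attempts

    p = 0.25  # assumed per-attempt heralded success probability
    for n in (1, 4, 16, 64):
        print(f"{n:3d} multiplexed attempts -> P(success) = {multiplexed_success(p, n):.4f}")
    # 1 -> 0.2500, 4 -> 0.6836, 16 -> 0.9900, 64 -> ~1.0000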

One important FBQC advantage, says PsiQuantum, is that the shallow depth of optical circuits makes error correction easier. The small entangled states fueling the computation are referred to as resource states. Importantly, their size is independent of the code distance used or the computation being performed. This allows them to be generated by a constant number of operations. Since the resource states will be measured immediately after they are created, the total depth of operations is also constant. As a result, errors in the resource states are bounded, which is important for fault tolerance.
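A rough way to see why constant depth matters: if each of the D operations used to prepare and measure a resource state fails with probability p, the error per resource state is at most 1 - (1 - p)^D, roughly D times p, and it does not grow with the code distance being targeted. The figures below are assumed for illustration only.

    # Rough illustration (assumed numbers): with a constant operation depth D and
    # per-operation error rate p, the error accumulated in a resource state is
    # bounded by 1 - (1 - p)**D, independent of the code distance.
    p, D = 1e-3, 10                   # assumed per-operation error rate and constant depth
    bound = 1 - (1 - p) ** D
    print(f"error per resource state <= {bound:.4%}")   # ~1%, roughly D * p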

Some of the differences between PsiQuantum's FBQC design and the more familiar MBQC (measurement-based quantum computing) paradigm are shown below.

Another advantage is the operating environment.

"Nothing about photons themselves requires cryogenic operation. You can do very high-fidelity manipulation and generation of qubits at room temperature, and in fact, you can even detect single photons at room temperature just fine. The efficiency of room-temperature single-photon detectors, however, is not good enough for fault tolerance. These room-temperature detectors are based on pretty complex semiconductor devices, avalanche photodiodes, and there's no physical reason why you couldn't push those to the necessary efficiency, but it looks really difficult [and] people have been trying for a very long time," said Shadbolt.

"We use a superconducting single-photon detector, which can achieve the necessary efficiencies without a ton of development. It's worth noting those detectors run in the ballpark of 4 kelvin, so liquid-helium temperature, which is still very cold, but it's nowhere near as cold as the millikelvin temperatures required for superconducting qubits or some of the competing technologies," said Shadbolt.

This has important implications for control-circuit placement, as well as for the reduced power needed to maintain the 4-kelvin environment.

There's a lot to absorb here, and it's best done directly from the papers. PsiQuantum, like many other quantum start-ups, was founded by researchers who were already digging into the quantum computing space, and they've shown that PsiQuantum's FBQC flavor of linear optical quantum computing will work. While at Bristol, Shadbolt was involved in the first demonstration of running a Variational Quantum Eigensolver (VQE) on a photonic chip.
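For readers unfamiliar with VQE, the sketch below is a purely classical NumPy emulation of the idea, not the photonic-chip experiment: a parameterized trial state is varied until the expectation value of a toy Hamiltonian is minimized, which is how VQE estimates ground-state energies.

    import numpy as np

    # Classical toy emulation of the VQE idea (not the photonic-chip experiment):
    # vary a trial state |psi(theta)> = Ry(theta)|0> to minimize <psi|H|psi>.
    X = np.array([[0, 1], [1, 0]], dtype=float)
    Z = np.array([[1, 0], [0, -1]], dtype=float)
    H = 0.5 * Z + 0.3 * X        # toy Hamiltonian; exact ground energy is -sqrt(0.34)

    def energy(theta: float) -> float:
        psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])   # Ry(theta)|0>
        return float(psi @ H @ psi)

    thetas = np.linspace(0, 2 * np.pi, 1000)
    energies = [energy(t) for t in thetas]
    best = int(np.argmin(energies))
    print(f"VQE estimate: {energies[best]:.4f}, exact: {-np.sqrt(0.34):.4f}")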

The biggest challenges for PsiQuantum, he suggests, are developing manufacturing techniques and system architecture around well-known optical technology. The company argues having a Tier-1 fab partner such as GlobalFoundries is decisive.

"You can go into infinite detail on the architecture and how all the bits and pieces go together. But the point of optical quantum computing is that the network of components is pretty complicated (all sorts of modules and structures and multiplexing strategies, and resource-state generation schemes and interferometers, and so on), but they're all just made out of beam splitters, and switches, and single-photon sources and detectors. It's kind of like a conventional CPU: you can go in with a microscope and examine the structure of the cache and the ALU and whatever, but underneath it's all just transistors. It's the same kind of story here. The limiting factor in our development is the semiconductor process enablement. The thesis has always been that if you tried to build a quantum computer anywhere other than a high-volume semiconductor manufacturing line, your quantum computer isn't going to work," he said.

"Any quantum computer needs millions of qubits. Millions of qubits don't fit on a single chip. So you're talking about heaps of chips, probably billions of components realistically, and they all need to work, and they all need to work better than the state of the art. That brings us to the progress, which is, again, rearranging those various components into ever more efficient and complex networks in pretty close analogy with CPU architecture. It's a very key part of our IP, but it's not rate limiting, and it's not terribly expensive to change the network of components on the chip once we've got the manufacturing process. We're continuously moving the needle on that architecture development, and we've improved these architectures in terms of their tolerance to loss by more than 150x, [actually] well beyond that. We've reduced the size of the machine, purely through architectural improvements, by many, many orders of magnitude.

"The big, expensive, slow pieces of the development are in being able to build high-quality components at GlobalFoundries in New York. What we've already done there is to put single-photon sources and superconducting-nanowire single-photon detectors into that manufacturing process engine. We can build wafers, 300-millimeter wafers, with tens of thousands of components on the wafer, including a full silicon photonics PDK (process design kit) and also a very high-performing single-photon detector. That's real progress that brings us closer to being able to build a quantum computer, because that lets us build millions to billions of components."

Shadbolt says real systems will quickly follow development of the manufacturing process. PsiQuantum, like everyone in the quantum computing community, is collaborating closely with potential users. Roughly a week ago, it issued a joint paper with Mercedes-Benz discussing quantum computer simulation of Li-ion chemistry. If the PsiQuantum-GlobalFoundries process is ready around 2025, can a million-qubit system (100 logical qubits) be far behind?

Shadbolt would only say that things will happen quickly once the process has been fully developed. He noted there are three ways to make money with a quantum computer: sell machines, sell time, and sell solutions that come from the machine. "I think we're exploring all of the above," he said.

"Our customers, which is a growing list at this point (pharmaceutical companies, car companies, materials companies, big banks), are coming to us to understand what a quantum computer can do for them. To understand that, what we are doing, principally, is fault-tolerant resource counting," said Shadbolt. "So that means we're taking the algorithm, or taking the problem the customer has, working with their technical teams to look under the hood, and understanding the technical requirements of solving that problem. We are turning that into the quantum algorithms and subroutines that are appropriate. We're compiling that for the fault-tolerant gate set that will run on top of that fusion network, which, by the way, is a completely vanilla, textbook fault-tolerant gate set."

Stay tuned.

PsiQuantum Papers

Fusion-based quantum computation, https://arxiv.org/abs/2101.09310

Creation of Entangled Photonic States Using Linear Optics, https://arxiv.org/abs/2106.13825

Interleaving: Modular architectures for fault-tolerant photonic quantum computing, https://arxiv.org/abs/2103.08612

Description of PsiQuantum's Fusion-Based System from the Interleaving Paper

Useful fault-tolerant quantum computers require very large numbers of physical qubits. Quantum computers are often designed as arrays of static qubits executing gates and measurements. Photonic qubits require a different approach. In photonic fusion-based quantum computing (FBQC), the main hardware components are resource-state generators (RSGs) and fusion devices connected via waveguides and switches. RSGs produce small entangled states of a few photonic qubits, whereas fusion devices perform entangling measurements between different resource states, thereby executing computations. In addition, low-loss photonic delays such as optical fiber can be used as fixed-time quantum memories simultaneously storing thousands of photonic qubits.

Here, we present a modular architecture for FBQC in which these components are combined to form interleaving modules consisting of one RSG with its associated fusion devices and a few fiber delays. Exploiting the multiplicative power of delays, each module can add thousands of physical qubits to the computational Hilbert space. Networks of modules are universal fault-tolerant quantum computers, which we demonstrate using surface codes and lattice surgery as a guiding example. Our numerical analysis shows that in a network of modules containing 1-km-long fiber delays, each RSG can generate four logical distance-35 surface-code qubits while tolerating photon loss rates above 2% in addition to the fiber-delay loss. We illustrate how the combination of interleaving with further uses of non-local fiber connections can reduce the cost of logical operations and facilitate the implementation of unconventional geometries such as periodic boundaries or stellated surface codes. Interleaving applies beyond purely optical architectures, and can also turn many small disconnected matter-qubit devices with transduction to photons into a large-scale quantum computer.
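A back-of-envelope calculation shows where "thousands of photonic qubits" comes from. Light in fiber travels at roughly c/1.5, so a 1-km delay holds about 5 microseconds of signal; at an assumed resource-state clock of 1 GHz (the clock rate here is an assumption for illustration, not a figure from the paper), that is on the order of 5,000 time bins in flight at once.

    # Back-of-envelope sketch (clock rate assumed, not taken from the paper):
    # how many photonic time bins a 1 km fiber delay holds "in flight" at once.
    c = 3.0e8                 # speed of light in vacuum, m/s
    n_fiber = 1.5             # approximate refractive index of optical fiber
    length_m = 1_000          # 1 km delay line, as in the interleaving paper
    clock_hz = 1.0e9          # ASSUMED resource-state generation clock rate

    delay_s = length_m / (c / n_fiber)      # ~5 microseconds of storage
    bins_in_flight = delay_s * clock_hz     # time bins simultaneously stored
    print(f"delay = {delay_s * 1e6:.1f} us, ~{bins_in_flight:,.0f} time bins in flight")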

Slides/Figures from various PsiQuantum papers and public presentations

Continued here:
PsiQuantum's Path to 1 Million Qubits by the Middle of the Decade - HPCwire

Preparation Is Key: How America Can Get Ahead of Q-Day – The National Interest Online

Referring to Q-Day, the day when quantum computers are powerful enough to break our current encryption, Arthur Herman, senior fellow at the Hudson Institute, once wrote the following: "Q-Day is the term some experts use to describe when large-scale quantum computers are able to factorize the large prime numbers that underlie our public encryption systems..." Ironically, the phrase Q-Day was also used for the testing of the first atom bomb in 1945.

Today, most of the world's digital communications rely on standardized encryption to protect against attacks from classical computers (the computers we currently use). This encryption, sometimes referred to as public-key encryption, PKI (Public Key Infrastructure), RSA (Rivest-Shamir-Adleman), or ECC (Elliptic Curve Cryptography), rests on mathematical problems that are easy to set up but hard to reverse; for RSA, that problem is factoring a large number (ECC relies on a related hard problem, the elliptic-curve discrete logarithm, which quantum computers also break). This mathematics is all that stands between our data and our adversaries. For example, the numbers three and five multiply into fifteen, and thus they are its factors. Factoring means finding the numbers that multiply together into a much larger number. For large numbers this is difficult, and at the key sizes used in practice effectively impossible, for classical computers; for example, 14,378,234 has factors of 806 and 17,839. So far, public-key encryption schemes have done an adequate job of protecting our data and communications, and we have also been able to increase the size of the numbers to be factored (that is, the key sizes) to stay ahead of the curve.
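The article's example is easy to verify with a few lines of classical code, which also shows why the scheme works: trial division factors an eight-digit number instantly, but its running time grows so quickly with the size of the number that RSA-scale keys (hundreds of digits) are far out of reach for classical machines.

    # Quick classical check of the article's example using trial division.
    # Easy at this size; infeasible at RSA key sizes (hundreds of digits),
    # which is exactly the gap a cryptographically relevant quantum computer closes.
    def smallest_factor(n: int) -> int:
        d = 2
        while d * d <= n:
            if n % d == 0:
                return d
            d += 1
        return n  # n is prime

    n = 14_378_234
    f = smallest_factor(n)
    print(f, n // f)                 # 2 7189117
    print(806 * 17_839 == n)         # True: the factor pair quoted in the article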

However, quantum computers are good at factoring large numbers. Quantum computers operate by using quantum properties such as superposition, entanglement, and interference, which allow the space they compute over to grow exponentially with the number of qubits. As a result, quantum computers have the power to solve the factoring problem and crack this encryption. The fact is that we know mathematically, from Peter Shor's algorithm, that sufficiently large quantum computers will break our current public-key encryption unless we upgrade. What does that mean for the United States?
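Shor's algorithm needs the quantum computer only for one step, period finding; the rest is classical number theory. The toy example below, with the period found by brute force (feasible only because N = 15 is tiny), shows how a period turns into factors.

    from math import gcd

    # Toy illustration of the classical reduction inside Shor's algorithm:
    # the quantum step finds the period r of a^x mod N; the factors then come
    # from gcd(a^(r/2) +/- 1, N). Here the period is brute-forced, which is
    # only possible because N = 15 is tiny.
    N, a = 15, 7

    r = 1
    while pow(a, r, N) != 1:
        r += 1                       # stand-in for the quantum period-finding step

    assert r % 2 == 0                # r = 4 for this choice of a and N
    p = gcd(pow(a, r // 2) - 1, N)   # gcd(48, 15) = 3
    q = gcd(pow(a, r // 2) + 1, N)   # gcd(50, 15) = 5
    print(r, p, q)                   # 4 3 5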

How Can Q-Day Happen?

When a sufficiently powerful quantum computer comes online (such machines are referred to as cryptographically relevant quantum computers, or CRQCs), whoever has access to it will be able to decrypt data that was protected by today's public-key encryption. As an example, suppose an attacker has stolen encrypted military secrets and stored them on their local servers (a practice referred to as "steal now, decrypt later," or SNDL, that we know is happening today), and those secrets are protected only by public-key encryption based on the factoring problem discussed above. Once a CRQC exists, the attacker will be able to decrypt all of that stored data and use it for whatever purposes they choose. Additionally, the same attacker could use the CRQC against communications currently taking place over the internet or the airwaves. The same powerful CRQC could be used to eavesdrop on or steal data from radio transmissions, fiber transmissions, or any other communications that use PKI. So, if the attacker has listening devices in a variety of geographic areas or regions, they could effectively unlock any data in transit moving over those communications lines.

When Will Q-Day Happen?

No one knows the exact date when Q-Day will happen. Some predict it will be around 2030, some say it will never happen, and others estimate that we could have a CRQC in two to three years. We know that nation-states are investing billions of dollars in quantum computing, and it is estimated that China is spending upwards of $15 billion to build a quantum computer just to crack PKI. This effort, utilizing over 1,000 programmers and scientists, is formidable and should not be underestimated.

Our own government has been concerned and is now acting to mitigate the threat and consequences of a CRQC. The National Institute of Standards and Technology (NIST) has been studying and finalizing quantum-resistant algorithms. Recently the White House issued a Memorandum on Improving the Cybersecurity of National Security, Department of Defense, and Intelligence Community Systems, which mandates that "Within 180 days of the date of this memorandum [issued Jan. 19, 2022], agencies shall identify any instances of encryption not in compliance with NSA-approved Quantum Resistant Algorithms..." Additionally, the United States Innovation and Competition Act of 2021 allocates over $12 billion and contains specific language and funding for quantum cryptography and post-quantum classical cryptography.

Do not be fooled by what you see in the news or in public-facing articles. You can be sure that a nation-state attacker is not going to announce that they have a CRQC capable of breaking PKI. Their incentive is to stay underground, harvesting as much data as they can before anyone notices.

Possible Q-Day Scenarios

So, what could happen if a U.S. adversary fully utilized a powerful quantum computer? We could see massive amounts of data stolen and decrypted, financial system collapses, energy grid hacks, and even control seized over major military systems. The fact is that we are all leaving ever-increasing digital footprints, and every company and government agency on the planet uses increasing amounts of digital capabilities and assets. Everything we do has a digital trace, and all of that data now flows protected only by current standard encryption. Imagine if all that data were available to whoever had access to a CRQC. The power they would have would be so great that it is hard to imagine the damage that would be done and the global power that would be held.

Arthur Herman (mentioned above) conducted two formidable studies on what a single, successful quantum computing attack would do to our banking systems and to a major cryptocurrency. By his estimates, a single quantum-computer attack on the banking system would take down Fedwire and cause $2 trillion of damage in a very short period of time, while a similar attack on a cryptocurrency like bitcoin would cause a 90 percent drop in price and start a three-year recession in the United States. Both studies were backed by econometric models using over 18,000 data points to predict these cascading failures.

Another disastrous effect could be that an attacker with a CRQC takes control of systems that rely on standard PKI. By hacking communications, the attacker could disrupt data flows and take control of a device, crashing it into the ground or even using it against an enemy. Think of the number of autonomous vehicles we now use in both civilian and military settings. Autonomous devices such as passenger cars, military drones, ships, planes, and robots could be hacked by a CRQC and shut down, or commandeered to perform activities never intended by their current users or owners.

In their novel 2034: A Novel of the Next World War, Admiral James Stavridis and Elliot Ackerman portray a scenario in which China hacks into U.S. military systems and shuts down the Global Positioning System, weapon systems, and communications. This renders the U.S. military helpless, and Chinese submarines destroy the U.S. Navy's entire fleet in the South China Sea with uncontested torpedoes. In the book, the U.S. military's assets cannot communicate, and we are sitting ducks, allowing China to inflict significant destruction on the mainland United States. While the book does not specifically name a CRQC as the tool of destruction, it is entirely plausible that a quantum computer powerful enough to crack all encryption and communications could create this scenario.

Preparation Starts Now

So, with the above near-term threat, what can we do now to protect ourselves against such disasters?

First, I recommend that leadership, whether in government, commercial, or other organizations, begin to look at existing cryptographic systems to understand where digital vulnerabilities exist. In many cases, for large enterprises and government agencies, the cryptographic upgrade from PKI to post-quantum cryptography (PQC) could take years. PQC refers to software-based cryptographic algorithms and systems that are resistant to quantum attacks; because PQC rests on mathematical problems that quantum computers are not known to solve efficiently, unlike the factoring behind standard PKI, communications and data protected by it would remain resilient even against CRQCs. This move from PKI to PQC will be the largest upgrade cycle in computer history, and all public-key encryption will need to change to provide a completely quantum-resilient ecosystem. Data in transit and at rest, and all devices, will need to be upgraded to PQC, which will reduce or mitigate the ability of quantum computers to crack encryption. Enterprises and government agencies can start now by testing PQC to understand how it works in their environments. Companies today provide PQC that can be tested in an enterprise or via the cloud. It is vital that all company leaders start the process of understanding how to move to a PQC world; the future of U.S. national security depends on it.
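As a concrete first step in the inventory recommended above, an organization can scan its certificates and flag public keys that a CRQC would break. The sketch below is one possible approach, not a prescribed tool; it assumes the third-party Python "cryptography" package is installed, and the certificate file path is hypothetical.

    # Hedged sketch of a crypto-inventory step: flag certificates whose public keys
    # rely on RSA or elliptic curves, both of which fall to Shor's algorithm on a CRQC.
    # Assumes the third-party 'cryptography' package; the file path is hypothetical.
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa, ec

    def classify_certificate(pem_bytes: bytes) -> str:
        cert = x509.load_pem_x509_certificate(pem_bytes)
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            return f"QUANTUM-VULNERABLE: RSA-{key.key_size}"
        if isinstance(key, ec.EllipticCurvePublicKey):
            return f"QUANTUM-VULNERABLE: ECC ({key.curve.name})"
        return "review manually (possibly PQC or symmetric-only)"

    with open("server-cert.pem", "rb") as f:   # hypothetical path
        print(classify_certificate(f.read()))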

Skip Sanzeri has been an entrepreneur since 1986 and currently is the Founder, Board Chair, CRO, and COO at QuSecure, a top post-quantum cybersecurity company using post-quantum cryptography and quantum key distribution to help secure the U.S. military, government, and commercial businesses. He is also Founder and Board Chair of Quantum Thought, a leading venture studio focused on quantum computing applications, and Founder and Partner at Multiverse Capital. Skip is a co-author of Quantum Design Sprint: A Workbook for Designing a Quantum Computing Application and Disruptive Business Model.

Read more:
Preparation Is Key: How America Can Get Ahead of Q-Day - The National Interest Online

Global Quantum Computing Market Analysis Report 2022-2027: Assessment of Technology, Companies/Organizations, R&D Efforts, and Potential Solutions -…

DUBLIN--(BUSINESS WIRE)--The "Quantum Computing Market by Technology, Infrastructure, Services, and Industry Verticals 2022 - 2027" report has been added to ResearchAndMarkets.com's offering.

This report assesses the technology, companies/organizations, R&D efforts, and potential solutions facilitated by quantum computing.

The report provides global and regional forecasts as well as the outlook for quantum computing impact on infrastructure including hardware, software, applications, and services from 2022 to 2027. This includes the quantum computing market across major industry verticals.

Quantum Computing Industry Impact

The implications for data processing, communications, digital commerce and security, and the internet as a whole cannot be overstated as quantum computing is poised to radically transform the ICT sector. In addition, quantum computing will disrupt entire industries ranging from government and defense to logistics and manufacturing. No industry vertical will be immune to the potential impact of quantum computing. Every industry must pay great attention to technology developments, implementation, integration, and market impacts.

Quantum Computing Technology Development

While there is great promise for quantum computing, it remains largely in the research and development (R&D) stage as companies, universities, and research organizations seek to solve some of the practical problems blocking commercialization, such as how to keep a qubit stable. The stability problem arises because matter is always in motion, even if that motion is merely a small vibration; when qubits are disturbed by such noise, a condition referred to as decoherence occurs, rendering computing results unpredictable or even useless. One of the potential solutions is to use super-cooling methods such as cryogenics.

Some say there is a need to reach absolute zero (the temperature at which all molecular motion ceases), but that is a theoretical temperature that is practically impossible to reach and maintain, requiring enormous amounts of energy. There are some room-temperature quantum computers in R&D using photonic qubits, but nothing is yet scalable. Some experts say that if the qubit energy level is high enough, cryogenic type cooling is not a requirement.

Alternatives include trapped-ion quantum computing and other methods, most of which still rely on super-cooled, small-scale demonstration platforms. There are additional issues involved in implementing and operating quantum computing. In terms of maintenance, most quantum systems must be kept at extremely low temperatures to keep the qubits stable, which makes them difficult to work with and requires expensive, energy-consuming support equipment.

Once these issues are overcome, we anticipate that quantum computing will become more mainstream for solving specific types of problems. However, there will remain general-purpose computing problems that must be solved with classical computing. In fact, we anticipate development of solutions that involve quantum and classical CPUs on the same computing platform, which will be capable of solving combined general purpose and use case-specific computation problems.

These next-generation computing systems will provide the best of both worlds, which will be high-speed, general-purpose computing combined with use case-specific ultra-performance for certain tasks that will remain outside the range of binary computation for the foreseeable future.

Select Report Findings:

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction

2.1 Understanding Quantum Computing

2.2 Quantum Computer Types

2.2.1 Quantum Annealer

2.2.2 Analog Quantum

2.2.3 Universal Quantum

2.3 Quantum Computing vs. Classical Computing

2.3.1 Will Quantum replace Classical Computing?

2.3.2 Physical Qubits vs. Logical Qubits

2.4 Quantum Computing Development Timeline

2.5 Quantum Computing Market Factors

2.6 Quantum Computing Development Progress

2.6.1 Increasing the Number of Qubits

2.6.2 Developing New Types of Qubits

2.7 Quantum Computing Patent Analysis

2.8 Quantum Computing Regulatory Analysis

2.9 Quantum Computing Disruption and Company Readiness

3.0 Technology and Market Analysis

3.1 Quantum Computing State of the Industry

3.2 Quantum Computing Technology Stack

3.3 Quantum Computing and Artificial Intelligence

3.4 Quantum Neurons

3.5 Quantum Computing and Big Data

3.6 Linear Optical Quantum Computing

3.7 Quantum Computing Business Model

3.8 Quantum Software Platform

3.9 Application Areas

3.10 Emerging Revenue Sectors

3.11 Quantum Computing Investment Analysis

3.12 Quantum Computing Initiatives by Country

4.0 Quantum Computing Drivers and Challenges

4.1 Quantum Computing Market Dynamics

4.2 Quantum Computing Market Drivers

4.2.1 Growing Adoption in Aerospace and Defense Sectors

4.2.2 Growing investment of Governments

4.2.3 Emergence of Advanced Applications

4.3 Quantum Computing Market Challenges

5.0 Quantum Computing Use Cases

5.1 Quantum Computing in Pharmaceuticals

5.2 Applying Quantum Technology to Financial Problems

5.3 Accelerate Autonomous Vehicles with Quantum AI

5.4 Car Manufacturers using Quantum Computing

5.5 Accelerating Advanced Computing for NASA Missions

6.0 Quantum Computing Value Chain Analysis

6.1 Quantum Computing Value Chain Structure

6.2 Quantum Computing Competitive Analysis

6.2.1 Leading Vendor Efforts

6.2.2 Start-up Companies

6.2.3 Government Initiatives

6.2.4 University Initiatives

6.2.5 Venture Capital Investments

6.3 Large Scale Computing Systems

7.0 Company Analysis

7.1 D-Wave Systems Inc.

7.2 Google Inc.

7.3 Microsoft Corporation

7.4 IBM Corporation

7.5 Intel Corporation

7.6 Nokia Corporation

7.7 Toshiba Corporation

7.8 Raytheon Company

7.9 Other Companies

7.9.1 1QB Information Technologies Inc.

7.9.2 Cambridge Quantum Computing Ltd.

7.9.3 QC Ware Corp.

7.9.4 MagiQ Technologies Inc.

7.9.5 Rigetti Computing

7.9.6 Anyon Systems Inc.

7.9.7 Quantum Circuits Inc.

7.9.8 Hewlett Packard Enterprise

7.9.9 Fujitsu Ltd.

7.9.10 NEC Corporation

7.9.11 SK Telecom

7.9.12 Lockheed Martin Corporation

7.9.13 NTT Docomo Inc.

7.9.14 Alibaba Group Holding Limited

7.9.15 Booz Allen Hamilton Inc.

7.9.16 Airbus Group

7.9.17 Amgen Inc.

7.9.18 Biogen Inc.

7.9.19 BT Group

7.9.20 Mitsubishi Electric Corp.

7.9.21 Volkswagen AG

7.9.22 KPN

7.10 Ecosystem Contributors

7.10.1 Agilent Technologies

More:
Global Quantum Computing Market Analysis Report 2022-2027: Assessment of Technology, Companies/Organizations, R&D Efforts, and Potential Solutions -...

Members of the Netherlands' Delft Quantum Ecosystem Receive €550,000 ($594K USD) in Two R&D Grants – Quantum Computing Report


The first grant, for €350,000, was provided by the Province of South Holland. It was given to a research collaboration between Orange Quantum Systems, Delft Circuits, and Leiden Cryogenics, which are researching the practical application of quantum technology. The second grant, in the amount of €200,000, was provided to the ImpaQT initiative by the Metropolitan Region Rotterdam The Hague and the Province of South Holland. The ImpaQT initiative is working to provide a value chain of components and related services for organizations wishing to build their own quantum computer using components provided by the members of the initiative. Members of the ImpaQT consortium include QuantWare, Demcon, Qu&Co, Orange Quantum Systems, Qblox, and Delft Circuits. Additional information about these grants and the associated programs can be found in a news release provided by Quantum Delft, available here.

April 25, 2022


See original here:
Members of Netherland's Delft Quantum Ecosystem Receive 550000 ($594K USD) in Two R&D Grants - Quantum Computing Report

Build your own quantum computer with Google’s latest ‘simulator’ – Engadget

World Quantum Day was apparently yesterday, and Google feted the occasion with the launch of The Qubit Game, as spotted by 9to5Google. Created in partnership with Doublespeak Games, it's "a playful journey to building a quantum computer, one qubit at a time," Google said. It also hopes the game, and World Quantum Day, will help generate some interest in the field.

The game revolves around qubits, the basic building blocks of a quantum computer. It's pretty straightforward (you won't need to learn any quantum entanglement math or physics), with the goal of increasing the number of qubits while keeping them cool. The more qubits you have, the more difficult it gets. Eventually, you'll "discover new upgrades, complete big research projects and hopefully become a little more curious about how we're building quantum computers," wrote Google Quantum head of education Abe Asfaw.

The goal is to draw attention to quantum computing, because it seems there's a dearth of people working in the field. To that end, Google is bringing the game to the classroom, hoping to encourage educators to talk about the subject and expand access to quantum computing research.

"We need more students pursuing careers building or using quantum computers, and understanding what it would be like to be a quantum scientist or engineer," wrote Asfaw. "For me, thats what World Quantum Day is all about: showing everyone what quantum computing really is and how they can get involved."


Read the original post:
Build your own quantum computer with Google's latest 'simulator' - Engadget