Archive for the ‘Quantum Computing’ Category

Meet the NSA spies shaping the future – MIT Technology Review

Future history

The NSA's Research Directorate is descended from the Black Chamber, the first group of civilian codebreakers in the United States, who were tasked with spying on cutting-edge technology like the telegraph. Existing only from 1919 to 1929, the group decoded over 10,000 messages from a dozen nations, according to James Bamford's 2001 book Body of Secrets: Anatomy of the Ultra-Secret National Security Agency. In addition to groundbreaking cryptanalytic work, the group succeeded by securing surveillance help from American cable companies like Western Union, which could supply the newly minted US spies with sensitive communications to examine.

The Black Chamber was shut down amid scandal when US Secretary of State Henry Stimson found out the group was spying on American allies as well as foes. The incident foreshadowed the 1975 Church Committee, which investigated surveillance abuses by American intelligence agencies, and the 2013 Snowden leaks, which exposed vast electronic surveillance capabilities that triggered a global reckoning.

Just eight months after the Black Chamber was shuttered, the US, faced with the prospect of crippled spying capabilities in the increasingly unstable world of the 1930s, reformed the effort under the Army's Signals Intelligence Service. Among the founders of the SIS, which Bamford reports was kept secret from the State Department, was the mathematician Solomon Kullback, one of just three people working with the Black Chamber's old records.

Kullback was instrumental in breaking both Japanese and German codes before and during World War II, and he later directed the research and development arm of the newly formed National Security Agency. Within a year, that evolved into the directorate as we know it today: a distinct space for research that is not disrupted by the daily work of the agency.

“It's important to have a research organization, even in a mission-driven organization, to be thinking beyond a crisis,” says Herrera, though he adds that the directorate does dedicate some of its work to the crisis of the day. It runs a program called “scientists on call,” which allows NSA mission analysts facing technical challenges in their analysis to ask for help via email, giving them access to hundreds of scientists.

But the lion's share of the directorate's work is envisioning the technologies that are generations ahead of what we have today. It operates almost like a small, elite technical college, organized around five academic departments (math, physics, cyber, computer science, and electrical engineering), each staffed with 100 to 200 people.

The cybersecurity department defends the federal government's national security systems and the country's military-industrial base. This is the highest-profile department, and deliberately so. Over the last five years, the previously shadowy NSA has become more vocal and active in cybersecurity. It has launched public advisories and research projects that would once have been anathema for an organization whose existence wasn't even acknowledged until 20 years after its founding.

Now the products of NSA research, like Ghidra, a free, sophisticated reverse-engineering tool that helps in the technical dissection of hacking tools and other software, are popular, trusted, and in use around the world. They serve as powerful cybersecurity tools, a recruiting pitch, and a public relations play all wrapped into one.

The physics department, which Herrera once directed, runs dozens of laboratories that conduct most of the directorate's work on quantum information sciences, but it has a much wider remit than that. As physical limits on the ability to squeeze more transistors into chips threaten to slow and halt 60 years of predictably rapid computing growth, its physicists are exploring new materials and novel computing architectures to drive the next generation of computing into a less predictable future, exactly the kind of task the directorate was given when it first came into existence.

Meanwhile, the electrical engineering department has been looking closely at the physics and engineering of telecommunications networks since the internet first arose. As well as the issues around 5G, it also tackles every facet of the digital world, from undersea cables to satellite communications.

Some prospects on the horizon dont fit neatly into any particular box. The computer science departments work on artificial intelligence and machine learning, for example, cuts across cybersecurity missions and data analysis work with the mathematicians.

Herrera repeatedly raises the prospect of the directorate needing to develop greater capabilities in and understanding of rapidly advancing fields like synthetic biology. The NSA is hardly alone in this: Chinese military leaders have called biotech a priority for national defense.

“Much of the competition in the world now is not military,” Herrera says. “Military competition is accelerating, but there is also dissemination of other technologies, like synthetic biologies, that are frankly alarming. The role of research is to help the NSA understand what the impact of those technologies will be. How much we actually get involved, I don't know, but these are areas we have to keep an eye on.”

Original post:
Meet the NSA spies shaping the future - MIT Technology Review

The Worldwide Next Generation Computing Industry is Expected to Reach $7 Billion by 2027 – PRNewswire

DUBLIN, Feb. 1, 2022 /PRNewswire/ -- The "Next Generation Computing Market: Bio-Computing, Brain-Computer Interfaces, High Performance Computing, Nanocomputing, Neuromorphic Computing, Serverless Computing, Swarm Computing, and Quantum Computing 2022 - 2027" report has been added to ResearchAndMarkets.com's offering.

This next generation computing market report evaluates next generation computing technologies, use cases, and applications. Market readiness factors are considered along with the impact of different computational methods upon other emerging technologies.

The report provides analysis of leading-edge developments such as computer integration with human cognition via bio-computing and brain-computer interfaces. Other pioneering areas covered include leveraging developments in nanotechnology to develop more effective computing models and methods.

The report includes critical analysis of leading vendors and their strategies, along with next generation computing market sizing for the period 2022 to 2027.

Select Report Findings:

There are many technologies involved, including distributed computing (swarm computing), computational collaboration (bio-computing), improvements to the performance of existing supercomputers, and completely new computer architectures such as those associated with quantum computing. Each of these approaches has its own advantages and disadvantages, and many of them stand alone in terms of their ability to solve market problems.

Next generation computing technologies covered in this report include high-performance computing, swarm computing, neuromorphic computing, biocomputing, brain-computer interfaces, serverless computing, quantum computing, and nanocomputing.

More than simply an amalgamation of technologies, the next generation computing market is characterized by many different approaches to solving a plethora of computational challenges. Common factors driving the market include the need for ever-increasing computation speed and efficiency, reduced energy consumption, miniaturization, and evolving architectures and business models.

High-performance Computing

High-performance computing (HPC) solves complex computational problems using supercomputers and parallel processing techniques. HPC leverages computer modeling, simulation, and analysis to solve advanced computational problems and support research, allowing many computing resources to be used concurrently.
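As a rough illustration of the parallelism HPC relies on (a minimal sketch of ours, not material from the report), a large workload can be split across worker processes so that many compute resources run concurrently:

```python
# Minimal sketch: split a large simulation workload across worker
# processes so many cores run concurrently (the HPC core idea).
from concurrent.futures import ProcessPoolExecutor
import math

def simulate_cell(i: int) -> float:
    """Stand-in for an expensive per-cell model evaluation."""
    return sum(math.sin(i * k) for k in range(10_000))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        results = list(pool.map(simulate_cell, range(1_000), chunksize=50))
    print(f"aggregate: {sum(results):.4f}")
```

Real HPC systems apply the same pattern across thousands of networked nodes rather than the cores of a single machine.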

Quantum Computing

The commercial introduction of quantum computing is anticipated both to solve previously intractable problems and to create new ones. This multiplicity of developments in next generation computing makes it difficult for enterprise or government users to make decisions about infrastructure, software, and services.

Biocomputing

Biocomputing refers to the construction and use of computers built from biologically derived molecules, such as DNA and proteins, to store, retrieve, and process data. Such a computing system functions more like a living organism, or contains biological components.

Neuromorphic Computing

Neuromorphic computing refers to implementing neural functions such as perception, motor control, and multisensory integration on very-large-scale integration (VLSI) systems that combine analog, digital, or mixed-mode circuits with software.

Neuromorphic computing leverages the techniques of neuromorphic engineering, which takes inspiration from biology, physics, mathematics, computer science, and electronic engineering to develop artificial neural systems, including vision systems, head-eye systems, auditory processors, and autonomous robots.
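As a concrete taste of this approach, here is a minimal sketch (ours, not from the report) of a leaky integrate-and-fire neuron, one of the standard building blocks of artificial neural systems in neuromorphic hardware:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: integrate input
# current, spike when the membrane potential crosses threshold, reset.
import numpy as np

def lif_spikes(current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Return spike times for a given input current trace."""
    v, spikes = 0.0, []
    for t, i_in in enumerate(current):
        v += dt / tau * (-v + i_in)  # leak toward rest, driven by input
        if v >= v_thresh:
            spikes.append(t * dt)
            v = v_reset
    return spikes

rng = np.random.default_rng(42)
drive = 1.5 + 0.5 * rng.standard_normal(2000)  # noisy constant input
print(f"{len(lif_spikes(drive))} spikes in 2 s of simulated input")
```

Neuromorphic chips implement thousands of such neurons directly in analog or digital circuitry rather than simulating them in software.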

Nanocomputing

Nanocomputing refers to miniature computing devices (with features smaller than 100 nanometers) used to represent and manipulate data. Nanocomputing is expected to revolutionize how traditional computing is applied in certain key industry verticals, enabling progress in device technology, computer architectures, and IC processing. It should also substantially advance implantable technologies inserted into the human body, primarily for healthcare solutions.

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction

3.0 Technology and Application Analysis
3.1 High Performance Computing
3.1.1 HPC Technology
3.1.2 Exascale Computation
3.1.2.1 Exascale Supercomputer Development
3.1.2.1.1 United States
3.1.2.1.2 China
3.1.2.1.3 Europe
3.1.2.1.4 Japan
3.1.2.1.5 India
3.1.2.1.6 Taiwan
3.1.3 Supercomputers
3.1.4 High Performance Technical Computing
3.1.5 Market Segmentation Considerations
3.1.6 Use Cases and Application Areas
3.1.6.1 Computer Aided Engineering
3.1.6.2 Government
3.1.6.3 Financial Services
3.1.6.4 Education and Research
3.1.6.5 Manufacturing
3.1.6.6 Media and Entertainment
3.1.6.7 Electronic Design Automation
3.1.6.8 Bio-Sciences and Healthcare
3.1.6.9 Energy Management and Utilities
3.1.6.10 Earth Science
3.1.7 Regulatory Framework
3.1.8 Value Chain Analysis
3.1.9 AI to Drive HPC Performance and Adoption
3.2 Swarm Computing
3.2.1 Swarm Computing Technology
3.2.1.1 Ant Colony Optimization
3.2.1.2 Particle Swarm Optimization
3.2.1.3 Stochastic Diffusion Search
3.2.2 Swarm Intelligence
3.2.3 Swarm Computing Capabilities
3.2.4 Value Chain Analysis
3.2.5 Regulatory Framework
3.3 Neuromorphic Computing
3.3.1 Neuromorphic Computing Technology
3.3.2 Neuromorphic Semiconductor
3.3.2.1 Hardware Neurons
3.3.2.2 Implanted Memory
3.3.3 Neuromorphic Application
3.3.4 Neuromorphic Market Explained
3.3.5 Value Chain Analysis
3.4 Biocomputing
3.4.1 Bioinformatics
3.4.2 Computational Biology and Drug Discovery
3.4.3 Biodata Mining and Protein Simulations
3.4.4 Biocomputing Platform and Services
3.4.5 Biocomputing Application
3.4.6 Biocomputing Products
3.4.7 Value Chain Analysis
3.5 Quantum Computing
3.5.1 Quantum Simulation, Sensing and Communication
3.5.2 Quantum Cryptography
3.5.3 Quantum Computing Technology
3.5.4 Quantum Programming, Software and SDK
3.5.5 Quantum Computing Application
3.5.6 Value Chain Analysis
3.6 Serverless Computing
3.6.1 Serverless Computing Solution
3.6.2 Serverless Computing Application
3.6.2.1 Event Driven Computing
3.6.2.2 Live Video Broadcasting
3.6.2.3 Processing IoT Data
3.6.2.4 Shared Delivery Dispatch System
3.6.2.5 Web Application and Backends
3.6.2.6 Application Scalability
3.6.2.7 Sales Opportunities and Customer Experience
3.6.3 Value Chain Analysis
3.7 Brain Computer Interface Technology
3.7.1 BCI Overview
3.7.2 Invasive vs. Non-Invasive BCI
3.7.3 Partially Invasive BCI
3.7.4 BCI Applications
3.7.5 Silicon Electronics
3.7.6 Value Chain Analysis
3.8 Nanocomputing
3.8.1 Nanotechnology
3.8.2 Nanomaterials
3.8.3 DNA Nanocomputing
3.8.4 Nanocomputing Market
3.8.5 Value Chain
3.9 Artificial Intelligence and IoT
3.10 Edge Computing Network and 5G
3.11 Blockchain and Virtualization
3.12 Green Computing
3.13 Cognitive Computing

4.0 Company Analysis
4.1 Vendor Ecosystem
4.2 Leading Company
4.2.1 ABM Inc.
4.2.2 Advanced Brain Monitoring Inc.
4.2.3 Advanced Diamond Technologies Inc.
4.2.4 Agilent Technologies Inc.
4.2.5 Alibaba Group Holding Limited
4.2.6 Amazon Web Services Inc.
4.2.7 Apium Swarm Robotics
4.2.8 Atos SE
4.2.9 Advanced Micro Devices Inc.
4.2.10 Robert Bosch GmbH
4.2.11 Cisco Systems
4.2.12 D-Wave Systems Inc.
4.2.13 DELL Technologies Inc.
4.2.14 Emotiv
4.2.15 Fujitsu Ltd
4.2.16 Google Inc.
4.2.17 Hewlett Packard Enterprise
4.2.18 Huawei Technologies Co. Ltd.
4.2.19 IBM Corporation
4.2.20 Intel Corporation
4.2.21 Keysight Technologies
4.2.22 Lockheed Martin Corporation
4.2.23 Microsoft Corporation
4.2.24 Mitsubishi Electric Corp.
4.2.25 NEC Corporation
4.2.26 Nokia Corporation
4.2.27 NVidia
4.2.28 Oracle Corporation
4.2.29 Qualcomm Inc.
4.2.30 Rackspace Inc.
4.3 Other Companies
4.3.1 Samsung Electronics Co. Ltd.
4.3.2 Toshiba Corporation
4.3.3 Waters Corporation
4.3.4 Gemalto N.V.
4.3.5 Juniper Networks Inc.
4.3.6 SAP SE
4.3.7 Siemens AG
4.3.8 Schneider Electric SE
4.3.9 Raytheon Company
4.3.10 1QB Information Technologies Inc.
4.3.11 Cambridge Quantum Computing Ltd.
4.3.12 MagiQ Technologies Inc.
4.3.13 Rigetti Computing
4.3.14 NTT Docomo Inc.
4.3.15 Booz Allen Hamilton Inc.
4.3.16 Airbus Group
4.3.17 Volkswagen AG
4.3.18 Iron.io
4.3.19 Serverless Inc.
4.3.20 LunchBadger
4.3.21 CA Technologies
4.3.22 TIBCO Software Inc.
4.3.23 Salesforce

5.0 Next Generation Computing Market Analysis and Forecasts
5.1 Overall Next Generation Computing Market
5.2 Next Generation Computing Market by Segment
5.3 High Performance Computing Market Forecasts
5.4 Swarm Computing Market Forecasts
5.5 Neuromorphic Computing Market Forecasts
5.6 Biocomputing Market Forecasts
5.7 Brain Computer Interface Market Forecasts
5.8 Serverless Computing Market Forecasts
5.9 Quantum Computing Market Forecasts
5.10 Nanocomputing Market Forecasts
5.11 NGC Market by Deployment Type
5.12 NGC Market by Enterprise Type
5.13 NGC Market by Connectivity Type
5.14 AI Solution Market in NGC
5.15 Big Data Analytics Solution Market in NGC
5.16 NGC Market in IoT
5.17 NGC Market in Edge Network
5.18 NGC Market in Blockchain
5.19 Next Generation Computing Market in Smart Cities
5.20 Next Generation Computing Market in 5G
5.21 Next Generation Computing Market by Region

6.0 Conclusions and Recommendations

For more information about this report visit https://www.researchandmarkets.com/r/l5j5dc

Media Contact:

Research and Markets
Laura Wood, Senior Manager
[emailprotected]

For E.S.T. Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

See the article here:
The Worldwide Next Generation Computing Industry is Expected to Reach $7 Billion by 2027 - PRNewswire

Riverlane taking quantum computing to fresh frontiers – Business Weekly

Cambridge-based quantum engineering company Riverlane is at the heart of two related initiatives to troubleshoot problems and advance risk-free adoption worldwide.

It has head-hunted leading scientist Dr Earl Campbell to accelerate efforts to solve quantum error correction, and only last month it joined an influential consortium to build an error-corrected quantum processor.

As head of architecture, Dr Campbell will lead technical development to support the operating system for fault-tolerant quantum computers.

He joins Riverlane from Amazon Web Services' quantum computing group and has held a number of academic positions over the past 16 years. His game-changing efforts include leading contributions to quantum error correction, fault-tolerant quantum logic and compilation, and quantum algorithms.

He has also made pioneering contributions to random compilers, including the qDRIFT algorithm, which is the only known efficient method for simulating systems with highly complex interactions.
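For intuition, here is a minimal NumPy sketch of the qDRIFT idea, a random product formula that samples Hamiltonian terms in proportion to their weight. This toy example is ours, assuming a Hamiltonian given as coefficients and matrices; it is not Riverlane's or Dr Campbell's code.

```python
# Minimal qDRIFT sketch: approximate exp(-i*H*t) for H = sum_j h_j * H_j
# by repeatedly sampling a term j with probability |h_j| / lambda and
# applying a short evolution under that single term.
import numpy as np
from scipy.linalg import expm

def qdrift_unitary(coeffs, terms, t, n_samples, seed=0):
    """One sampled qDRIFT circuit approximating exp(-i * H * t)."""
    rng = np.random.default_rng(seed)
    lam = np.sum(np.abs(coeffs))          # lambda = sum_j |h_j|
    probs = np.abs(coeffs) / lam          # sampling distribution
    u = np.eye(terms[0].shape[0], dtype=complex)
    for _ in range(n_samples):
        j = rng.choice(len(terms), p=probs)
        # each step evolves under one term for time lam * t / n_samples
        u = expm(-1j * np.sign(coeffs[j]) * lam * t / n_samples * terms[j]) @ u
    return u

# Toy Hamiltonian H = 0.9 Z + 0.4 X on a single qubit
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
exact = expm(-1j * (0.9 * Z + 0.4 * X))
approx = qdrift_unitary([0.9, 0.4], [Z, X], t=1.0, n_samples=500)
print(np.linalg.norm(exact - approx))  # shrinks as n_samples grows;
# the formal qDRIFT guarantee is for the channel averaged over circuits.
```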

Additionally, while working with IBM and University College London, Earl contributed to the development of near-Clifford emulators that were integrated into Qiskit, IBM's open-source software development kit for quantum computers.

At Amazon Web Services he was a leading contributor to its paper proposing a novel quantum computing architecture and established a team working on quantum algorithms.

At Riverlane he will be working alongside leaders who have joined from Microsoft, ARM, Samsung, Intel and the White House! Backed by some of Europe's leading venture-capital funds and the University of Cambridge, Riverlane is bringing together leading talent from the worlds of business, academia, and industry to design its modular operating system to work with all hardware providers, whatever the type of qubit.

Riverlane has already partnered with a third of the world's quantum computing hardware companies and has successfully tested Deltaflow.OS with multiple hardware approaches, including trapped ions and superconducting circuits.

Dr Campbell said: “Error correction is the next defining challenge in quantum computing and we will need to deliver fast, effective software to solve it. Over the past 16 years, I have been tackling questions like this as an academic and I'm looking forward to putting theory into practice.

“I've followed Riverlane since its early days and I've always been drawn to challenging work with the promise of delivering widespread social and commercial impact. I'm excited to join a diverse team with a proven track record in developing software used by hardware companies around the world.”

Steve Brierley, CEO and founder of Riverlane, added: “Solving error correction will be key to unlocking quantum usefulness across a range of foundational challenges, including clean energy, drug discovery, material science, and advanced chemistry.

“We're delighted that Earl is bringing his world-class expertise in this challenge to the Riverlane team to accelerate our efforts and unlock the potential of this technology.”

Just before Christmas, Riverlane joined a £7.5 million consortium to build an error-corrected quantum processor, working with a range of UK partners, including Rolls-Royce, to apply the technology to new applications in the aerospace industry. The funding comes via the UK government's National Quantum Technologies Programme.

The project, led by quantum computer manufacturer Universal Quantum, calls on Riverlane's software and expertise to tackle quantum error correction on a trapped-ion quantum computer.

Error correction is a crucial step in unlocking the promise of fault-tolerant quantum computers capable of a range of transformative applications, and it is at the core of everything Riverlane does.

The work with Rolls-Royce will explore practical applications of quantum computers in the development of more sustainable and efficient jet engines.

This starts by applying quantum algorithms to take steps toward a greater understanding of how liquids and gases flow, a field known as fluid dynamics. Simulating such flows accurately is beyond the computational capacity of even the most powerful classical computers today.

The consortium also includes academic researchers from Imperial College London and the University of Sussex; the Science and Technology Facilities Council (STFC) Hartree Centre; supply chain partners Edwards, TMD Technologies and Diamond Microwave; and commercialisation and dissemination experts Sia Partners and Qureca.

Fluids behave according to a famous set of partial differential equations called the Navier-Stokes equations, the solutions to which are important for aircraft and engine design, as well as for understanding ocean currents and predicting the weather.
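For reference, a standard incompressible form of the Navier-Stokes equations (not written out in the original article) is

$$\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u} = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f}, \qquad \nabla\cdot\mathbf{u} = 0,$$

where $\mathbf{u}$ is the fluid velocity, $p$ the pressure, $\rho$ the density, $\nu$ the kinematic viscosity, and $\mathbf{f}$ any body force. The nonlinear term $(\mathbf{u}\cdot\nabla)\mathbf{u}$ is what makes these equations so computationally demanding to solve.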

Classical computers can take months or even years to solve some types of these equations, but recent research has shown that quantum computers could find the solutions much more quickly.

Link:
Riverlane taking quantum computing to fresh frontiers | Business Weekly - Business Weekly

A new language for quantum computing – MIT News

Time crystals. Microwaves. Diamonds. What do these three disparate things have in common?

Quantum computing. Unlike traditional computers that use bits, quantum computers use qubits to encode information as zeros, ones, or both at the same time. Coupled with a cocktail of forces from quantum physics, these refrigerator-sized machines can process a whole lot of information, but they're far from flawless. Just as with our regular computers, we need the right programming languages to properly compute on quantum computers.

Programming quantum computers requires awareness of something called entanglement, a sort of computational multiplier for qubits that translates to a lot of power. When two qubits are entangled, actions on one qubit can change the value of the other, even when they are physically separated, giving rise to Einstein's characterization of “spooky action at a distance.” But that potency is also a source of weakness. When programming, discarding one qubit without being mindful of its entanglement with another qubit can destroy the data stored in the other, jeopardizing the correctness of the program.
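To see concretely why discarding an entangled qubit is dangerous, here is a small NumPy illustration of ours (not code from the paper): tracing out half of a Bell pair leaves the remaining qubit maximally mixed, so any superposition it carried is gone.

```python
# Discarding an entangled qubit = taking a partial trace over it.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Bell state (|00> + |11>)/sqrt(2): the two qubits are entangled.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho = np.outer(bell, bell.conj())  # density matrix of the pair

# "Throw away" qubit B: trace over its indices.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_A)  # 0.5 * identity: a classical coin flip, superposition lost

# Contrast: an unentangled |+> qubit keeps its coherence when its
# neighbor is discarded, so the neighbor is safe to throw away.
plus = (ket0 + ket1) / np.sqrt(2)
sep = np.kron(plus, ket0)
rho_sep = np.outer(sep, sep.conj())
print(np.trace(rho_sep.reshape(2, 2, 2, 2), axis1=1, axis2=3))  # still pure
```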

Scientists from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) aimed to do some unraveling by creating their own programming language for quantum computing, called Twist. Twist can describe and verify which pieces of data are entangled in a quantum program, through a language a classical programmer can understand. The language uses a concept called purity, which enforces the absence of entanglement and results in more intuitive programs, ideally with fewer bugs. For example, a programmer can use Twist to say that the temporary data generated as garbage by a program is not entangled with the program's answer, making it safe to throw away.

While the nascent field can feel a little flashy and futuristic, with images of mammoth wiry gold machines coming to mind, quantum computers have potential for computational breakthroughs in classically unsolvable tasks, like cryptographic and communication protocols, search, and computational physics and chemistry. One of the key challenges in the computational sciences is dealing with the complexity of the problem and the amount of computation needed. Whereas a classical digital computer would need an exponentially large number of bits to process such a simulation, a quantum computer could potentially do it using a very small number of qubits, if the right programs exist.
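A quick back-of-envelope calculation (our illustration, not from the article) shows where that exponential cost comes from: a full classical description of an n-qubit state has 2^n complex amplitudes.

```python
# Memory needed to store a full n-qubit state vector classically.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16  # one complex128 amplitude = 16 bytes
    print(f"{n} qubits: 2^{n} = {amplitudes:.3e} amplitudes, "
          f"{bytes_needed / 2**40:.3e} TiB")
# 50 qubits already demands ~16 PiB of memory, far beyond any classical
# machine, while a quantum computer holds that state in 50 physical qubits.
```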

“Our language Twist allows a developer to write safer quantum programs by explicitly stating when a qubit must not be entangled with another,” says Charles Yuan, an MIT PhD student in electrical engineering and computer science and the lead author on a new paper about Twist. “Because understanding quantum programs requires understanding entanglement, we hope that Twist paves the way to languages that make the unique challenges of quantum computing more accessible to programmers.”

Yuan wrote the paper alongside Chris McNally, a PhD student in electrical engineering and computer science who is affiliated with the MIT Research Laboratory of Electronics, as well as MIT assistant professor Michael Carbin. They presented the research at last week's 2022 Symposium on Principles of Programming Languages (POPL) in Philadelphia.

Untangling quantum entanglement

Imagine a wooden box that has a thousand cables protruding out from one side. You can pull any cable all the way out of the box, or push it all the way in.

After you do this for a while, the cables form a pattern of bits (zeros and ones) depending on whether they're in or out. This box represents the memory of a classical computer. A program for this computer is a sequence of instructions for when and how to pull on the cables.

Now imagine a second, identical-looking box. This time, you tug on a cable, and see that as it emerges, a couple of other cables are pulled back inside. Clearly, inside the box, these cables are somehow entangled with each other.

The second box is an analogy for a quantum computer, and understanding the meaning of a quantum program requires understanding the entanglement present in its data. But detecting entanglement is not straightforward. You can't see into the wooden box, so the best you can do is try pulling on cables and carefully reason about which are entangled. In the same way, quantum programmers today have to reason about entanglement by hand. This is where the design of Twist helps massage some of those interlaced pieces.

The scientists designed Twist to be expressive enough to write out programs for well-known quantum algorithms and identify bugs in their implementations. To evaluate Twist's design, they modified the programs to introduce some kind of bug that would be relatively subtle for a human programmer to detect, and showed that Twist could automatically identify the bugs and reject the programs.

They also measured how well the programs performed in practice: in terms of runtime, Twist incurred less than 4 percent overhead compared with existing quantum programming techniques.

For those wary of quantum's seedy reputation as a potential breaker of encryption systems, Yuan says it's still not very well known to what extent quantum computers will actually be able to deliver on their performance promises in practice. “There's a lot of research that's going on in post-quantum cryptography, which exists because even quantum computing is not all-powerful. So far, there's a very specific set of applications in which people have developed algorithms and techniques where a quantum computer can outperform classical computers.”

An important next step is using Twist to create higher-level quantum programming languages. Most quantum programming languages today still resemble assembly language, stringing together low-level operations without mindfulness toward things like data types and functions and what's typical in classical software engineering.

“Quantum computers are error-prone and difficult to program. By introducing and reasoning about the purity of program code, Twist takes a big step towards making quantum programming easier by guaranteeing that the quantum bits in a pure piece of code cannot be altered by bits not in that code,” says Fred Chong, the Seymour Goodman Professor of Computer Science at the University of Chicago and chief scientist at Super.tech.

The work was supported, in part, by the MIT-IBM Watson AI Lab, the National Science Foundation, and the Office of Naval Research.

Read this article:
A new language for quantum computing | MIT News | Massachusetts Institute of Technology - MIT News

Quantum Computing in Silicon Breaks a Crucial Threshold for the First Time – Singularity Hub

Quantum computers made from the same raw materials as standard computer chips hold obvious promise, but so far they've struggled with high error rates. That seems set to change now that new research has shown silicon qubits are accurate enough to run a popular error-correcting code.

The quantum computers that garner all the headlines today tend to be made using superconducting qubits, such as those from Google and IBM, or trapped ions, such as those from IonQ and Honeywell. But despite their impressive feats, they take up entire rooms and have to be painstakingly handcrafted by some of the world's brightest minds.

That's why others are keen to piggyback on the miniaturization and fabrication breakthroughs we've made with conventional computer chips by building quantum processors out of silicon. Research has been going on in this area for years, and it's unsurprisingly the route Intel is taking in the quantum race. But despite progress, silicon qubits have been plagued by high error rates that limit their usefulness.

The delicate nature of quantum states means that errors are a problem for all of these technologies, and error-correction schemes will be required for any of them to reach significant scale. But these schemes will only work if the error rates can be kept sufficiently low; essentially, you need to be able to correct errors faster than they appear.

The most promising family of error-correction schemes today, known as surface codes, requires operations on, or between, qubits to achieve a fidelity above 99 percent. That has long eluded silicon qubits, but in the latest issue of Nature, three separate groups report breaking this crucial threshold.
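To see why this figure is pivotal, here is a rough numerical illustration of ours (not from the Nature papers), assuming the widely used heuristic that a distance-d surface code suppresses the logical error rate like (p/p_th) raised to the power (d+1)/2, with a threshold p_th of roughly 1 percent.

```python
# Heuristic surface-code scaling: p_L ~ 0.1 * (p / p_th)**((d + 1) / 2)
# for physical error rate p and code distance d (an assumption for
# illustration; exact constants vary by code and noise model).
p_th = 0.01

def logical_error_rate(p, d):
    return 0.1 * (p / p_th) ** ((d + 1) / 2)

for p in (0.02, 0.009, 0.005):  # above, just below, and well below p_th
    print(f"p = {p:.3f}: " + ", ".join(
        f"d={d}: {logical_error_rate(p, d):.2e}" for d in (3, 7, 11)))
# Above threshold, adding qubits (larger d) makes things worse; below
# it, each increase in d suppresses logical errors exponentially.
```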

The first two papers, from researchers at RIKEN in Japan and at QuTech, a collaboration between Delft University of Technology and the Netherlands Organization for Applied Scientific Research, use quantum dots for qubits. These are tiny semiconductor traps that house a single electron. Information can be encoded into the qubits by manipulating the electron's spin, a fundamental property of elementary particles.

The key to both groups' breakthroughs was careful engineering of the qubits and control systems. But the QuTech group also used a diagnostic tool developed by researchers at Sandia National Laboratories to debug and fine-tune its system, while the RIKEN team discovered that upping the speed of operations boosted fidelity.

A third group, from the University of New South Wales, took a slightly different approach, using phosphorus atoms embedded in a silicon lattice as their qubits. These atoms can hold their quantum state for extremely long times compared to most other qubits, but the tradeoff is that it's hard to get them to interact. The group's solution was to entangle two of these phosphorus atoms with an electron, which enables them to talk to each other.

All three groups were able to achieve fidelities above 99 percent for both single qubit and two-qubit operations, which crosses the error-correction threshold. They even managed to carry out some basic proof-of-principle calculations using their systems. Nonetheless, they are still a long way from making a fault-tolerant quantum processor out of silicon.

Achieving high-fidelity qubit operations is only one of the requirements for effective error correction. The other is having a large number of spare qubits that can be dedicated to this task while the remaining ones focus on whatever problem the processor has been given.

As an accompanying analysis in Nature notes, adding more qubits to these systems is certain to complicate things, and maintaining the same fidelities in larger systems will be tough. Finding ways to connect qubits across large systems will also be a challenge.

However, the promise of being able to build compact quantum computers using the same tried-and-true technology as existing computers suggests these are problems worth trying to solve.

Image Credit: UNSW/Tony Melov

Read the rest here:
Quantum Computing in Silicon Breaks a Crucial Threshold for the First Time - Singularity Hub