Archive for the ‘Quantum Computing’ Category

Which Types Of Encryption Will Remain Secure As Quantum Computing Develops – And Which Popular Ones Will Not – Joseph Steinberg

As I discussed last month, unless we take action soon, a tremendous amount of data that is today protected through the use of encryption will become vulnerable to exposure.

The reason that such a major threat exists is simple: much of today's data relies on the security of what are known as asymmetric encryption algorithms, and such algorithms rely for their security on the fact that the mathematics they use to encrypt cannot easily be reversed in order to decrypt. (For those interested in the details: the most common difficult-to-reverse mathematics employed by asymmetric encryption systems are integer factorization, discrete logarithms, and elliptic-curve discrete logarithms.)
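To make that asymmetry concrete, here is a minimal, purely illustrative RSA-style computation in Python (toy numbers chosen for readability; real deployments use moduli of 2048 bits or more): encrypting is a single modular exponentiation, while recovering the private exponent requires factoring the modulus.

```python
# Toy RSA with tiny numbers, for illustration only -- never use such sizes in practice.
p, q = 61, 53                 # private primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient, computable only if you can factor n
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)     # anyone can encrypt with the public pair (e, n)
recovered = pow(ciphertext, d, n)   # decrypting requires d, i.e. the factors of n
assert recovered == message

# An attacker holding only (e, n) must factor n to compute phi and then d.
# For a 2048-bit modulus, no known classical algorithm can do that in practical time.
```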

While today's computers cannot efficiently crack asymmetric encryption through brute force (trying all possible values in order to discover a correct key could literally take centuries, and there are no shortcuts), we have already seen the dawn of so-called quantum computers: devices that leverage advanced physics to perform computing functions on large sets of data in super-efficient ways that are completely unachievable with classical computers. While it had long been believed that quantum computers could potentially undermine the integrity of various forms of encryption, in 1994 the American mathematician Peter Shor showed how a quantum algorithm could quickly solve integer factorization problems, transforming a theoretical risk into a time bomb. It became clear then that a powerful quantum computer running Shor's algorithm could make mincemeat out of modern encryption systems, as well as trivialize various other forms of complex math, and, since then, we have already seen early signs of this happening. Just a few years ago, Google's early-generation quantum computer, Sycamore, performed a calculation in 200 seconds that many experts believe would have taken the world's then most powerful classical supercomputer, IBM's Summit, somewhere between multiple days and multiple millennia to complete. Yes, 200 seconds for a de facto prototype versus multiple millennia for a mature supercomputer.
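For readers who want a feel for why factoring is the weak point, Shor's insight was that factoring reduces to finding the period of modular exponentiation, a task a quantum computer can perform efficiently. The sketch below is a toy illustration of that reduction only: the quantum period-finding subroutine is replaced by a brute-force classical search, simply to show how knowing the period yields the factors.

```python
from math import gcd
from random import randrange

def factor_via_period(N):
    """Classical reduction used in Shor's algorithm: find the period r of a^x mod N,
    then gcd(a^(r/2) +/- 1, N) reveals a factor. Only the period search here is
    brute force; that is the single step the quantum computer speeds up."""
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g, N // g                  # lucky guess: a already shares a factor
        # classical stand-in for the quantum period-finding subroutine
        r, value = 1, a % N
        while value != 1:
            value = (value * a) % N
            r += 1
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p

print(factor_via_period(15))                  # e.g. (3, 5)
```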

To protect data in the quantum computing era, therefore, we must change how we encrypt. To help the world achieve that objective, the US National Institute of Standards and Technology (NIST) has been running a competition since 2016 to develop new quantum-proof cryptography standards; winners are expected to be announced sometime in the next year, and multiple approaches are expected to be endorsed.

Some of the quantum-safe encryption methods that appear likely to be selected by NIST employ what are known as lattice-based approaches, relying on math that, at least as of today, we do not know how to undermine with quantum algorithms. While lattice approaches are likely to prove popular methods of addressing quantum supremacy in the near term, there is concern that some of their security might stem from their newness, and that, over time, mathematicians may discover quantum algorithms that render them crackable.
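As a rough illustration of what "lattice math" means in practice, the sketch below implements a toy version of the Learning With Errors (LWE) construction that underlies several lattice-based candidates. The parameters, variable names, and single-bit message are illustrative choices only and bear no resemblance to a real parameter set.

```python
import random

q, n, m = 97, 4, 8   # toy parameters: modulus, secret length, number of samples

def keygen():
    s = [random.randrange(q) for _ in range(n)]                      # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public random matrix
    e = [random.choice([-1, 0, 1]) for _ in range(m)]                # small "noise"
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)                                                 # b = A*s + e (mod q)

def encrypt(pub, bit):
    A, b = pub
    r = [random.choice([0, 1]) for _ in range(m)]                    # random subset of rows
    u = [sum(r[i] * A[i][j] for i in range(m)) % q for j in range(n)]
    v = (sum(r[i] * b[i] for i in range(m)) + bit * (q // 2)) % q
    return u, v

def decrypt(s, ciphertext):
    u, v = ciphertext
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 0 if min(d, q - d) < q // 4 else 1   # accumulated noise stays well below q/4

secret, public_key = keygen()
for bit in (0, 1):
    assert decrypt(secret, encrypt(public_key, bit)) == bit
```

Recovering the secret from (A, b) amounts to solving a noisy system of modular linear equations, which is believed to be hard even for quantum computers.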

Other candidates for NIST's approval utilize what is known as code-based encryption, a time-tested method introduced in 1978 by Caltech Professor of Engineering Robert McEliece. Code-based encryption employs an error-correcting code, keys modified with linear transformations, and random junk data; while it is simple for parties with the decryption keys to remove the junk and decrypt, unauthorized parties seeking to decrypt face a huge challenge that remains effectively unsolvable by quantum algorithms, even after decades of analysis.
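Schematically (this is the standard textbook description of the McEliece construction, not any particular NIST submission's parameter set), the public key hides the generator matrix \(G\) of an error-correcting code behind two secret linear transformations, and the "junk" is a deliberately injected low-weight error vector \(e\):

\[
G_{\text{pub}} = S\,G\,P, \qquad c = m\,G_{\text{pub}} + e,
\]

where \(S\) is a secret invertible matrix and \(P\) a secret permutation matrix. The legitimate receiver computes \(cP^{-1} = mSG + eP^{-1}\), runs the code's efficient decoder to strip the still-low-weight error, and then removes \(S\); an attacker holding only \(G_{\text{pub}}\) faces what looks like decoding a random linear code, a problem that has resisted classical and quantum attack alike.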

NIST's candidates also utilize various other encryption approaches that, at least as of now, appear to be quantum safe.

Of course, security is not the only factor when it comes to deciding how to encrypt; practicality plays a big role as well. Any quantum-safe encryption approach that is going to be successful must be usable by the masses; especially as the world experiences a proliferation of smart devices constrained by minimal processing power, memory, and bandwidth, mathematical complexity and/or large minimum key sizes can render otherwise great encryption options useless.

In short, many of today's popular asymmetric encryption methods (RSA, ECC, etc.) will be easily crackable by quantum computers in the not-so-distant future. (Modern systems typically use asymmetric encryption to exchange keys that are then used for symmetric encryption; if the asymmetric part is not secure, the symmetric part is not either.) To address such risks we have quantum-safe encryption, a term that refers to encryption algorithms and systems, many of which already exist, that are believed to be resilient to cracking attempts performed by quantum computers.
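That hybrid pattern is easy to see in code. The sketch below, using the widely deployed Python cryptography package, is a minimal illustration rather than a hardened implementation: an RSA key pair wraps a freshly generated symmetric key, and that symmetric key protects the bulk data, so breaking the RSA layer exposes everything beneath it.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Asymmetric layer: the recipient's RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Symmetric layer: a one-off session key for the bulk data.
session_key = Fernet.generate_key()
bulk_ciphertext = Fernet(session_key).encrypt(b"the actual payload")

# The session key travels wrapped under RSA-OAEP.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# The recipient unwraps the session key with the RSA private key, then decrypts the data.
# A quantum attacker who breaks the RSA layer recovers session_key -- and with it, everything.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert Fernet(recovered_key).decrypt(bulk_ciphertext) == b"the actual payload"
```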

While NIST is working on establishing preferred methods of quantum-safe encryption, sensitive data is already, now, being put at risk by quantum supremacy; as such, for many organizations, waiting for NIST may turn out to be a costly mistake. Additionally, the likely rush to retrofit existing systems with new encryption methods once NIST does produce recommendations may drive up the costs of related projects in terms of both time and money. With quantum-safe encryption solutions that leverage approaches submitted to NIST already available and running on today's computers, the time to start thinking about quantum risks is not somewhere down the road, but now.

This post is sponsored by IronCAP. Please click the link to learn more about IronCAP's patent-protected methods of keeping data safe, not only against today's cyberattacks but also against future attacks from quantum computers.

See original here:
Which Types Of Encryption Will Remain Secure As Quantum Computing Develops - And Which Popular Ones Will Not - Joseph Steinberg

D-Wave Joins the Hudson Institute’s Quantum Alliance Initiative – HPCwire

PALO ALTO, Calif., Feb. 1, 2021 – D-Wave Government Inc., a leader in quantum computing systems, software, and services, and the only company developing both annealing and gate-model quantum computers, today announced it has joined the Hudson Institute's Quantum Alliance Initiative (QAI), a consortium of companies, institutions, and universities whose mission is to raise awareness and develop policies that promote the critical importance of U.S. leadership in quantum technology.

The collaboration between the two organizations is a natural next step for D-Wave, which is well-known for developing the world's first commercial quantum computer and continues to encourage practical quantum computing use cases among enterprise, academic, and government customers. As the only quantum computing company developing both annealing and gate-model quantum computers, D-Wave offers a unique perspective on the importance of inclusive policies that allow for access across quantum technologies.

D-Wave continues to be a thought leader in quantum policy, working to expand accessibility to the technology, educate on different capabilities for technological advancement, promote workforce development to address the industry talent gap, and foster public-private partnerships aimed at solving key public sector needs. By joining the Hudson Institute's QAI, the company will connect with a consortium whose mission is to raise awareness among global governments and to promote quantum policies and government programs that support and foster a robust quantum industry.

"We are delighted to have D-Wave join us as our newest sponsoring member of the Quantum Alliance Initiative," says the Hudson Institute program's director, Arthur Herman. "D-Wave was one of the earliest pioneers in bringing quantum-based technology directly into the mainstream commercial sector. Quantum information science will dominate the 21st century; we at QAI are happy to have D-Wave joining us in helping to shape that future."

D-Wave's mission has always been centered on practical quantum computing and on building technology that businesses, governments, universities, and other organizations across the globe can harness to create real-world value and impact, today. Joining QAI's impressive international quantum community will allow the company to continue championing policies that will further quantum computing's development, progress, and future on an international political stage.

"D-Wave is proud to join the other members of the Quantum Alliance Initiative in fostering an increased understanding of current quantum capabilities and to support policy initiatives for the industry," said Allison Schwartz, Vice President, Global Government Relations & Public Affairs at D-Wave. "QAI has worked with global policymakers to increase quantum education, promote use of the technology, and showcase viable use cases today and in the future. Through this relationship, D-Wave will add to the discussions around quantum policy initiatives and contribute to an expanded global understanding of the industry and technology capabilities."

To learn more about D-Wave's quantum technology and use cases, visit their website. For more information on Hudson Institute's QAI, click here.

About D-Wave Government Inc.

D-Wave is the leader in the development and delivery of quantum computing technology, software, and services, and the world's first commercial supplier of quantum computers. D-Wave Government Inc., a U.S. subsidiary, was formed in 2013 to provide D-Wave's quantum computing technology to the U.S. government. D-Wave's quantum technology has been used by some of the world's most advanced organizations, including Forschungszentrum Jülich, Lockheed Martin, Google, NASA Ames, Oak Ridge National Laboratory, and Los Alamos National Laboratory. D-Wave has been granted more than 200 US patents and has published over 100 scientific papers, many of which have appeared in leading science journals including Nature, Science, and Nature Communications.

Source: D-Wave

Read the rest here:
D-Wave Joins the Hudson Institute's Quantum Alliance Initiative - HPCwire

Meet the NSA spies shaping the future – MIT Technology Review

Future history

The NSA's Research Directorate is descended from the Black Chamber, the first group of civilian codebreakers in the United States, who were tasked with spying on cutting-edge technology like the telegraph. Existing only from 1919 to 1929, the group decoded over 10,000 messages from a dozen nations, according to James Bamford's 2001 book Body of Secrets: Anatomy of the Ultra-Secret National Security Agency. In addition to groundbreaking cryptanalytic work, the group succeeded by securing surveillance help from American cable companies like Western Union, which could supply the newly minted US spies with sensitive communications to examine.

The Black Chamber was shut down amid scandal when US Secretary of State Henry Stimson found out the group was spying on American allies as well as foes. The incident foreshadowed the 1975 Church Committee, which investigated surveillance abuses by American intelligence agencies, and the 2013 Snowden leaks, which exposed vast electronic surveillance capabilities that triggered a global reckoning.

Just eight months after the Black Chamber was shuttered, the US, faced with the prospect of crippled spying capabilities in the increasingly unstable world of the 1930s, reformed the effort under the Army's Signals Intelligence Service. Among the founders of the SIS, which Bamford reports was kept secret from the State Department, was the mathematician Solomon Kullback, one of just three people working with the Black Chamber's old records.

Kullback was instrumental in breaking both Japanese and German codes before and during World War II, and he later directed the research and development arm of the newly formed National Security Agency. Within a year, that evolved into the directorate as we know it today: a distinct space for research that is not disrupted by the daily work of the agency.

"It's important to have a research organization, even in a mission-driven organization, to be thinking beyond a crisis," says Herrera, though he adds that the directorate does dedicate some of its work to the crisis of the day. It runs a program called "scientists on call," which allows NSA mission analysts facing technical challenges while interrogating information to ask for help via email, giving them access to hundreds of scientists.

But the lion's share of the directorate's work is envisioning the technologies that are generations ahead of what we have today. It operates almost like a small, elite technical college, organized around five academic departments (math, physics, cyber, computer science, and electrical engineering), each staffed with 100 to 200 people.

The cybersecurity department defends the federal government's national security and the country's military-industrial base. This is the highest-profile department, and deliberately so. Over the last five years, the previously shadowy NSA has become more vocal and active in cybersecurity. It has launched public advisories and research projects that would once have been anathema for an organization whose existence wasn't even acknowledged until 20 years after its founding.

Now the products of NSA research, like Ghidra, a free, sophisticated reverse engineering tool that helps in the technical dissection of hacking tools, as well as other software, are popular, trusted, and in use around the world. They serve as powerful cybersecurity tools, a recruiting pitch, and a public relations play all wrapped into one.

The physics department, which Herrera once directed, runs dozens of laboratories that conduct most of the directorate's work on quantum information science, but it has a much wider remit than that. As physical limits on the ability to squeeze more transistors into chips threaten to slow or halt 60 years of predictably rapid computing growth, its physicists are exploring new materials and novel computing architectures to drive the next generation of computing into a less predictable future, exactly the kind of task the directorate was given when it first came into existence.

Meanwhile, the electrical engineering department has been looking closely at the physics and engineering of telecommunications networks since the internet first arose. Beyond the issues around 5G, it tackles every facet of the digital world, from undersea cables to satellite communications.

Some prospects on the horizon don't fit neatly into any particular box. The computer science department's work on artificial intelligence and machine learning, for example, cuts across cybersecurity missions and data analysis work with the mathematicians.

Herrera repeatedly raises the prospect of the directorate needing to develop greater capabilities in and understanding of rapidly advancing fields like synthetic biology. The NSA is hardly alone in this: Chinese military leaders have called biotech a priority for national defense.

"Much of the competition in the world now is not military," Herrera says. "Military competition is accelerating, but there is also dissemination of other technologies, like synthetic biologies, that are frankly alarming. The role of research is to help the NSA understand what the impact of those technologies will be. How much we actually get involved, I don't know, but these are areas we have to keep an eye on."

Original post:
Meet the NSA spies shaping the future - MIT Technology Review

The Worldwide Next Generation Computing Industry is Expected to Reach $7 Billion by 2027 – PRNewswire

DUBLIN, Feb. 1, 2022 /PRNewswire/ -- The "Next Generation Computing Market: Bio-Computing, Brain-Computer Interfaces, High Performance Computing, Nanocomputing, Neuromorphic Computing, Serverless Computing, Swarm Computing, and Quantum Computing 2022 - 2027" report has been added to ResearchAndMarkets.com's offering.

This next generation computing market report evaluates next generation computing technologies, use cases, and applications. Market readiness factors are considered along with the impact of different computational methods upon other emerging technologies.

The report provides analysis of leading-edge developments such as computer integration with human cognition via bio-computing and brain-computer interfaces. Other pioneering areas covered include leveraging developments in nanotechnology to develop more effective computing models and methods.

The report includes critical analysis of leading vendors and strategies, as well as next generation computing market sizing for the period 2022 - 2027.

Select Report Findings:

There are many technologies involved, including distributed computing (swarm computing), computational collaboration (bio-computing), improving the performance of existing supercomputers, and completely new computer architectures such as those associated with quantum computing. Each of these approaches has its own advantages and disadvantages. Many of these different computing architectures and methods stand alone in terms of their ability to solve market problems.

Next generation computing technologies covered in this report include:

More than simply an amalgamation of technologies, the next generation computing market is characterized by many different approaches to solving a plethora of computational challenges. Common factors driving the market include the need for ever-increasing computation speed and efficiency, reduced energy consumption, miniaturization, and evolving architectures and business models.

High-performance Computing

High-performance computing (HPC) solves complex computational problems using supercomputers together with parallel computational techniques, processing algorithms, and systems. HPC leverages various techniques, including computer modeling, simulation, and analysis, to solve advanced computational problems and support research activities while allowing concurrent use of computing resources.

Quantum Computing

The commercial introduction of quantum computing is anticipated both to solve problems and to create new ones, as previously intractable computations become feasible. This multiplicity of developments within next generation computing makes it difficult for enterprise and government users to make decisions about infrastructure, software, and services.

Biocomputing

Biocomputing refers to the construction and use of computers built from biologically derived molecules, including DNA and proteins, to perform computational tasks such as storing, retrieving, and processing data. Such a computing system functions more like a living organism, or contains biological components.

Neuromorphic Computing

Neuromorphic computing refers to the implementation of neural functions such as perception, motor control, and multisensory integration on very-large-scale integration systems that combine analog, digital, or mixed-mode circuits with software systems.

Neuromorphic computing leverages the techniques of neuromorphic engineering that takes inspiration from biology, physics, mathematics, computer science, and electronic engineering to develop artificial neural systems including vision systems, head-eye systems, auditory processors, and autonomous robots.

Nanocomputing

Nanocomputing refers to miniature computing devices (within 100 nanometers) used to perform critical tasks such as the representation and manipulation of data. Nanocomputing is expected to revolutionize the way traditional computing is used in certain key industry verticals, enabling progress in device technology, computer architectures, and IC processing. This technology area will also help to substantially advance implantable technologies inserted into the human body, primarily for healthcare solutions.

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction

3.0 Technology and Application Analysis
3.1 High Performance Computing
3.1.1 HPC Technology
3.1.2 Exascale Computation
3.1.2.1 Exascale Supercomputer Development
3.1.2.1.1 United States
3.1.2.1.2 China
3.1.2.1.3 Europe
3.1.2.1.4 Japan
3.1.2.1.5 India
3.1.2.1.6 Taiwan
3.1.3 Supercomputers
3.1.4 High Performance Technical Computing
3.1.5 Market Segmentation Considerations
3.1.6 Use Cases and Application Areas
3.1.6.1 Computer Aided Engineering
3.1.6.2 Government
3.1.6.3 Financial Services
3.1.6.4 Education and Research
3.1.6.5 Manufacturing
3.1.6.6 Media and Entertainment
3.1.6.7 Electronic Design Automation
3.1.6.8 Bio-Sciences and Healthcare
3.1.6.9 Energy Management and Utilities
3.1.6.10 Earth Science
3.1.7 Regulatory Framework
3.1.8 Value Chain Analysis
3.1.9 AI to Drive HPC Performance and Adoption
3.2 Swarm Computing
3.2.1 Swarm Computing Technology
3.2.1.1 Ant Colony Optimization
3.2.1.2 Particle Swarm Optimization
3.2.1.3 Stochastic Diffusion Search
3.2.2 Swarm Intelligence
3.2.3 Swarm Computing Capabilities
3.2.4 Value Chain Analysis
3.2.5 Regulatory Framework
3.3 Neuromorphic Computing
3.3.1 Neuromorphic Computing Technology
3.3.2 Neuromorphic Semiconductor
3.3.2.1 Hardware Neurons
3.3.2.2 Implanted Memory
3.3.3 Neuromorphic Application
3.3.4 Neuromorphic Market Explained
3.3.5 Value Chain Analysis
3.4 Biocomputing
3.4.1 Bioinformatics
3.4.2 Computational Biology and Drug Discovery
3.4.3 Biodata Mining and Protein Simulations
3.4.4 Biocomputing Platform and Services
3.4.5 Biocomputing Application
3.4.6 Biocomputing Products
3.4.7 Value Chain Analysis
3.5 Quantum Computing
3.5.1 Quantum Simulation, Sensing and Communication
3.5.2 Quantum Cryptography
3.5.3 Quantum Computing Technology
3.5.4 Quantum Programming, Software and SDK
3.5.5 Quantum Computing Application
3.5.6 Value Chain Analysis
3.6 Serverless Computing
3.6.1 Serverless Computing Solution
3.6.2 Serverless Computing Application
3.6.2.1 Event Driven Computing
3.6.2.2 Live Video Broadcasting
3.6.2.3 Processing IoT Data
3.6.2.4 Shared Delivery Dispatch System
3.6.2.5 Web Application and Backends
3.6.2.6 Application Scalability
3.6.2.7 Sales Opportunities and Customer Experience
3.6.3 Value Chain Analysis
3.7 Brain Computer Interface Technology
3.7.1 BCI Overview
3.7.2 Invasive vs. Non-Invasive BCI
3.7.3 Partially Invasive BCI
3.7.4 BCI Applications
3.7.5 Silicon Electronics
3.7.6 Value Chain Analysis
3.8 Nanocomputing
3.8.1 Nanotechnology
3.8.2 Nanomaterials
3.8.3 DNA Nanocomputing
3.8.4 Nanocomputing Market
3.8.5 Value Chain
3.9 Artificial Intelligence and IoT
3.10 Edge Computing Network and 5G
3.11 Blockchain and Virtualization
3.12 Green Computing
3.13 Cognitive Computing

4.0 Company Analysis
4.1 Vendor Ecosystem
4.2 Leading Companies
4.2.1 ABM Inc.
4.2.2 Advanced Brain Monitoring Inc.
4.2.3 Advanced Diamond Technologies Inc.
4.2.4 Agilent Technologies Inc.
4.2.5 Alibaba Group Holding Limited
4.2.6 Amazon Web Services Inc.
4.2.7 Apium Swarm Robotics
4.2.8 Atos SE
4.2.9 Advanced Micro Devices Inc.
4.2.10 Robert Bosch GmbH
4.2.11 Cisco Systems
4.2.12 D-Wave Systems Inc.
4.2.13 DELL Technologies Inc.
4.2.14 Emotiv
4.2.15 Fujitsu Ltd
4.2.16 Google Inc.
4.2.17 Hewlett Packard Enterprise
4.2.18 Huawei Technologies Co. Ltd.
4.2.19 IBM Corporation
4.2.20 Intel Corporation
4.2.21 Keysight Technologies
4.2.22 Lockheed Martin Corporation
4.2.23 Microsoft Corporation
4.2.24 Mitsubishi Electric Corp.
4.2.25 NEC Corporation
4.2.26 Nokia Corporation
4.2.27 NVidia
4.2.28 Oracle Corporation
4.2.29 Qualcomm Inc.
4.2.30 Rackspace Inc.
4.3 Other Companies
4.3.1 Samsung Electronics Co. Ltd.
4.3.2 Toshiba Corporation
4.3.3 Waters Corporation
4.3.4 Gemalto N.V.
4.3.5 Juniper Networks Inc.
4.3.6 SAP SE
4.3.7 Siemens AG
4.3.8 Schneider Electric SE
4.3.9 Raytheon Company
4.3.10 1QB Information Technologies Inc.
4.3.11 Cambridge Quantum Computing Ltd.
4.3.12 MagiQ Technologies Inc.
4.3.13 Rigetti Computing
4.3.14 NTT Docomo Inc.
4.3.15 Booz Allen Hamilton Inc.
4.3.16 Airbus Group
4.3.17 Volkswagen AG
4.3.18 Iron.io
4.3.19 Serverless Inc.
4.3.20 LunchBadger
4.3.21 CA Technologies
4.3.22 TIBCO Software Inc.
4.3.23 Salesforce

5.0 Next Generation Computing Market Analysis and Forecasts
5.1 Overall Next Generation Computing Market
5.2 Next Generation Computing Market by Segment
5.3 High Performance Computing Market Forecasts
5.4 Swarm Computing Market Forecasts
5.5 Neuromorphic Computing Market Forecasts
5.6 Biocomputing Market Forecasts
5.7 Brain Computer Interface Market Forecasts
5.8 Serverless Computing Market Forecasts
5.9 Quantum Computing Market Forecasts
5.10 Nanocomputing Market Forecasts
5.11 NGC Market by Deployment Type
5.12 NGC Market by Enterprise Type
5.13 NGC Market by Connectivity Type
5.14 AI Solution Market in NGC
5.15 Big Data Analytics Solution Market in NGC
5.16 NGC Market in IoT
5.17 NGC Market in Edge Network
5.18 NGC Market in Blockchain
5.19 Next Generation Computing Market in Smart Cities
5.20 Next Generation Computing Market in 5G
5.21 Next Generation Computing Market by Region

6.0 Conclusions and Recommendations

For more information about this report visit https://www.researchandmarkets.com/r/l5j5dc

Media Contact:

Research and Markets
Laura Wood, Senior Manager
[emailprotected]

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

See the article here:
The Worldwide Next Generation Computing Industry is Expected to Reach $7 Billion by 2027 - PRNewswire

Riverlane taking quantum computing to fresh frontiers | Business Weekly – Business Weekly

Cambridge-based quantum engineering company Riverlane is at the heart of two related initiatives to troubleshoot problems and advance risk-free adoption worldwide.

It has head-hunted leading scientist Dr Earl Campbell to accelerate efforts to solve quantum error correction, and only last month joined an influential consortium to build an error-corrected quantum processor.

As head of architecture, Dr Campbell will lead technical development to support the operating system for fault-tolerant quantum computers.

He joins Riverlane from Amazon Web Services' quantum computing group and has held a number of academic positions over the past 16 years. His game-changing efforts include leading contributions to quantum error correction, fault-tolerant quantum logic and compilation, and quantum algorithms.

He has also made pioneering contributions to random compilers, including the qDRIFT algorithm, which is the only known efficient method for simulating systems with highly complex interactions.

Additionally, while working with IBM and University College London, Earl contributed to the development of near-Clifford emulators that were integrated into Qiskit, IBM's open-source software development kit for quantum computers.

At Amazon Web Services he was a leading contributor to its paper proposing a novel quantum computing architecture and established a team working on quantum algorithms.

At Riverlane he will be working alongside leaders who have joined from Microsoft, ARM, Samsung, Intel and the White House! Backed by some of Europe's leading venture-capital funds and the University of Cambridge, Riverlane is bringing together leading talent from the worlds of business, academia, and industry to design its modular operating system to work with all hardware providers, whatever the type of qubit.

Riverlane has already partnered with a third of the world's quantum computing hardware companies and has successfully tested Deltaflow.OS with multiple hardware approaches, including trapped ions and superconducting circuits.

Dr Campbell said: "Error correction is the next defining challenge in quantum computing and we will need to deliver fast, effective software to solve it. Over the past 16 years, I have been tackling questions like this as an academic and I'm looking forward to putting theory into practice.

"I've followed Riverlane since its early days and I've always been drawn to challenging work with the promise of delivering widespread social and commercial impact. I'm excited to join a diverse team with a proven track record in developing software used by hardware companies around the world."

Steve Brierley, CEO and founder of Riverlane, added: "Solving error correction will be key to unlocking quantum usefulness across a range of foundational challenges, including clean energy, drug discovery, material science, and advanced chemistry.

"We're delighted that Earl is bringing his world-class expertise in this challenge to the Riverlane team to accelerate our efforts and unlock the potential of this technology."

Just before Christmas, Riverlane joined a £7.5 million consortium to build an error-corrected quantum processor, working with a range of UK partners, including Rolls-Royce, to apply this toward new applications in the aerospace industry. The funding comes via the UK government's National Quantum Technologies Programme.

The project, led by quantum computer manufacturer Universal Quantum, calls on Riverlane's software and expertise to tackle quantum error correction on a trapped-ion quantum computer.

Error correction is a crucial step in unlocking the promise of fault-tolerant quantum computers capable of a range of transformative applications, and is at the core of everything Riverlane does.
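As a loose classical analogy for what error correction buys (quantum codes are far more subtle, since qubits cannot simply be copied and errors must be inferred from syndrome measurements), the sketch below protects one logical bit with a three-bit repetition code and recovers it by majority vote after a single bit flip.

```python
def encode(bit):
    return [bit, bit, bit]                 # one logical bit -> three physical bits

def decode(bits):
    return 1 if sum(bits) >= 2 else 0      # majority vote corrects any single flip

logical = 1
codeword = encode(logical)
codeword[1] ^= 1                           # a single bit-flip error strikes one copy
assert decode(codeword) == logical         # the logical bit survives

# With independent flip probability p per physical bit, the logical bit is lost only
# when two or more copies flip (~3p^2), far below p for small p. Quantum codes pursue
# the same error suppression, but via syndrome measurements rather than copying.
```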

The work with Rolls-Royce will explore how quantum computers can develop practical applications toward the development of more sustainable and efficient jet engines.

This starts by applying quantum algorithms to take steps toward a greater understanding of how liquids and gases flow, a field known as fluid dynamics. Simulating such flows accurately is beyond the computational capacity of even the most powerful classical computers today.

The consortium also includes: academic researchers from Imperial College London and the University of Sussex; the Science and Technology Facilities Council (STFC) Hartree Centre; supply chain partners Edwards, TMD Technologies and Diamond Microwave; and commercialisation and dissemination experts Sia Partners and Qureca.

Fluids behave according to a famous set of partial differential equations called the Navier-Stokes equations, the solutions to which are important for aircraft and engine design, as well as understanding ocean currents and predicting the weather.
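For reference, the incompressible form of those equations couples a velocity field \(\mathbf{u}\) and a pressure \(p\):

\[
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0,
\]

where \(\rho\) is the fluid's density and \(\nu\) its kinematic viscosity; the nonlinear term \((\mathbf{u}\cdot\nabla)\,\mathbf{u}\) is what makes accurate simulation so computationally demanding.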

Classical computers can take months or even years to solve some types of these equations but recent research has shown that quantum computers could find the solutions much more quickly.

Link:
Riverlane taking quantum computing to fresh frontiers | Business Weekly - Business Weekly