Archive for the ‘Quantum Computer’ Category

University of Glasgow Partners with Oxford Instruments NanoScience on Quantum Computing – HPCwire

Jan. 21, 2021 - Today, the University of Glasgow, a pioneering institution in quantum technology development and home of the Quantum Circuits Group, announced it is using Oxford Instruments' next-generation Cryofree refrigerator, Proteox, as part of its research to accelerate the commercialisation of quantum computing in the UK.

"We're excited to be using Proteox, the latest in cryogen-free refrigeration technology, and to have the system up and running in our lab," comments Professor Martin Weides, Head of the Quantum Circuits Group. "Oxford Instruments is a long-term strategic partner, and today's announcement highlights the importance of our close collaboration to the future of quantum computing development. Proteox is designed with quantum scale-up in mind, and through the use of its Secondary Insert technology, we're able to easily characterise and develop integrated chips and components for quantum computing applications."

The University of Glasgow; its subsidiary and commercialisation partner, Kelvin Nanotechnology; and Oxford Instruments NanoScience are part of a larger consortium supported by funding from Innovate UK, the UK's innovation agency, granted in April 2020. The consortium partners will boost quantum technology development through the design, manufacture, and testing of superconducting quantum devices.

"Today's announcement demonstrates the major contribution Oxford Instruments is making towards pioneering quantum technology work in the UK," states Stuart Woods, Managing Director of Oxford Instruments NanoScience. "With our 60 years of experience in in-house component production and global service support, we are accelerating the commercialisation of quantum to discover what's next, supporting our customers across the world."

Proteox is a next-generation Cryofree system that provides a step change in modularity and adaptability for ultra-low temperature experiments in condensed-matter physics and quantum computing industrialisation. The Proteox platform has been developed to provide a single, interchangeable modular solution that can support multiple users and a variety of set-ups or experiments. It also includes remote management software, integral to the system design, that enables the system to be managed from anywhere in the world. To find out more, visit nanoscience.oxinst.com/proteox.

About Oxford Instruments NanoScience

Oxford Instruments NanoScience designs, supplies and supports market-leading research tools that enable quantum technologies, new materials and device development in the physical sciences. Our tools support research down to the atomic scale through creation of high-performance, cryogen-free low temperature and magnetic environments, based upon our core technologies in low and ultra-low temperatures, high magnetic fields and system integration, with ever-increasing levels of experimental and measurement readiness. Oxford Instruments NanoScience is a part of the Oxford Instruments plc group.

Glasgow's Quantum Circuits Group can be found here: https://www.gla.ac.uk/schools/engineering/research/divisions/ene/researchthemes/micronanotechnology/quantumcircuits/

Source: University of Glasgow


The Worldwide Quantum Computing Industry will Exceed $7.1 Billion by 2026 – GlobeNewswire

Dublin, Jan. 19, 2021 (GLOBE NEWSWIRE) -- The "Quantum Computing Market by Technology, Infrastructure, Services, and Industry Verticals 2021 - 2026" report has been added to ResearchAndMarkets.com's offering.

This report assesses the technology, companies/organizations, R&D efforts, and potential solutions facilitated by quantum computing. The report provides global and regional forecasts, as well as the outlook for the impact of quantum computing on infrastructure including hardware, software, applications, and services from 2021 to 2026. This includes the quantum computing market across major industry verticals.

While classical (non-quantum) computers make the modern digital world possible, there are many tasks that cannot be solved using conventional computational methods because of limitations in processing power. For example, fourth-generation computers cannot perform multiple computations at one time with a single processor. Physical phenomena at the nanoscale indicate that a quantum computer is capable of computational feats that are orders of magnitude beyond conventional methods.

This is due to the use of the quantum bit (qubit), which may exist as a zero or a one (as in classical computing) or may exist in two states simultaneously (0 and 1 at the same time) due to the superposition principle of quantum physics. This enables greater processing power than the normal binary (zero only or one only) representation of data.

Whereas parallel computing is achieved in classical computers by linking processors together, quantum computers may conduct multiple computations with a single processor. This is referred to as quantum parallelism and is a major difference between hyper-fast quantum computers and speed-limited classical computers.
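The superposition idea above can be sketched with a toy state-vector simulation (plain Python, purely illustrative; the names `hadamard` and `uniform_register` are my own, and real quantum hardware is not programmed this way):

```python
import math

# A qubit's state is a pair of amplitudes for the basis states |0> and |1>.
zero = [1.0, 0.0]                     # definitely 0

def hadamard(state):
    """Apply a Hadamard gate: H|0> = (|0> + |1>) / sqrt(2), an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

state = hadamard(zero)
probs = [amp ** 2 for amp in state]   # measurement probabilities
# probs is approximately [0.5, 0.5]: equal chance of observing 0 or 1

# An n-qubit register in uniform superposition carries 2**n amplitudes at once,
# which is the intuition behind the "quantum parallelism" described above.
def uniform_register(n):
    dim = 2 ** n
    return [1 / math.sqrt(dim)] * dim

register = uniform_register(3)        # 3 qubits -> 8 amplitudes
```

A caveat the report's framing glosses over: amplitudes are complex in general, and measurement collapses the register to a single outcome, so extracting all 2**n values at once is not possible; the art of quantum algorithms is arranging interference so useful answers survive measurement.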

Quantum computing is anticipated to support many new and enhanced capabilities including:

Target Audience:

Select Report Findings:

Report Benefits:

Key Topics Covered:

1.0 Executive Summary

2.0 Introduction

3.0 Technology and Market Analysis
3.1 Quantum Computing State of the Industry
3.2 Quantum Computing Technology Stack
3.3 Quantum Computing and Artificial Intelligence
3.4 Quantum Neurons
3.5 Quantum Computing and Big Data
3.6 Linear Optical Quantum Computing
3.7 Quantum Computing Business Model
3.8 Quantum Software Platform
3.9 Application Areas
3.10 Emerging Revenue Sectors
3.11 Quantum Computing Investment Analysis
3.12 Quantum Computing Initiatives by Country
3.12.1 USA
3.12.2 Canada
3.12.3 Mexico
3.12.4 Brazil
3.12.5 UK
3.12.6 France
3.12.7 Russia
3.12.8 Germany
3.12.9 Netherlands
3.12.10 Denmark
3.12.11 Sweden
3.12.12 Saudi Arabia
3.12.13 UAE
3.12.14 Qatar
3.12.15 Kuwait
3.12.16 Israel
3.12.17 Australia
3.12.18 China
3.12.19 Japan
3.12.20 India
3.12.21 Singapore

4.0 Quantum Computing Drivers and Challenges
4.1 Quantum Computing Market Dynamics
4.2 Quantum Computing Market Drivers
4.2.1 Growing Adoption in Aerospace and Defense Sectors
4.2.2 Growing Investment of Governments
4.2.3 Emergence of Advanced Applications
4.3 Quantum Computing Market Challenges

5.0 Quantum Computing Use Cases
5.1 Quantum Computing in Pharmaceuticals
5.2 Applying Quantum Technology to Financial Problems
5.3 Accelerate Autonomous Vehicles with Quantum AI
5.4 Car Manufacturers using Quantum Computing
5.5 Accelerating Advanced Computing for NASA Missions

6.0 Quantum Computing Value Chain Analysis
6.1 Quantum Computing Value Chain Structure
6.2 Quantum Computing Competitive Analysis
6.2.1 Leading Vendor Efforts
6.2.2 Start-up Companies
6.2.3 Government Initiatives
6.2.4 University Initiatives
6.2.5 Venture Capital Investments
6.3 Large Scale Computing Systems

7.0 Company Analysis
7.1 D-Wave Systems Inc.
7.1.1 Company Overview
7.1.2 Product Portfolio
7.1.3 Recent Development
7.2 Google Inc.
7.2.1 Company Overview
7.2.2 Product Portfolio
7.2.3 Recent Development
7.3 Microsoft Corporation
7.3.1 Company Overview
7.3.2 Product Portfolio
7.3.3 Recent Development
7.4 IBM Corporation
7.4.1 Company Overview
7.4.2 Product Portfolio
7.4.3 Recent Development
7.5 Intel Corporation
7.5.1 Company Overview
7.5.2 Product Portfolio
7.5.3 Recent Development
7.6 Nokia Corporation
7.6.1 Company Overview
7.6.2 Product Portfolio
7.6.3 Recent Developments
7.7 Toshiba Corporation
7.7.1 Company Overview
7.7.2 Product Portfolio
7.7.3 Recent Development
7.8 Raytheon Company
7.8.1 Company Overview
7.8.2 Product Portfolio
7.8.3 Recent Development
7.9 Other Companies
7.9.1 1QB Information Technologies Inc.
7.9.1.1 Company Overview
7.9.1.2 Recent Development
7.9.2 Cambridge Quantum Computing Ltd.
7.9.2.1 Company Overview
7.9.2.2 Recent Development
7.9.3 QC Ware Corp.
7.9.3.1 Company Overview
7.9.3.2 Recent Development
7.9.4 MagiQ Technologies Inc.
7.9.4.1 Company Overview
7.9.5 Rigetti Computing
7.9.5.1 Company Overview
7.9.5.2 Recent Development
7.9.6 Anyon Systems Inc.
7.9.6.1 Company Overview
7.9.7 Quantum Circuits Inc.
7.9.7.1 Company Overview
7.9.7.2 Recent Development
7.9.8 Hewlett Packard Enterprise (HPE)
7.9.8.1 Company Overview
7.9.8.2 Recent Development
7.9.9 Fujitsu Ltd.
7.9.9.1 Company Overview
7.9.9.2 Recent Development
7.9.10 NEC Corporation
7.9.10.1 Company Overview
7.9.10.2 Recent Development
7.9.11 SK Telecom
7.9.11.1 Company Overview
7.9.11.2 Recent Development
7.9.12 Lockheed Martin Corporation
7.9.12.1 Company Overview
7.9.13 NTT Docomo Inc.
7.9.13.1 Company Overview
7.9.13.2 Recent Development
7.9.14 Alibaba Group Holding Limited
7.9.14.1 Company Overview
7.9.14.2 Recent Development
7.9.15 Booz Allen Hamilton Inc.
7.9.15.1 Company Overview
7.9.16 Airbus Group
7.9.16.1 Company Overview
7.9.16.2 Recent Development
7.9.17 Amgen Inc.
7.9.17.1 Company Overview
7.9.17.2 Recent Development
7.9.18 Biogen Inc.
7.9.18.1 Company Overview
7.9.18.2 Recent Development
7.9.19 BT Group
7.9.19.1 Company Overview
7.9.19.2 Recent Development
7.9.20 Mitsubishi Electric Corp.
7.9.20.1 Company Overview
7.9.21 Volkswagen AG
7.9.21.1 Company Overview
7.9.21.2 Recent Development
7.9.22 KPN
7.9.22.1 Recent Development
7.10 Ecosystem Contributors
7.10.1 Agilent Technologies
7.10.2 Artiste-qb.net
7.10.3 Avago Technologies
7.10.4 Ciena Corporation
7.10.5 Eagle Power Technologies Inc
7.10.6 Emcore Corporation
7.10.7 Enablence Technologies
7.10.8 Entanglement Partners
7.10.9 Fathom Computing
7.10.10 Alpine Quantum Technologies GmbH
7.10.11 Atom Computing
7.10.12 Black Brane Systems
7.10.13 Delft Circuits
7.10.14 EeroQ
7.10.15 Everettian Technologies
7.10.16 EvolutionQ
7.10.17 H-Bar Consultants
7.10.18 Horizon Quantum Computing
7.10.19 ID Quantique (IDQ)
7.10.20 InfiniQuant
7.10.21 IonQ
7.10.22 ISARA
7.10.23 KETS Quantum Security
7.10.24 Magiq
7.10.25 MDR Corporation
7.10.26 Nordic Quantum Computing Group (NQCG)
7.10.27 Oxford Quantum Circuits
7.10.28 Post-Quantum (PQ Solutions)
7.10.29 ProteinQure
7.10.30 PsiQuantum
7.10.31 Q&I
7.10.32 Qasky
7.10.33 QbitLogic
7.10.34 Q-Ctrl
7.10.35 Qilimanjaro Quantum Hub
7.10.36 Qindom
7.10.37 Qnami
7.10.38 QSpice Labs
7.10.39 Qu & Co
7.10.40 Quandela
7.10.41 Quantika
7.10.42 Quantum Benchmark Inc.
7.10.43 Quantum Circuits Inc. (QCI)
7.10.44 Quantum Factory GmbH
7.10.45 QuantumCTek
7.10.46 Quantum Motion Technologies
7.10.47 QuantumX
7.10.48 Qubitekk
7.10.49 Qubitera LLC
7.10.50 Quintessence Labs
7.10.51 Qulab
7.10.52 Qunnect
7.10.53 QuNu Labs
7.10.54 River Lane Research (RLR)
7.10.55 SeeQC
7.10.56 Silicon Quantum Computing
7.10.57 Sparrow Quantum
7.10.58 Strangeworks
7.10.59 Tokyo Quantum Computing (TQC)
7.10.60 TundraSystems Global Ltd.
7.10.61 Turing
7.10.62 Xanadu
7.10.63 Zapata Computing
7.10.64 Accenture
7.10.65 Atos Quantum
7.10.66 Baidu
7.10.67 Northrop Grumman
7.10.68 Quantum Computing Inc.
7.10.69 Keysight Technologies
7.10.70 Nano-Meta Technologies
7.10.71 Optalysys Ltd.

8.0 Quantum Computing Market Analysis and Forecasts 2021 - 2026
8.1.1 Quantum Computing Market by Infrastructure
8.1.1.1 Quantum Computing Market by Hardware Type
8.1.1.2 Quantum Computing Market by Application Software Type
8.1.1.3 Quantum Computing Market by Service Type
8.1.1.3.1 Quantum Computing Market by Professional Service Type
8.1.2 Quantum Computing Market by Technology Segment
8.1.3 Quantum Computing Market by Industry Vertical
8.1.4 Quantum Computing Market by Region
8.1.4.1 North America Quantum Computing Market by Infrastructure, Technology, Industry Vertical, and Country
8.1.4.2 European Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.3 Asia-Pacific Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.4 Middle East & Africa Quantum Computing Market by Infrastructure, Technology, and Industry Vertical
8.1.4.5 Latin America Quantum Computing Market by Infrastructure, Technology, and Industry Vertical

9.0 Conclusions and Recommendations

10.0 Appendix: Quantum Computing and Classical HPC
10.1 Next Generation Computing
10.2 Quantum Computing vs. Classical High-Performance Computing
10.3 Artificial Intelligence in High Performance Computing
10.4 Quantum Technology Market in Exascale Computing

For more information about this report visit https://www.researchandmarkets.com/r/omefq7


3 tech trends that COVID-19 will accelerate in 2021 – VentureBeat

Spending 2020 under the shadow of a pandemic has affected what we need and expect from technology. For many, COVID-19 accelerated the rate of digital transformation: as employees worked from home, companies needed AI systems that facilitated remote work and the computing power to support them.

The question is, how should companies focus their resources in 2021 to prepare for this changed reality and the new technologies on the horizon? Here are three trends that I predict will see massive attention in 2021 and beyond.

Progress in AI has already reached a point where it can add significant value to practically any business. COVID-19 triggered a massive sense of urgency around digital transformations with the need for remote solutions. According to a report by Boston Consulting Group, more than 80% of companies plan to accelerate their digital transformation, but only 30% of digital transformations have met or exceeded their target value.

Many AI projects are small in scale: fewer than a quarter of companies in McKinsey's 2020 State of AI survey reported significant bottom-line impact. This is especially true in industries that have a physical-digital element. For example, there is a great need for remotely operated, autonomous manufacturing facilities, refineries, or even, in the days of COVID-19, office buildings. While the underlying technology is there, achieving scalability remains a concern, and digital leaders will have to overcome that barrier in 2021. Scalability barriers include a lack of disciplined approach, enterprise-wide mindset, credible partners, data liquidity, and change management.

Part of the solution here is to create tools that can be operated by someone who is not necessarily a data scientist, so that more domain experts can manage the programs they need. If Tesla invented an autonomous car that only data scientists could drive, what would be the point?

Technology needs to empower the end user so they can interact with and manipulate models without having to trudge through the finer points of datasets or code. In other words, the AI does the heavy lifting on the back end, while a user-friendly explanation and UI empower the end user. For instance, a facilities management executive can manage their global portfolio of buildings from a tablet while sitting at a Starbucks. They can have full visibility into operations, occupant experience, and spend, with the ability to intervene in what would otherwise be an autonomous operation.

Deep learning pioneer Dr. Geoffrey Hinton recently told MIT Technology Review that deep learning "will be able to do everything," i.e., replicate all human intelligence. Deep neural networks have demonstrated extraordinary capabilities to approximate the most relevant subset of mathematical functions and promise to overcome reasoning challenges.

However, I believe there is a step to full autonomy that we must first conquer: what Dr. Manuela Veloso at Carnegie Mellon calls symbiotic autonomy. With symbiotic autonomy, feedback and correction mechanisms are incorporated into the AI such that humans and machines pass information to each other fluidly.

For example, instead of hard feedback (like the thumbs up and thumbs down powering your Netflix queue), symbiotic autonomy could look like a discussion with your phone's virtual assistant to determine the best route to a destination. Interactions with these forms of AI would be more natural and conversational, with the program able to explain why it recommended or performed certain actions.

With deep learning, neural networks approximate complex mathematical functions with simpler ones; their growing capacity to weigh more factors and make smarter decisions with fewer computing resources is what allows them to become autonomous. I anticipate heavy investment in research into these abilities of deep neural networks across the board, from startups to top tech companies to universities.

This step toward fully autonomous solutions will be critical to implementing AI at scale. Imagine an enterprise performance management system that gives you a single pane of visibility and control across a global enterprise operating multiple facilities, workers, and supply chains autonomously. It runs and learns on its own, but you can intervene and teach it when it makes a mistake.

(The question of ethics in autonomous systems will come into play here, but that is a subject for another article.)

Quantum computers have the computational power to handle complex algorithms due to their ability to process solutions in parallel rather than sequentially. Let's think about how this could affect the development and delivery of vaccines.

First, during drug discovery, researchers must simulate a new molecule. This is tremendously challenging to do with today's high-performance computers, but it is the kind of problem at which quantum computers will eventually excel. A quantum computer could eventually be mapped to the quantum system that is the molecule, simulating binding energies and chemical transition strengths before anyone ever has to make the drug.

However, AI and quantum computing have even more to offer beyond creating the vaccine. The logistics of manufacturing and delivering the vaccine are massive computational challenges, which of course makes them ripe for a solution that combines quantum computing and AI.

Quantum machine learning is an extremely new field with much promise, but breakthroughs are needed for it to catch investors' attention. Tech visionaries can already start to see how it is going to impact our future, especially with respect to understanding nanoparticles, creating new materials through molecular and atomic maps, and glimpsing the deeper makeup of the human body.

The area of growth I am most excited about is the intersection of research in these systems, which I believe will start to combine and produce results greater than the sum of their parts. While there have been some connections between AI and quantum computing, or 5G and AI, all of these technologies working together could produce exponential results.

I'm particularly excited to see how AI, quantum, and other tech will influence biotechnology, as that might be the secret to superhuman capabilities. And what could be more exciting than that?

Usman Shuja is General Manager at Honeywell.


Securing the DNS in a Post-Quantum World: Hash-Based Signatures and Synthesized Zone Signing Keys – CircleID

This is the fifth in a multi-part series on cryptography and the Domain Name System (DNS).

In my last article, I described efforts underway to standardize new cryptographic algorithms that are designed to be less vulnerable to potential future advances in quantum computing. I also reviewed operational challenges to be considered when adding new algorithms to the DNS Security Extensions (DNSSEC).

In this post, I'll look at hash-based signatures, a family of post-quantum algorithms that could be a good match for DNSSEC from the perspective of infrastructure stability.

I'll also describe Verisign Labs research into a new concept called synthesized zone signing keys that could mitigate the impact of the large signature size for hash-based signatures, while still maintaining this family's protections against quantum computing.

(Caveat: The concepts reviewed in this post are part of Verisign's long-term research program and do not necessarily represent Verisign's plans or positions on new products or services. Concepts developed in our research program may be subject to U.S. and/or international patents and/or patent applications.)

The DNS community's root key signing key (KSK) rollover illustrates how complicated a change to DNSSEC infrastructure can be. Although successfully accomplished, this change was delayed by ICANN to ensure that enough resolvers had the public key required to validate signatures generated with the new root KSK private key.

Now imagine the complications if the DNS community also had to ensure that enough resolvers not only had a new key but also had a brand-new algorithm.

Imagine further what might happen if a weakness in this new algorithm were to be found after it was deployed. While there are procedures for emergency key rollovers, emergency algorithm rollovers would be more complicated, and perhaps controversial as well if a clear successor algorithm were not available.

I'm not suggesting that any of the post-quantum algorithms that might be standardized by NIST will be found to have a weakness. But confidence in cryptographic algorithms can be gained and lost over many years, sometimes decades.

From the perspective of infrastructure stability, therefore, it may make sense for DNSSEC to have a backup post-quantum algorithm built in from the start: one for which cryptographers already have significant confidence and experience. This algorithm might not be as efficient as other candidates, but there is less of a chance that it would ever need to be changed. This means the more efficient candidates could be deployed in DNSSEC with the confidence that they have a stable fallback. It's also important to keep in mind that the prospect of quantum computing is not the only reason system developers need to consider new algorithms from time to time. As public-key cryptography pioneer Martin Hellman wisely cautioned, new classical (non-quantum) attacks could also emerge, whether or not a quantum computer is realized.

The 1970s were a foundational time for public-key cryptography, producing not only the RSA algorithm and the Diffie-Hellman algorithm (which also provided the basic model for elliptic curve cryptography), but also hash-based signatures, invented in 1979 by another public-key cryptography founder, Ralph Merkle.

Hash-based signatures are interesting because their security depends only on the security of an underlying hash function.

It turns out that hash functions, as a concept, hold up very well against quantum computing advances, much better than currently established public-key algorithms do.

This means that Merkle's hash-based signatures, now more than 40 years old, can rightly be considered the oldest post-quantum digital signature algorithm.

If it turns out that an individual hash function doesn't hold up, whether against a quantum computer or a classical computer, then the hash function itself can be replaced, as cryptographers have been doing for years. That will likely be easier than changing to an entirely different post-quantum algorithm, especially one that involves very different concepts.
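To see why the security of this family reduces to the hash function alone, here is a minimal sketch of a Lamport one-time signature, the 1970s-era building block that Merkle-tree schemes such as XMSS and LMS extend to many messages (illustrative only, not a production implementation; each key pair must sign exactly one message, and the function names here are my own):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Two 32-byte secrets per bit of the 256-bit message digest:
    # one is revealed if the bit is 0, the other if it is 1.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]   # public key = hashes of the secrets
    return sk, pk

def digest_bits(message: bytes):
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message: bytes, sk):
    # Reveal one secret per digest bit; this consumes the one-time key.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(message))]

def verify(message: bytes, sig, pk) -> bool:
    # Hash each revealed secret and check it against the public key.
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(digest_bits(message)))

sk, pk = keygen()
sig = sign(b"example zone data", sk)
assert verify(b"example zone data", sig, pk)
assert not verify(b"tampered zone data", sig, pk)
```

Forging a signature requires inverting the hash function, which is exactly the property believed to survive quantum attack. Note the cost: this signature reveals 256 secrets of 32 bytes each, on the order of 65,000 bits, which previews the size problem discussed later in the post.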

The conceptual stability of hash-based signatures is a reason that interoperable specifications are already being developed for variants of Merkle's original algorithm. Two approaches are described in RFC 8391, "XMSS: eXtended Merkle Signature Scheme" and RFC 8554, "Leighton-Micali Hash-Based Signatures." Another approach, SPHINCS+, is an alternate in NIST's post-quantum project.

Figure 1. Conventional DNSSEC signatures. DNS records are signed with the ZSK private key, and are thereby "chained" to the ZSK public key. The digital signatures may be hash-based signatures.

Hash-based signatures can potentially be applied to any part of the DNSSEC trust chain. For example, in Figure 1, the DNS record sets can be signed with a zone signing key (ZSK) that employs a hash-based signature algorithm.

The main challenge with hash-based signatures is that the signature size is large, on the order of tens or even hundreds of thousands of bits. This is perhaps why they haven't seen significant adoption in security protocols over the past four decades.

Verisign Labs has been exploring how to mitigate the size impact of hash-based signatures on DNSSEC while still basing security only on hash functions, in the interest of stable post-quantum protections.

One of the ideas we've come up with uses another of Merkle's foundational contributions: Merkle trees.

Merkle trees authenticate multiple records by hashing them together in a tree structure. The records are the "leaves" of the tree. Pairs of leaves are hashed together to form a branch, then pairs of branches are hashed together to form a larger branch, and so on. The hash of the largest branches is the tree's "root." (This is a data-structure root, unrelated to the DNS root.)

Each individual leaf of a Merkle tree can be authenticated by retracing the "path" from the leaf to the root. The path consists of the hashes of each of the adjacent branches encountered along the way.

Authentication paths can be much shorter than typical hash-based signatures. For instance, with a tree depth of 20 and a 256-bit hash value, the authentication path for a leaf would only be 5,120 bits long, yet a single tree could authenticate more than a million leaves.
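The tree construction and authentication-path retracing described above can be sketched as follows (an illustrative toy, not the DNSSEC wire format; the helper names `build_tree`, `auth_path`, and `verify` are my own):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    """Hash pairs upward until a single root remains; returns all levels."""
    level = [h(leaf) for leaf in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                    # duplicate last node if the level is odd
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def auth_path(levels, index):
    """Sibling hashes encountered on the way from a leaf to the root."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append(level[index ^ 1])          # the adjacent branch at this level
        index //= 2
    return path

def verify(leaf, index, path, root):
    """Retrace the path: rebuild the root from the leaf and its siblings."""
    node = h(leaf)
    for sibling in path:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

records = [f"record-{i}".encode() for i in range(8)]
levels = build_tree(records)
root = levels[-1][0]                           # this root plays the "synthesized ZSK" role
path = auth_path(levels, index=5)              # the record's "signature"
assert verify(records[5], 5, path, root)
```

With 8 leaves the path holds just 3 hashes; at a depth of 20 it would hold 20 hashes of 256 bits each, matching the 5,120-bit figure above, while the tree authenticates over a million leaves.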

Figure 2. DNSSEC signatures following the synthesized ZSK approach proposed here. DNS records are hashed together into a Merkle tree. The root of the Merkle tree is published as the ZSK, and the authentication path through the Merkle tree is the record's signature.

Returning to the example above, suppose that instead of signing each DNS record set with a hash-based signature, each record set were considered a leaf of a Merkle tree. Suppose further that the root of this tree were to be published as the ZSK public key (see Figure 2). The authentication path to the leaf could then serve as the record set's signature.

The validation logic at a resolver would be the same as in ordinary DNSSEC:

The only difference on the resolver's side would be that signature validation would involve retracing the authentication path to the ZSK public key, rather than a conventional signature validation operation.

The ZSK public key produced by the Merkle tree approach would be a "synthesized" public key, in that it is obtained from the records being signed. This is noteworthy from a cryptographer's perspective, because the public key wouldn't have a corresponding private key, yet the DNS records would still, in effect, be "signed by the ZSK!"

In this type of DNSSEC implementation, the Merkle tree approach only applies to the ZSK level. Hash-based signatures would still be applied at the KSK level, although their overhead would now be "amortized" across all records in the zone.

In addition, each new ZSK would need to be signed "on demand," rather than in advance, as in current operational practice.

This leads to tradeoffs, such as how many changes to accumulate before constructing and publishing a new tree. Fewer changes and the tree will be available sooner. More changes and the tree will be larger, so the per-record overhead of the signatures at the KSK level will be lower.

My last few posts have discussed cryptographic techniques that could potentially be applied to the DNS in the long term or that might not even be applied at all. In my next post, I'll return to more conventional subjects, and explain how Verisign sees cryptography fitting into the DNS today, as well as some important non-cryptographic techniques that are part of our vision for a secure, stable and resilient DNS.

Read the previous posts in this six-part blog series:


Insights on the High Performance Computing Global Market to 2026 – Featuring Amazon Web Services, Atos and Advanced Micro Devices Among Others -…

Dublin, Jan. 21, 2021 (GLOBE NEWSWIRE) -- The "High Performance Computing Market by Component, Infrastructure, Services, Price Band, HPC Applications, Deployment Types, Industry Verticals, and Regions 2021 - 2026" report has been added to ResearchAndMarkets.com's offering.

The High Performance Computing market includes computation solutions provided either by supercomputers or via parallel processing techniques such as leveraging clusters of computers to aggregate computing power. HPC is well-suited for applications that require high performance data computation and analysis such as high frequency trading, autonomous vehicles, genomics-based personalized medicine, computer-aided design, deep learning, and more. Specific examples include computational fluid dynamics, simulation, modeling, and seismic tomography.

This report evaluates the HPC market including companies, solutions, use cases, and applications. Analysis includes HPC by organizational size, software and system type, server type, price band, and industry vertical. The report also assesses the market for integration of various artificial intelligence technologies in HPC, and it evaluates the exascale-level HPC market, including analysis by component, hardware type, service type, and industry vertical.

Select Report Findings:

The market is currently dominated on the demand side by large corporations, universities, and government institutions, whose capabilities are often used to solve very specific problems for large institutions. Examples include financial services organizations, government R&D facilities, university research, etc.

However, the cloud-computing based "as a Service" model allows HPC market offerings to be extended via HPC-as-a-Service (HPCaaS) to a much wider range of industry verticals and companies, thereby providing computational services to solve a much broader array of problems. Industry use cases are increasingly emerging that benefit from HPC-level computing, many of which benefit from split processing between localized devices/platforms and HPCaaS.

In fact, HPCaaS is poised to become much more commonly available, partly due to new on-demand supercomputer service offerings and partly as a result of emerging AI-based tools for engineers. Accordingly, up to 52% of revenue will be directly attributable to the cloud-based business model via HPCaaS.

In a 2020 study, we conducted interviews with major players in the market as well as smaller, lesser-known companies believed to be influential in terms of innovative solutions likely to drive adoption and usage of both cluster-based HPC and supercomputing. In an effort to identify growth opportunities for the HPC market, we investigated market gaps, including unserved and underserved markets and submarkets. The research uncovered a market situation in which HPC currently suffers from an accessibility problem as well as inefficiencies and supercomputer skill gaps.

Stated differently, the market for HPC as a Service (e.g. access to high-performance computing services) currently suffers from problems related to the utilization, scheduling, and set-up time to run jobs on a supercomputer. We identified start-ups and small companies working to solve these problems.

One of the challenge areas identified is low utilization combined, ironically, with high wait times for most supercomputers. Scheduling can be a challenge in terms of workload time estimation: about 23% of jobs are computationally heavy, and 37% of jobs cannot be characterized well in terms of how long they will take (within a 3-minute window at best). In many instances, users request substantial resources and don't actually use the computing time.

In addition to the scheduling challenge, we also identified a company focused on solving related problems such as computational planning and engineering. We spoke with the principal of a little-known company called Microsurgeonbot, Inc. (doing business as MSB.ai), which is developing a tool for setting up computing jobs on supercomputers.

The company is working to remove major obstacles to the accessibility and usability of HPC resources, focusing on a very important problem in HPC: the supercomputer job set-up and skills gap. Its solution, known as "Guru", is poised to make supercomputing much more accessible, especially to engineers in small and medium-sized businesses that lack the resources and expertise of large corporate entities.


Key Topics Covered:

1 Executive Summary

2 Introduction
2.1 Next Generation Computing
2.2 High Performance Computing
2.2.1 HPC Technology
2.2.2 Exascale Computation
2.2.3 High Performance Technical Computing
2.2.4 Market Segmentation Considerations
2.2.5 Regulatory Framework
2.2.6 Value Chain Analysis
2.2.7 AI to Drive HPC Performance and Adoption

3 High Performance Computing Market Dynamics
3.1 HPC Market Drivers
3.2 HPC Market Challenges

4 High Performance Computing Market Analysis and Forecasts
4.1 Global High Performance Computing Market 2021 - 2026
4.1.1 Total High Performance Computing Market
4.1.2 High Performance Computing Market by Component
4.1.3 High Performance Computing Market by Deployment Type
4.1.4 High Performance Computing Market by Organization Size
4.1.5 High Performance Computing Market by Server Price Band
4.1.6 High Performance Computing Market by Application Type
4.1.7 High Performance Computing Deployment Options: Supercomputer vs. Clustering
4.1.8 High Performance Computing as a Service (HPCaaS)
4.1.9 AI Powered High Performance Computing Market
4.2 Regional High Performance Computing Market 2021 - 2026
4.2.1 High Performance Computing Market by Region
4.2.2 North America High Performance Computing Market by Component, Deployment, Organization, Server Price Band, Application, Industry Vertical, and Country
4.2.3 Europe High Performance Computing Market by Component, Deployment, Organization, Server Price Band, Application, Industry Vertical, and Country
4.2.4 APAC High Performance Computing Market by Component, Deployment, Organization, Server Price Band, Application, Industry Vertical, and Country
4.2.5 MEA High Performance Computing Market by Component, Deployment, Organization, Server Price Band, Application, Industry Vertical, and Country
4.2.6 Latin America High Performance Computing Market by Component, Deployment, Organization, Server Price Band, Application, Industry Vertical, and Country
4.2.7 High Performance Computing Market by Top Ten Country
4.3 Exascale Computing Market 2021 - 2026
4.3.1 Exascale Computing Driven HPC Market by Component
4.3.2 Exascale Computing Driven HPC Market by Hardware Type
4.3.3 Exascale Computing Driven HPC Market by Service Type
4.3.4 Exascale Computing Driven HPC Market by Industry Vertical
4.3.5 Exascale Computing as a Service

5 High Performance Computing Company Analysis
5.1 HPC Vendor Ecosystem
5.2 Leading HPC Companies
5.2.1 Amazon Web Services Inc.
5.2.2 Atos SE
5.2.3 Advanced Micro Devices Inc.
5.2.4 Cisco Systems
5.2.5 DELL Technologies Inc.
5.2.6 Fujitsu Ltd
5.2.7 Hewlett Packard Enterprise
5.2.8 IBM Corporation
5.2.9 Intel Corporation
5.2.10 Microsoft Corporation
5.2.11 NEC Corporation
5.2.12 Nvidia
5.2.13 Rackspace Inc.

6 High Performance Computing Market Use Cases
6.1 Fraud Detection in the Financial Industry
6.2 Healthcare and Clinical Research
6.3 Manufacturing
6.4 Energy Exploration and Extraction
6.5 Scientific Research
6.6 Electronic Design Automation
6.7 Government
6.8 Computer Aided Engineering
6.9 Education and Research
6.10 Earth Science

7 Conclusions and Recommendations

8 Appendix: Future of Computing
8.1 Quantum Computing
8.1.1 Quantum Computing Technology
8.1.2 Quantum Computing Considerations
8.1.3 Market Challenges and Opportunities
8.1.4 Recent Developments
8.1.5 Quantum Computing Value Chain
8.1.6 Quantum Computing Applications
8.1.7 Competitive Landscape
8.1.8 Government Investment in Quantum Computing
8.1.9 Quantum Computing Stakeholders by Country
8.1.10 Other Future Computing Technologies
8.1.11 Market Drivers for Future Computing Technologies
8.2 Future Computing Market Challenges
8.2.1 Data Security Concerns in Virtualized and Distributed Cloud
8.2.2 Funding Constrains R&D Activities
8.2.3 Lack of Skilled Professionals across the Sector
8.2.4 Absence of Uniformity among NGC Branches including Data Format

For more information about this report visit https://www.researchandmarkets.com/r/1frr52

Original post:
Insights on the High Performance Computing Global Market to 2026 - Featuring Amazon Web Services, Atos and Advanced Micro Devices Among Others -...