Archive for the ‘Quantum Computing’ Category

Cleveland Clinic Selected as Founding Partner in Greater Washington, D.C., Quantum Computing Hub – Cleveland Clinic Newsroom

Cleveland Clinic has been selected as a founding partner and the leading healthcare system in a new initiative meant to spur collaboration and innovation in the quantum computing industry.

Based in Greater Washington, D.C., Connected DMV and a cross-sector coalition of partners are developing the new Life Sciences and Healthcare Quantum Innovation Hub to prepare the industry for the burgeoning quantum era and align with key national and global efforts in life sciences and quantum technologies.

The U.S. Department of Commerce's Economic Development Administration (EDA) has awarded more than $600,000 to Connected DMV for development of the Hub. This will include the formation of a collaboration of at least 25 organizations specializing in quantum end-use applications and technology development.

Cleveland Clinic was invited to join the Hub because of its work in advancing medical research through quantum computing. As the lead healthcare system in the coalition, Cleveland Clinic will help define quantum's role in the future of healthcare and educate other health systems on its possibilities.

"We believe quantum computing holds great promise for accelerating the pace of scientific discovery," said Lara Jehi, M.D., M.H.C.D.S., Cleveland Clinic's Chief Research Information Officer. "As an academic medical center, research, innovation and education are an integral part of Cleveland Clinic's mission. Quantum, AI and other emerging technologies have the potential to revolutionize medicine, and we look forward to working with partners across healthcare and life sciences to solve complex medical problems and change the course of diseases like cancer, heart conditions and neurodegenerative disorders."

Last year, Cleveland Clinic announced a 10-year partnership with IBM to establish the Discovery Accelerator, a joint center focused on easing traditional bottlenecks in medical research through innovative technologies such as quantum computing, hybrid cloud and artificial intelligence. The partnership pairs Cleveland Clinic's medical expertise with the technology expertise of IBM, including its leadership in quantum technology, which recently resulted in the Breakthrough Prize in Fundamental Physics for quantum information science. The Discovery Accelerator will allow Cleveland Clinic to contribute to Connected DMV's Hub by advancing the pace of discovery with the first private-sector, on-premises Quantum System One being installed on Cleveland Clinic's main campus.

"Innovation is always iterative, and requires sustained collaboration between research, development and technology, and the industries that will benefit from the value generated," said George Thomas, Chief Innovation Officer of Connected DMV and lead of its Potomac Quantum Innovation Center initiative. "Quantum has the potential to have a substantive impact on our society in the near future, and the Life Sciences and Healthcare Quantum Innovation Hub will serve as the foundation for sustained focus and investment to accelerate and scale our path into the era of quantum."

The Hub will be part of Connected DMV's Potomac Quantum Innovation Center initiative, which aims to: accelerate quantum investment, research and development; develop an equitable and scalable talent pipeline; and scale collaboration between the public sector, academia, industry, community, and investors to accelerate the value of quantum. The Quantum Innovation Hubs are the part of this initiative focused on accelerating quantum investment, research and development in key industry sectors.

Original post:
Cleveland Clinic Selected as Founding Partner in Greater Washington, D.C., Quantum Computing Hub - Cleveland Clinic Newsroom

PsiQuantum Has A Goal For Its Million Qubit Photonic Quantum Computer To Outperform Every Supercomputer On The Planet – Forbes


In 2009, Jeremy O'Brien, a professor at the University of Bristol, published a research paper describing how to repurpose on-chip optical components originally developed by the telecom industry to manipulate single particles of light and perform quantum operations.

By 2016, building on the earlier photonic research, O'Brien and three of his academic colleagues, Terry Rudolph, Mark Thompson, and Pete Shadbolt, had founded PsiQuantum.

The founders all believed that the traditional method of building a quantum computer of a useful size would take too long. At the company's inception, the PsiQuantum team established its goal to build a million-qubit, fault-tolerant photonic quantum computer. They also believed the only way to create such a machine was to manufacture it in a semiconductor foundry.

Early alerts

PsiQuantum first popped up on my quantum radar about two years ago when it received $150 million in Series C funding, which brought total investment in the company to $215 million.

That level of funding meant there was serious interest in the potential of whatever quantum device PsiQuantum was building. At that time, PsiQuantum was operating in stealth mode, so little information was available about its research.

Finally, after receiving another $450 million in Series D funding last year, PsiQuantum disclosed additional information about its technology. As recently as a few weeks ago, a comparatively small $25 million US government grant was awarded jointly to PsiQuantum and its fabrication partner, GlobalFoundries, for tooling and further development of its photonic quantum computer. Having GlobalFoundries as a partner is a definite quality signal: GF is a high-quality, premier fab and one of only three tier-one foundries worldwide.

With a current valuation of $3.15 billion, PsiQuantum is following a quantum roadmap paved mainly with stepping stones of its own design: the unique technology, components, and processes needed to build a million-qubit, general-purpose silicon photonic quantum computer.

Technology

Classical computers encode information using digital bits that represent a zero or a one. Quantum computers use quantum bits (qubits), which can also represent a one or a zero, or exist in a quantum superposition, a weighted combination of both values at once. There are a variety of qubit technologies. IBM, Google, and Rigetti use qubits made with small loops of wire that become superconducting at very low temperatures. Quantinuum and IonQ use qubits formed by removing an outer valence electron from an atom of ytterbium to create an ion. Atom Computing makes neutral-atom spin qubits using an isotope of strontium.
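For readers who want to see what "a weighted combination of both values at once" means concretely, here is a minimal sketch in plain Python with NumPy (my own illustration, not any vendor's SDK): a qubit is a two-component complex vector, and measurement probabilities come from the squared magnitudes of its amplitudes.

```python
import numpy as np

# A qubit state |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Here: an equal superposition of 0 and 1.
a = b = 1 / np.sqrt(2)
state = np.array([a, b], dtype=complex)

# Measuring collapses the qubit; the outcome probabilities are the
# squared magnitudes of the amplitudes.
p0, p1 = np.abs(state) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # P(0) = 0.50, P(1) = 0.50
```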

Light is used for various operations in superconducting and atomic quantum computers. PsiQuantum also uses light, turning infinitesimally small photons of light into qubits. Of the two types of photonic qubits, squeezed light and single photons, PsiQuantum's technology of choice is single-photon qubits.

Using photons as qubits is a complex process. Determining the quantum state of a single photon among trillions of photons spanning a range of frequencies and energies is complicated.

Dr. Pete Shadbolt is the Co-founder and Chief Science Officer of PsiQuantum. His responsibilities include overseeing the application and implementation of the technology and scientific policies and procedures that are vital to PsiQuantum's success. After earning his PhD in experimental photonic quantum computing from the University of Bristol in 2014, he was a postdoc at Imperial College researching the theory of photonic quantum computing. While at Bristol, he demonstrated the first-ever Variational Quantum Eigensolver and the first-ever public API to a quantum processor. His awards include being named a 2014 "Rising Star" by the EPSRC, the British research council; the EPSRC Recognizing Inspirational Scientists and Engineers Award; and the European Physics Society Thesis Prize.

Dr. Shadbolt explained that detecting a single photon from a light beam is analogous to collecting a single specified drop of water from the Amazon river's volume at its widest point.

"That process is occurring on a chip the size of a quarter," Dr. Shadbolt said. "Extraordinary engineering and physics are happening inside PsiQuantum chips. We are constantly improving the chips' fidelity and single-photon source performance."

Not just any photon is good enough. There are stringent requirements for photons used as qubits. Consistency and fidelity are critical to the performance of photonic quantum computers. Therefore, each photon source must have high purity and proper brightness, and must generate consistently identical photons.

The right partner

GlobalFoundries facility in Essex, Vermont

When PsiQuantum announced its Series D funding a year ago, the company revealed it had formed a previously undisclosed partnership with GlobalFoundries. Out of public view, the partnership had been able to build a first-of-its-kind manufacturing process for photonic quantum chips. This manufacturing process produces 300-millimeter wafers containing thousands of single photon sources, and a corresponding number of single photon detectors. The wafer also contains interferometers, splitters, and phase shifters. In order to control the photonic chip, advanced electronic CMOS control chips with around 750 million transistors were also built at the GlobalFoundries facility in Dresden, Germany.

Photon advantages

Every qubit technology has its own set of advantages and disadvantages. PsiQuantum chose photons for its quantum computer for several reasons.

A major advantage of photonic qubits worth highlighting is their ability to maintain quantum states for a relatively long time. As an example of light's coherence, despite traveling for billions of years, light emitted by distant stars and galaxies reaches Earth with its original polarization intact. The longer a qubit can maintain its quantum state, the more quantum operations it can perform, which makes the quantum computer more powerful.

Why start with a million qubits?

"We believed we had cracked the code for building a million-qubit quantum computer," Dr. Shadbolt said. "Even though that's a huge number, the secret seemed simple. All we had to do was use the same process as the one being used to put billions of transistors into cell phones. We felt a large quantum computer wouldn't exist in our lifetime unless we figured out how to build it in a semiconductor foundry. That idea has been turned into reality. We are now building quantum chips next to laptop and cell phone chips on the GlobalFoundries 300-millimeter platform."

According to Dr. Shadbolt, PsiQuantum's custom fabrication line has made much progress. Surprisingly, building a million-qubit quantum machine in a foundry raises many of the same non-quantum issues as assembling a classical supercomputer, including chip yields, reliability, high-throughput testing, packaging, and cooling, albeit to cryogenic temperatures.

"From the time that our first GlobalFoundries announcement was made until now, we've produced huge amounts of silicon," Dr. Shadbolt said. "We've done seven tapeouts in total, and we're now seeing hundreds and hundreds of wafers of silicon coming through our door. We are investing heavily in packaging, assembly systems, integration, and fiber attachment to ensure the highest efficiency of light flowing in and out of the chip."

PsiQuantum is performing a great deal of ongoing research as well as continually improving the performance of photonic components and processes. In addition to high-performance optical components, the technologies that enable the process are also very important. A few enablers include optical switches, fiber-to-chip interconnects, and bonding methods.

"We have greatly improved the efficiency of our photon detectors over the last few tapeouts at GlobalFoundries," Dr. Shadbolt explained. "We're constantly working to lose fewer and fewer photons from the system. We have also driven waveguide losses to extremely low levels in our recent chips.

"There is much innovation involved. Our light source for single photons is a good example. We shine laser light directly into the chip to run the single photon sources. The laser is about a trillion times more intense than the single photons we need to detect, so we must attenuate light on that chip by a factor of about a trillion."
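For context, attenuating by "a factor of about a trillion" is routinely expressed in decibels; a quick back-of-envelope conversion (my own arithmetic, using only the figure quoted above):

```python
import math

# Attenuation in decibels: dB = 10 * log10(power ratio).
attenuation_factor = 1e12  # "a factor of about a trillion"
attenuation_db = 10 * math.log10(attenuation_factor)
print(f"{attenuation_factor:.0e}x attenuation = {attenuation_db:.0f} dB")  # 120 dB
```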

Dr. Shadbolt attributes PsiQuantum's manufacturing success to GlobalFoundries. From experience, he knows there is a significant difference between a second-tier foundry and a first-tier foundry like GlobalFoundries. The chips PsiQuantum needs can only be built with an extremely mature manufacturing process.

"PsiQuantum has two demanding requirements. We need a huge number of components, and we need those components to consistently meet extremely demanding performance requirements. There are very few partners in the world who can reliably achieve something like this, and we always knew that partnering with a mature manufacturer like GlobalFoundries would be key to our strategy."

The partnership has also been beneficial for GlobalFoundries because it has gained additional experience with new technologies by adding PsiQuantums photonic processes to the foundry.

The end is in sight

According to Dr. Shadbolt, the original question of whether large numbers of quantum devices could be built in a foundry is no longer an issue, as PsiQuantum's routine output of silicon demonstrates. However, inserting new devices into the manufacturing flow has always been difficult; it is slow and very expensive. Nanowire single-photon detectors are an example of a development that came directly from the university lab and was inserted into the manufacturing process.

PsiQuantum's semiconductor roadmap has only a few remaining items to complete. Since a million qubits won't fit on a single chip, the quantum computer will require multiple quantum processor chips interconnected with optical fibers, with ultra-high-performance optical switches enabling entanglement and teleportation of single-photon states between chips.

"What remains is the optical switch," Dr. Shadbolt said. "You might ask why photonic quantum computing people have never built anything at scale, or why they haven't demonstrated very large entangled states. The reason is that a special optical switch is needed, but none exists. It must have very high performance, better than any existing state-of-the-art optical switch, such as those used for telecom networking. It's a classical device, and its only function will be to route light between waveguides, but it must do so with extremely low loss and at very high speed. It must be a really, really good optical switch."
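To see why "extremely low loss" is non-negotiable, consider that a photon's survival probability falls exponentially with the number of lossy components it traverses. The sketch below uses hypothetical per-switch losses and a hypothetical network depth, purely to illustrate the scaling; none of the numbers come from PsiQuantum.

```python
def survival(loss_per_switch_db: float, n_switches: int) -> float:
    """Fraction of photons surviving n_switches, each with the given loss."""
    transmission = 10 ** (-loss_per_switch_db / 10)  # per-switch transmission
    return transmission ** n_switches

# Hypothetical: a photon crossing 100 switches on its way between chips.
for loss_db in (0.5, 0.1, 0.01):
    print(f"{loss_db} dB/switch x 100 switches -> "
          f"{survival(loss_db, 100):.3%} survive")
# 0.5 dB -> 0.001%, 0.1 dB -> 10%, 0.01 dB -> ~79.4%
```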

If it can't be bought, then it must be built

Implementing an optical switch with the right specs is a success-or-fail item for PsiQuantum. Since no commercial optical switch exists that fits the application's needs, PsiQuantum was left with no choice but to build one. For the past few years, its management has been investing heavily in developing a very high-performance optical switch.

Dr. Shadbolt explained: "I believe this is one of the most exciting things PsiQuantum is doing. Building an extremely high-performance optical switch is the next biggest thing on our roadmap. We believe it is the key to unlocking the huge promise of optical quantum computing."

Summary

PsiQuantum was founded on the belief that photonics was the right technology for building a fault-tolerant quantum machine with a million qubits, and that the proper approach was based on semiconductor manufacturing. In contrast to the makers of NISQ quantum computers, the founders wanted to avoid building incrementally larger and larger machines over time.

Considering the overall process needed to build a million-qubit quantum computer, its high degree of complexity, and the lack of proven tools and processes to do it with, PsiQuantum has made amazing progress since the company was founded.

It established a true partnership with one of the best foundries in the world, produced seven tapeouts, and funded a half-dozen new tools to build a first-of-its-kind wafer manufacturing process that incorporates superconducting single-photon detectors into a regular silicon-photonic chip.

And today, it is answering yet another challenge by building an optical switch to fill a void where the needed product doesn't exist.

It is no surprise that an ultra-high-performance optical switch is a key part of PsiQuantum's plans to build a scalable million-qubit quantum computer. Other quantum companies are also planning to integrate similar optical switching technology to scale modular QPU architectures within the decade. The high-performance optical switch PsiQuantum is developing could someday connect tens of thousands of quantum processing units in a future multi-million-qubit quantum data center. As a standalone product, it could also be a source of additional revenue, should PsiQuantum choose to market it.

Once the optical switch has been built, it will need to be integrated into the GlobalFoundries manufacturing flow. That is the last step needed to complete PsiQuantum's foundry assembly process; then it will be ready to produce photonic quantum computer chips.

But even with a complete end-to-end manufacturing process, significantly more time will be needed to construct a full-blown fault-tolerant quantum computer. It will remain for PsiQuantum to build complete quantum computers around the chips produced by GlobalFoundries. For that, it will need a trained workforce, plus a location and infrastructure where large photonic quantum computers can be assembled, integrated, tested, and distributed.

Based on the amount of post-foundry work, development of the optical switch, and assembly that remains, and assuming no major technology problems or delays occur, I believe it will be after mid-decade before a photonic quantum computer of any scale can be offered by PsiQuantum.

I'll wrap this up with comments made by Dr. Shadbolt during our discussion about the optical switch. I believe they demonstrate why PsiQuantum has been, and will continue to be, successful:

"Even though the optical switch will obviously be a very powerful generic technology of interest to others, we are not interested in its generic usefulness. We are only interested in the fact that it will allow us to build a quantum computer that outperforms every supercomputer on the planet. That is our singular goal."

Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence and space at Moor Insights & Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 88, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign,TE Connectivity, TensTorrent, Tobii Technology, Teradata,T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.


Here is the original post:
PsiQuantum Has A Goal For Its Million Qubit Photonic Quantum Computer To Outperform Every Supercomputer On The Planet - Forbes

ST Explains: How will quantum computing contribute to vaccine, EV development? – The Straits Times

SINGAPORE - Singapore is stepping up its investments in quantum computing.

Chiefly, it will set up a foundry to develop the components and materials needed to build quantum computers, with the aim of establishing an ecosystem of activities in the emerging field.

Singapore will also join a handful of nations - the United States, China, France, Finland, Germany, South Korea and Japan - in building its own quantum computer to gain first-hand experience with the technology.

The Straits Times explains what quantum computing is, and the benefits the technology brings.

Quantum computing is similar to traditional computing, but it operates at the far cooler temperature of nearly absolute zero, minus 273.15 deg C, the temperature at which a thermodynamic system has its lowest energy.

Under layers of casing and cryogenic components to attain this super-cool state - colder than in outer space - quantum objects (an electron or a particle of light) are manipulated to execute complex mathematical calculations out of reach of traditional computers.

Traditional computers store information as either 0s or 1s. Quantum computers, on the other hand, use quantum bits (or qubits) to represent and store information in a complex mix of 0s and 1s simultaneously. As the number of qubits grows, a quantum computer becomes exponentially more powerful.
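One way to make that exponential claim concrete: simulating n qubits classically means tracking 2^n complex amplitudes. A short illustration (straightforward arithmetic, not a figure from the article):

```python
# Each added qubit doubles the number of complex amplitudes a classical
# machine must track, so simulation costs explode quickly.
for n_qubits in (30, 40, 50):
    amplitudes = 2 ** n_qubits
    memory_gib = amplitudes * 16 / 2**30  # 16 bytes per complex amplitude
    print(f"{n_qubits} qubits: {amplitudes:.2e} amplitudes, ~{memory_gib:,.0f} GiB")
# 30 qubits fit in a laptop's RAM; 50 qubits need ~16 million GiB.
```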

Quantum computing's long development history dates back to the 1970s, when the late American physicist Paul Anthony Benioff demonstrated the theoretical possibility of quantum computers.

By harnessing quantum physics, quantum computing has the potential to comb vast numbers of possibilities in hours and pinpoint a probable solution. It would take a traditional computer hundreds of thousands of years to perform a similar task.
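The article's comparison is generic, but one well-established example of where such speedups come from is Grover's search algorithm, which finds a marked item among N possibilities in roughly the square root of N quantum queries instead of roughly N classical ones. A hypothetical back-of-envelope comparison:

```python
import math

# Unstructured search over N possibilities (N chosen arbitrarily here):
# classical search needs ~N/2 queries on average, while Grover's
# algorithm needs ~(pi/4) * sqrt(N) quantum queries.
N = 10**15
classical = N / 2
quantum = (math.pi / 4) * math.sqrt(N)
print(f"classical: ~{classical:.1e} queries, quantum: ~{quantum:.1e} queries")
```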

Japan's first prototype quantum computer, unveiled in 2017, could make complex calculations 100 times faster than a conventional supercomputer.

Google's quantum computer created in 2019 could perform in 200 seconds a computation that would take the world's fastest supercomputers about 10,000 years.

A year later, in 2020, a team at the University of Science and Technology of China assembled a quantum computer that could perform in 200 seconds a calculation that an ordinary supercomputer would have taken 2.5 billion years to complete.

But none of these machines was given practical tasks.

See more here:
ST Explains: How will quantum computing contribute to vaccine, EV development? - The Straits Times

Quantum Computing Market Growth Status, Business Prospects, and Forecast 2020-2025 The Colby Echo News – The Colby Echo News

Request To Download Sample of This Strategic Report: https://www.astuteanalytica.com/request-sample/quantum-computing-market

The study covers a detailed segmentation of the quantum computing market, along with country analysis, key information, and a competitive outlook. The report mentions the company profiles of key players that are currently dominating the quantum computing market, wherein various development, expansion, and winning strategies practiced and executed by leading players have been presented in detail.

Key Questions Answered in this Report on Quantum Computing Market

The report provides detailed information about the quantum computing market on the basis of comprehensive research on various factors that play a key role in accelerating the growth potential of the market. Information mentioned in the report answers path-breaking questions for companies that are currently functioning in the market and are looking for innovative ways to create a unique benchmark in the quantum computing market, so as to help them make successful strategies and take target-driven decisions.

Download Sample Report, SPECIAL OFFER (avail an up-to-30% discount on this report): https://www.astuteanalytica.com/request-sample/quantum-computing-market

Research Methodology Quantum Computing Market

The research methodology adopted by analysts to compile the quantum computing market report is based on detailed primary as well as secondary research. With the help of in-depth insights of the industry-affiliated information that is obtained and legitimated by market-admissible sources, analysts have offered riveting observations and authentic forecasts of the quantum computing market.

During the primary research phase, analysts interviewed industry stakeholders, investors, brand managers, vice presidents, and sales and marketing managers. On the basis of data obtained through the interviews of genuine sources, analysts have emphasized the changing scenario of the quantum computing market.

For secondary research, analysts scrutinized numerous annual report publications, white papers, industry association publications, and company websites to obtain the necessary understanding of the quantum computing market.

Request Full Report: https://www.astuteanalytica.com/request-sample/quantum-computing-market

About Astute Analytica:

Astute Analytica is a global analytics and advisory company that has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves in generating unparalleled, in-depth, and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum including technology, healthcare, chemicals, semiconductors, FMCG, and many more. These happy customers come to us from all across the globe.

They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting the fierce challenges, all because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates, and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent, and experienced team of professionals comprising business analysts, economists, consultants, and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Get in touch with us:

Phone number: +18884296757

Email: sales@astuteanalytica.com

Visit our website: https://www.astuteanalytica.com/

Visit link:
Quantum Computing Market Growth Status, Business Prospects, and Forecast 2020-2025 The Colby Echo News - The Colby Echo News

Premises and potentates for cloud in 2022 ERP Today – ERP Today

Quantum computing, AI, DAOs and the metaverse make for an interesting cloud race after COVID-19

Two years ago, when I last wrote about the state of the cloud race for ERP Today, the landscape was characterised by the amount of capital that enterprises needed to save, which led to more workloads being driven to the cloud. Converting CAPEX into OPEX has proved to be a winning strategy for enterprises, enabling them to drive enterprise acceleration.

But since then, we have seen the era of infinite computing begin, meaning enterprises now need to build next generation applications to stay in the game. Meanwhile, the war in Ukraine, COVID-19, rising interest rates and fears of an upcoming recession haven't increased the appetite of boards to put capital into data centres, not when the cloud offers a workable, viable and, in almost all cases, safe opportunity for enterprises to operate workloads in the 21st century.

Let's look at the state of play reached in the last two years, and the next-gen disruptions that are acting as game changers in cloud.

COVID-19 boosted the cloud

Uncertainty is a strong impetus to review the status quo. As many an enterprise struggled to both redefine itself and operate successfully under pandemic circumstances, the ability to tie operating costs to business performance has proved to be both highly desirable and critical. Moreover, board members who took a more cavalier approach towards the IT conversation on moving to the cloud have become keen experts on CAPEX allocation in the last few years, and in many cases had their eureka moment when it came to IT costs.

The results have driven a harder push to move workloads to the cloud than ever before. Not only did COVID-19 boost (pun intended) the move to the cloud; so did the Great Re-Assessment of employees' attitudes towards working life. When people work from home, switch jobs more often and redefine their work/life balance, it becomes even more of a challenge to fill IT operations jobs to run on-premise data centres. Here is what CxOs are doing, or should be doing, to move to the cloud in these changing times:

Talk to your apps vendor. It isn't only enterprises that have to move to the cloud; application vendors must as well. Obviously they will be keen to keep their customers, and will offer ways to move an enterprise to the cloud.

CxOs should evaluate these offerings and consider taking advantage of them if they see their enterprise working long-term with its current business application vendor.

Select SaaS vendors that use IaaS vendors. As I laid out in my article two years ago, it matters greatly who is responsible for carrying CAPEX. It should not be an enterprise's future SaaS vendor, as they would be better off investing in their software rather than infrastructure.

CxOs therefore need to ask their SaaS vendor whether they run on another IaaS vendor, because if they do not, then they can't offer commercial elasticity, which is what the move to the cloud is all about: pay less when you use less, and pay more for IT only when you use more IT (a dynamic sketched in the example after these tips).

Start building in the cloud. Software is eating the world, and enterprises must create next generation applications to differentiate themselves and operate new digital business models.

Suffice to say that CxOs need to build these applications in the cloud, with modern tools and with cloud-inherent economic mechanics, as this is the only way to avoid creating bigger CAPEX challenges and higher migration loads to the cloud in the long run.
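As a toy model of the commercial elasticity described above, compare a fixed on-premise cost sized for peak demand with a usage-based cloud bill; all figures below are invented purely for illustration.

```python
# Hypothetical monthly demand (arbitrary units) and hypothetical prices.
monthly_load = [100, 40, 20, 80, 150, 60]
onprem_monthly_cost = 1200   # fixed CAPEX-style cost, sized for the 150 peak
cloud_cost_per_unit = 8      # OPEX: pay only for what is used

onprem_total = onprem_monthly_cost * len(monthly_load)
cloud_total = sum(load * cloud_cost_per_unit for load in monthly_load)
print(f"on-premise: {onprem_total}, cloud: {cloud_total}")
# on-premise: 7200, cloud: 3600 -- elasticity pays off when average
# utilisation sits well below peak capacity.
```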


The AI imperative forces the enterprise into the cloud

A lot has been said about the impact of artificial intelligence (AI) on the enterprise. The long wait-and-anticipation phase is ending, and the benefits are becoming real, so real that enterprises which don't take advantage of AI will struggle to remain relevant towards the end of this decade.

The cloud is essential to fuel the move to AI, as it enables infinite insights: the unlimited, economical storage of all things digital in the enterprise without knowing in advance what the queries against that data will be, fuelled, in short, by Hadoop-style scalable data technologies.

AI also depends on infinite compute, the ability to ramp computing infrastructure up and down to fuel AI processes at the volume enterprises need.

Quantum computing is becoming increasingly relevant

The next computing paradigm that will be relevant for the enterprise is quantum computing. The technology has matured fast, from a pure speeds-and-feeds phase only two years ago to the first commercial use cases becoming available in spring 2022.

Quantum computing will be the first computing platform that enterprises will not adopt on-premise (except for some government use cases, of course, and perhaps deep-pocketed banks and pharma enterprises) but from the cloud. CxOs who want their enterprise to be able to take advantage of quantum computing had better move their enterprise to the cloud, starting with data.

Deep learning is looming on the horizon

Deep learning is the ability of software to learn from data and then determine, and even automate, the right course of action. It will be key for enterprises competing to give their employees an attractive workplace with a compelling work/life balance, and the automation for that can only come from deep learning.

To be ready for deep learning, CxOs need to move workloads and data to the cloud so their enterprise can take advantage of deep learning capabilities and thrive in the marketplace with both its employees and customers.

DAOs are powered by the cloud

Decentralised autonomous organisations (DAOs) operate in the cloud, as they need to minimise CAPEX and need both the architectural and commercial economics only the cloud offers. And while the decentralised approach seems alien to the traditional enterprise, innovative CxOs will make sure that their enterprise can take advantage of both the talent and capital currently flowing into DAOs.

As the decentralisation trend will only get stronger, it is even more relevant to have both an enterprise's data and processes in the cloud so that it can take advantage of DAO dynamics powered by the blockchain.

The metaverse will transform all business

The metaverse merits a whole article by itself, and while we know very little about it, with it likely to be the last of these mega trends to materialise, we know one thing for sure already: the metaverse runs in the cloud. So, to hedge and be ready whenever the metaverse fully lands, CxOs need to make sure their enterprise runs in the cloud.


Handicapping the Big Three and a not-so-young newcomer

The rich got richer in the cloud these last few years. As such, Amazon's AWS, Google's Google Cloud and Microsoft's Azure businesses are bigger and more relevant than ever before. The newcomer to the game is Oracle's Oracle Cloud, which has earned its spot at the table by building one of the most enterprise-friendly public clouds out there.

As you already know, when understanding technology vendors, it is always good to look at their organisational DNA alongside pure capabilities:

Amazon AWS leads and keeps leading. Nothing has changed at AWS: Amazon remains an e-retailer, and AWS is its IT platform. The cloud company's CEO Andy Jassy was so successful that Amazon founder Jeff Bezos recently handed him the keys to the whole of the business. And while Amazon struggles with overcapacity, AWS is doing just fine, partially benefitting from the extra capacity that makes for extra ammunition in the war for cloud leadership.

AWS also holds a special spot in the hearts of developers, an aspect that CxOs cannot overlook, as people continue to build software and their preferences remain crucial. More and more, we see AWS using its overall expertise in supply chain management, logistics and warehousing automation to appeal to customers. Amazon has massive internal and organic load from its e-commerce business, which gives its cloud a lot of internal scale.

Google Cloud and the vertical twist. In spring 2019, newly minted Google Cloud CEO Thomas Kurian started the vertical cloud promise, whereby enterprises receive more value from a vertical cloud that knows and automates specific industry aspects. With that he took the initiative, and the rest of the industry was forced to react. This created a veritable additional differentiator for Google Cloud, which had relied a lot (perhaps too much) on scale and AI.

For AI, Google Cloud is maintaining its two-to-three-year lead over AWS and Azure for algorithms on custom silicon. Google has the inherent advantage that its cloud needs to scale to premier performance requirements to keep its internal use cases around advertising, natural language processing, YouTube and so on going.

The result is a premier cloud infrastructure, which includes networking, and a setup where Google boasts its own cables. At the same time, Google needs to take market share from AWS and Azure and is therefore very competitively priced, to the point of Google Cloud continuously posting losses for parent business Alphabet, as seen once more in the company's most recent quarterly earnings.

The Microsoft metamorphosis. After almost a decade, Microsoft has completed its transformation into a cloud vendor. Its CEO, Satya Nadella, came from the Azure business and led Microsoft into its cloud metamorphosis.

Compared with AWS and Google, Redmond understands the enterprise better and has decade-long ties into the IT organisation of practically every enterprise on the planet. More importantly, almost all enterprises have some sort of contractual agreement with Microsoft to boot, concerning at least Office and often more of the Microsoft stack. Compared with its two key competitors, Microsoft has only a little in-house organic load (e.g. advertising, Xbox etc.). With its focus on Office, Microsoft is also more advanced and active concerning data privacy and data residency, which comes part and parcel with the nature of the Office business containing sensitive data.

Good ol' newbie, Oracle

Oracle has been a long-time partner of enterprise IT, carrying the old-guard label against competitors like AWS. And while it looked for a long time as though the company would join the extensive list of former key IT partners that did not manage to move to the cloud, things have looked up for Oracle in the last five years, particularly the last two.

Despite past and ongoing attempts to replace Oracle's database, its competitors have made little to no inroads. In the meantime, Oracle has built out its second-generation cloud, which is the optimum platform to run the Oracle Database. To serve customers, Oracle and Microsoft recently even announced a DBaaS offering of Oracle Database running in Azure. This is a good example of the more IT-centric cloud vendors realising that the customer comes first, regardless of past animosities or competition.

In contrast to the other three competitors, Oracle has no organic load for its cloud (its SaaS apps being the notable exception), something it must make up with large deals (e.g. Zoom, TikTok etc.) while it waits for customers to move workloads to Oracle Cloud. Because Oracle is a software company, Oracle Cloud is margin-dilutive for it, as the equivalent businesses are for Google and Microsoft. Nonetheless, Oracle is investing massively, with its recent quarter showing the company's largest CAPEX expense to date.

The takeaways

CxOs should understand the core differences between these four cloud infrastructure vendors, all the way down to their organisational DNA, which remains immutable and cannot be changed by marketing and products.

All four vendors bring distinctive value propositions to enterprises. AWS is, and for the near future will remain, the preferred platform for developers, and, outside of the AI/ML space where it is catching up, it has done very well for itself.

Microsoft works well with IT, and most enterprises use the vendor already. Google will be ideal for enterprises that want to bet on the AI/ML strategy early. Oracle is a new player, definitely relevant for existing Oracle customers, but also for non-Oracle customers, as it has the most enterprise-friendly cloud management.

It will be interesting to see how the companies handle disruption along the lines of quantum computing and the metaverse. Hopefully with their services, your enterprise will ride out the storm with aplomb.

Continue reading here:
Premises and potentates for cloud in 2022 ERP Today - ERP Today