Archive for the ‘Quantum Computing’ Category

ST Explains: How will quantum computing contribute to vaccine, EV development? – The Straits Times

SINGAPORE - Singapore is stepping up its investments in quantum computing.

Chiefly, it will set up a foundry to develop the components and materials needed to build quantum computers, with the aim of establishing an ecosystem of activities in the emerging field.

Singapore will also join a handful of nations - the United States, China, France, Finland, Germany, South Korea and Japan - in building its own quantum computer to gain first-hand experience with the technology.

The Straits Times explains what quantum computing is, and the benefits the technology brings.

Quantum computing is similar to traditional computing, but it operates at temperatures near absolute zero - minus 273.15 deg C, the point at which a thermodynamic system has its lowest energy.

Under layers of casing and cryogenic components that maintain this super-cool state - colder than outer space - quantum objects (such as an electron or a particle of light) are manipulated to execute complex mathematical calculations beyond the reach of traditional computers.

Traditional computers store information as either 0s or 1s. Quantum computers, on the other hand, use quantum bits (or qubits), which can represent and store 0 and 1 simultaneously in a superposition. As the number of qubits grows, a quantum computer becomes exponentially more powerful.
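
As a rough illustration of that exponential growth, here is a minimal Python sketch (ours, not from the article): the state of n qubits is described by 2^n amplitudes, so each added qubit doubles what a classical machine would have to track.

```python
import numpy as np

def uniform_superposition(num_qubits: int) -> np.ndarray:
    """Return the state vector of n qubits, each in an equal mix of 0 and 1."""
    dim = 2 ** num_qubits                   # the state space doubles per qubit
    return np.full(dim, 1 / np.sqrt(dim))   # equal amplitude on every basis state

for n in (1, 10, 20, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes to track classically")
```

With 30 qubits there are already over a billion amplitudes, which is why simulating even modest quantum computers strains conventional hardware.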

Quantum computing's long development history dates back to the 1970s, when the late American physicist Paul Anthony Benioff demonstrated the theoretical possibility of quantum computers.

By harnessing quantum physics, quantum computing has the potential to comb vast numbers of possibilities in hours and pinpoint a probable solution. It would take a traditional computer hundreds of thousands of years to perform a similar task.
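
One well-known example of such a speedup, offered here as our own hedged illustration rather than anything cited in the article, is Grover's search algorithm, which needs only on the order of the square root of the number of candidates, versus a classical brute-force scan:

```python
import math

def classical_queries(n_items: int) -> float:
    # A brute-force search inspects N/2 items on average.
    return n_items / 2

def grover_queries(n_items: int) -> float:
    # Grover's algorithm needs roughly (pi/4) * sqrt(N) oracle queries.
    return (math.pi / 4) * math.sqrt(n_items)

N = 2 ** 60  # a search space far beyond classical brute force
print(f"classical: ~{classical_queries(N):.2e} queries")
print(f"grover:    ~{grover_queries(N):.2e} queries")
```

At this scale the classical count is astronomical while the quantum count, though still huge, falls within conceivable reach - the intuition behind claims like the one above.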

Japan's first prototype quantum computer, unveiled in 2017, could make complex calculations 100 times faster than a conventional supercomputer.

Google's quantum computer created in 2019 could perform in 200 seconds a computation that would take the world's fastest supercomputers about 10,000 years.

A year later, in 2020, a team at the University of Science and Technology of China assembled a quantum computer that could perform in 200 seconds a calculation that an ordinary supercomputer would have taken 2.5 billion years to complete.

But none of these machines was given practical tasks.

See more here:
ST Explains: How will quantum computing contribute to vaccine, EV development? - The Straits Times

Quantum Computing Market Growth Status, Business Prospects, and Forecast 2020-2025 – The Colby Echo News

Request To Download Sample of This Strategic Report: https://www.astuteanalytica.com/request-sample/quantum-computing-market

The study covers a detailed segmentation of the quantum computing market, along with country analysis, key information, and a competitive outlook. The report profiles the key players currently dominating the quantum computing market, presenting in detail the development, expansion, and winning strategies those players have practiced and executed.

Key Questions Answered in this Report on Quantum Computing Market

The report provides detailed information about the quantum computing market based on comprehensive research into the factors that accelerate the market's growth potential. It answers key questions for companies already operating in the market that are looking for innovative ways to set a unique benchmark, helping them craft successful strategies and make target-driven decisions.

Download Sample Report, SPECIAL OFFER (avail up to a 30% discount on this report): https://www.astuteanalytica.com/request-sample/quantum-computing-market

Research Methodology: Quantum Computing Market

The research methodology adopted by analysts to compile the quantum computing market report is based on detailed primary as well as secondary research. With the help of in-depth industry information obtained and validated through credible market sources, analysts have offered detailed observations and authentic forecasts of the quantum computing market.

During the primary research phase, analysts interviewed industry stakeholders, investors, brand managers, vice presidents, and sales and marketing managers. Based on the data obtained through these interviews, analysts have highlighted the changing landscape of the quantum computing market.

For secondary research, analysts scrutinized numerous annual report publications, white papers, industry association publications, and company websites to obtain the necessary understanding of the quantum computing market.

Request Full Report: https://www.astuteanalytica.com/request-sample/quantum-computing-market

About Astute Analytica:

Astute Analytica is a global analytics and advisory company that has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves on generating unparalleled, in-depth, and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum of industries, including technology, healthcare, chemicals, semiconductors, FMCG, and many more. These happy customers come to us from all across the globe.

They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting fierce challenges, because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates, and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent, and experienced team of professionals comprising business analysts, economists, consultants, and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Get in touch with us:

Phone number: +1 888 429 6757

Email: sales@astuteanalytica.com

Visit our website: https://www.astuteanalytica.com/

Visit link:
Quantum Computing Market Growth Status, Business Prospects, and Forecast 2020-2025 - The Colby Echo News

Premises and potentates for cloud in 2022 – ERP Today

Quantum computing, AI, DAOs and the metaverse make for an interesting cloud race after COVID-19

Two years ago, when I last wrote about the state of the cloud race for ERP Today, the landscape was characterised by the amount of capital that enterprises needed to save, which led to more workloads being driven to the cloud. Converting CAPEX into OPEX has proved to be a winning strategy for enterprises, enabling them to drive enterprise acceleration.

But since then, we have seen the era of infinite computing begin, meaning enterprises now need to build next-generation applications to stay in the game. Meanwhile, the war in Ukraine, COVID-19, rising interest rates and fears of an upcoming recession haven't increased the appetite of boards to put capital into data centres, not when the cloud offers a workable, viable and, in almost all cases, safe way for enterprises to operate workloads in the 21st century.

Let's look at the state of play reached in the last two years, and the next-gen disruptions that are acting as game changers in cloud.

COVID-19 boosted the cloud

Uncertainty is a strong impetus to review the status quo. As many an enterprise struggled to both redefine itself and operate successfully under pandemic circumstances, the ability to tie operating costs to business performance has proved to be both highly desirable and critical. Moreover, board members who took a more cavalier approach towards the IT conversation on moving to the cloud have become keen experts on CAPEX allocation in the last few years, and in many cases had their eureka moment when it came to IT costs.

The results have driven a harder push to move workloads to the cloud than ever before. It wasn't only COVID-19 that boosted (pun intended) the move to the cloud; so did the Great Re-Assessment of employees' attitudes towards working life. When people work from home, switch jobs more often and redefine their work/life balance, it becomes even more of a challenge to fill the IT operations jobs needed to run on-premise data centres. Here is what CxOs are doing, or should be doing, to move to the cloud in these changing times:

Talk to your apps vendor. It isn't only enterprises that have to move to the cloud; application vendors must as well. Obviously they will be keen to keep their customers, and will offer ways to move an enterprise to the cloud.

CxOs should evaluate these offerings and consider taking advantage of them if they see their enterprise working long-term with its current business application vendor.

Select SaaS vendors that use IaaS vendors. As I laid out in my article two years ago, it matters greatly who is responsible for carrying CAPEX. It should not be an enterprise's future SaaS vendor, which is better off investing in its software rather than in infrastructure.

CxOs therefore need to ask whether their SaaS vendor runs on another IaaS vendor, because if it does not, it can't offer commercial elasticity, which is what the move to the cloud is all about: pay less when you use less, and pay more for IT only when you use more IT.
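
To make commercial elasticity concrete, here is a toy cost model (our illustration; the figures are invented, not from ERP Today): a fixed on-premise cost is paid regardless of usage, while elastic cloud pricing scales with consumption.

```python
# Invented numbers for illustration only.
ONPREM_MONTHLY = 100_000      # fixed data-centre cost, paid regardless of usage
CLOUD_RATE_PER_UNIT = 1.25    # pay-per-use cloud rate

for units_used in (20_000, 80_000, 140_000):
    cloud_cost = units_used * CLOUD_RATE_PER_UNIT
    cheaper = "cloud" if cloud_cost < ONPREM_MONTHLY else "on-premise"
    print(f"{units_used:>7,} units: cloud ${cloud_cost:>9,.0f} "
          f"vs on-premise ${ONPREM_MONTHLY:,} -> {cheaper} wins")
```

In lean months the elastic model costs a fraction of the fixed one; only at sustained high utilisation does owning capacity pay off.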

Start building in the cloud. Software is eating the world, and enterprises must create next generation applications to differentiate themselves and operate new digital business models.

Suffice it to say that CxOs need to build these applications in the cloud, with modern tools and with cloud-inherent economic mechanics, as this is the only way to avoid creating bigger CAPEX challenges and heavier migration loads to the cloud in the long run.

The AI imperative forces the enterprise into the cloud

A lot has been said about the impact of artificial intelligence (AI) on the enterprise. The long wait-and-anticipation phase is ending, and the benefits are becoming real - so real that enterprises which don't take advantage of AI will struggle to remain relevant towards the end of this decade.

The cloud is essential to fuel the move to AI, as it enables infinite insights: the unlimited, economical storage of all things digital in the enterprise, without knowing in advance what queries will be run against the data - in short, fuelled by Hadoop-style data-scalable technologies.

AI also fosters infinite compute: the ability to ramp computing infrastructure up and down to fuel AI processes at whatever volume enterprises need.

Quantum computing is becoming increasingly relevant

The next-generation computing paradigm that will be relevant for the enterprise is quantum computing. The technology has matured fast, from a pure "speeds and feeds" phase only two years ago to the first commercial use cases becoming available in spring 2022.

Quantum computing will be the first computing platform that enterprises will adopt not on-premise (except for some government use cases, of course, and perhaps deep-pocketed banks and pharma enterprises) but from the cloud. CxOs who want their enterprise to be able to take advantage of quantum computing had better move it to the cloud, starting with data.

Deep learning is deep, looming on the horizon

Deep learning is the ability of software to learn from data and then determine, and even automate, the right course of action. It will be key for enterprises competing to give their employees an attractive workplace with a compelling work/life balance, because the automation that requires can only come from deep learning.

To be ready for deep learning, CxOs need to move workloads and data to the cloud so their enterprise can take advantage of deep learning capabilities and thrive in the marketplace with both its employees and customers.

DAOs are powered by the cloud

Decentralised autonomous organisations (DAOs) operate in the cloud, as they need to minimise CAPEX and need both the architectural and the commercial economics only the cloud offers. And while the decentralised approach may seem alien to the traditional enterprise, innovative CxOs will make sure that their enterprise can take advantage of both the talent and the capital currently flowing into DAOs.

As the decentralisation trend will only get stronger, it is all the more relevant to have both an enterprise's data and its processes in the cloud, so that it can take advantage of DAO dynamics powered by the blockchain.

The metaverse will transform all business

The metaverse merits a whole article by itself. While we know very little about it, and it is likely to be the last of these mega trends to materialise, we already know one thing for sure: the metaverse runs in the cloud. So, to hedge and be ready whenever the metaverse fully lands, CxOs need to make sure their enterprise runs in the cloud.

Handicapping the Big Three and a not-so-young newcomer

The rich got richer in the cloud these last few years. As such, Amazon's AWS, Google's Google Cloud and Microsoft's Azure businesses are bigger and more relevant than ever before. The newcomer to the game is Oracle's Oracle Cloud, which has earned its spot at the table by building one of the most enterprise-friendly public clouds out there.

As you already know, when understanding technology vendors, it is always good to look at their organisational DNA alongside pure capabilities:

Amazon AWS leads and keeps leading. Nothing has changed at AWS: Amazon remains an e-retailer, and AWS is its IT platform. The cloud company's CEO Andy Jassy was so successful that Amazon founder Jeff Bezos recently handed him the keys to the whole of the business. And while Amazon struggles with overcapacity, AWS is doing just fine, partially benefitting from the extra capacity that makes for extra ammunition in the war for cloud leadership.

AWS also holds a special spot in the hearts of developers, an aspect that CxOs cannot overlook as long as people continue to build software and their preferences remain crucial. More and more, we see AWS using its overall expertise in supply chain management, logistics and warehousing automation to appeal to customers. Amazon has massive internal, organic load from its e-commerce business, which gives its cloud a lot of internal scale.

Google Cloud and the vertical twist. In spring 2019, newly minted Google Cloud CEO Thomas Kurian started the vertical cloud promise, whereby enterprises receive more value from a vertical cloud that knows and automates specific industry aspects. With that he took the initiative, and the rest of the industry was forced to react. This created a veritable additional differentiator for Google Cloud, which had previously relied a lot (perhaps too much) on scale and AI.

For AI, Google Cloud is maintaining its two-to-three-year lead over AWS and Azure in algorithms on custom silicon. Google has an inherent advantage in that its cloud needs to scale to premier performance requirements to keep its internal use cases - advertising, natural language processing, YouTube and so on - going.

The result is a premier cloud infrastructure, including networking, with a setup where Google boasts its own cables. At the same time, Google needs to take market share from AWS and Azure and is therefore very competitively priced, to the point that Google Cloud continually posts losses for parent company Alphabet, as seen once more in the company's most recent quarterly earnings.

The Microsoft metamorphosis. After almost a decade, Microsoft has completed its transformation into a cloud vendor. Its CEO, Satya Nadella, came from the Azure business and led Microsoft into its cloud metamorphosis.

Compared with AWS and Google, Redmond understands the enterprise better, and has decades-long ties into the IT organisation of practically every enterprise on the planet. More importantly, almost all enterprises have some sort of contractual agreement with Microsoft to boot, concerning at least Office and often more of the Microsoft stack. Compared with its two key competitors, Microsoft has only a little in-house organic load (e.g. advertising, Xbox etc.). With its focus on Office, Microsoft is also more advanced and active concerning data privacy and data residency, which comes part and parcel with the Office business containing sensitive data.

Good ol' newbie, Oracle

Oracle has been a long-time partner of enterprise IT, carrying the old-guard label against competitors like AWS. And while it looked for a long time as if the company would join the extensive list of former key IT partners that did not manage to move to the cloud, things have looked up for Oracle in the last five years, and particularly in the last two.

Despite past and ongoing attempts to replace Oracle's database, its competitors have made little to no inroads. In the meantime, Oracle has built out its second-generation cloud, which is the optimum platform to run the Oracle Database. To serve customers, Oracle and Microsoft recently even announced a DBaaS service of Oracle Database running in Azure. This is a good example of the more IT-centric cloud vendors realising that the customer comes first, regardless of past animosities or competition.

In contrast to the other three competitors, Oracle has no organic load for its cloud (its SaaS apps being the notable exception), something it must make up for with large deals (e.g. Zoom, TikTok etc.) while it waits for customers to move workloads to Oracle Cloud. Because Oracle is a software company, Oracle Cloud is margin-dilutive for it, as the equivalents are for Google and Microsoft. Nonetheless, Oracle is investing massively, with its recent quarter showing the company's largest CAPEX expense to date.

The takeaways

CxOs should understand the core differences between these four cloud infrastructure vendors, all the way down to their organisational DNA, which remains immutable regardless of marketing and products.

All vendors bring distinctive value propositions to enterprises. AWS is, and for the near future will remain, developers' preferred platform, and outside of the AI/ML space, where it is catching up, it has done very well for itself.

Microsoft works well with IT, and most enterprises use the vendor already. Google will be ideal for enterprises that want to bet on an AI/ML strategy early. Oracle is a new player, definitely relevant for existing Oracle customers but also for non-Oracle customers, as it has the most enterprise-friendly cloud management.

It will be interesting to see how the companies handle disruption along the lines of quantum computing and the metaverse. Hopefully with their services, your enterprise will ride out the storm with aplomb.

Continue reading here:
Premises and potentates for cloud in 2022 - ERP Today

Quantum Computing’s Impact Could Come Sooner Than You Think – CNET

In 2013, Rigetti Computing began its push to make quantum computers. That effort could bear serious fruit starting in 2023, the company said Friday.

That's because next year, the Berkeley, California-based company plans to deliver both its fourth-generation machine, called Ankaa, and an expanded model called Lyra. The company hopes those machines will usher in "quantum advantage," when the radically different machines mature into devices that actually deliver results out of the reach of conventional computers, said Rigetti founder and Chief Executive Chad Rigetti.

Quantum computers rely on the weird physics of ultrasmall elements like atoms and photons to perform calculations that are impractical on the conventional computer processors that power smartphones, laptops and data centers. Advocates hope quantum computers will lead to more powerful vehicle batteries, new drugs, more efficient package delivery, more effective artificial intelligence and other breakthroughs.

So far, quantum computers are very expensive research projects. Rigetti is among a large group scrambling to be the first to quantum advantage, though. That includes tech giants like IBM, Google, Baidu and Intel and specialists like Quantinuum, IonQ, PsiQuantum, Pasqal and Silicon Quantum Computing.

"This is the new space race," Rigetti said in an exclusive interview ahead of the company's first investor day.

For the event, the company is revealing more details about its full technology array, including manufacturing, hardware, the applications its computers will run and the cloud services to reach customers. "We're building the full rocket," Rigetti said.

Although Rigetti isn't a household name, it holds weight in this world. In February, Rigetti raised $262 million and became one of a small number of publicly traded quantum computing companies. Although the company has been clear its quantum computing business is a long-term plan, investors have become more skeptical. Its stock price has dropped by about three quarters since going public, hurt most recently when Rigetti announced the delay of a $4 million US government contract that would have accounted for much of the company's annual revenue of about $12 million to $13 million.

The company argues it's got the right approach for the long run, though. It starts in early 2023 with Ankaa, a processor that includes 84 qubits, the fundamental data processing element in a quantum computer. Four of those ganged together are the foundation for Lyra, a 336-qubit machine. The names are astronomical: Ankaa is a star, and Lyra is a constellation.

Rigetti doesn't promise quantum advantage from the 336-qubit machine, but it's the company's hope. "We believe it's absolutely within the realm of possibility," Rigetti said.

Having more qubits is crucial to more sophisticated algorithms needed for quantum advantage. Rigetti hopes customers in the finance, automotive and government sectors will be eager to pay for that quantum computing horsepower. Auto companies could research new battery technologies and optimize their complex manufacturing operations, and financial services companies are always looking for better ways to spot trends and make trading decisions, Rigetti said.

Rigetti plans to link its Ankaa modules into larger machines: a 1,000-qubit computer in 2025 and a 4,000-qubit model in 2027.

Rigetti isn't the only company trying to build a rocket, though. IBM has a 127-qubit quantum computer today, with plans for a 433-qubit model in 2023 and more than 4,000 qubits in 2025. Although qubit count is only one measure of a quantum computer's utility, it's an important factor.

"What Rigetti is doing in terms of qubits pales in comparison to IBM," said Moor Insights & Strategy analsyt Paul Smith-Goodson.

Along with those machines, Rigetti expects developments in manufacturing, including a 5,000-square-foot expansion of the company's Fremont, California, chip fabrication facility now underway; improvements in the error correction technology necessary to perform more than the most fleeting quantum computing calculations; and better software and services so customers can actually use its machines.

Rigetti Computing's plans for improvements to its broad suite of quantum computing technology.

To reach its goals, Rigetti also announced four new deals at its investor event:

Qubits are easily perturbed, so coping with errors is critical to quantum computing progress, as is building hardware less prone to errors in the first place. Quantum computer makers track that with a measurement called gate fidelity. Rigetti is at 95% to 97% fidelity today, but prototypes for its fourth-generation Ankaa-based systems have shown 99%, Rigetti said.
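
To see why a few percentage points of gate fidelity matter so much, consider a simplified back-of-the-envelope model (ours, not Rigetti's methodology): if each gate succeeds independently with probability equal to its fidelity, a circuit's chance of running error-free decays exponentially with gate count.

```python
def circuit_success(fidelity: float, num_gates: int) -> float:
    # Toy model: gates fail independently, so success compounds multiplicatively.
    return fidelity ** num_gates

for fidelity in (0.95, 0.97, 0.99):
    # Chance a 100-gate circuit completes without a single gate error.
    print(f"{fidelity:.0%} gate fidelity -> "
          f"{circuit_success(fidelity, 100):.1%} success over 100 gates")
```

Under this model, 95% fidelity gives well under a 1% chance of a clean 100-gate run, while 99% lifts it to roughly a third, which is why the jump Rigetti reports is significant (and why error correction remains essential).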

In the eyes of analyst Smith-Goodson, quantum computing will become useful eventually, but there's plenty of uncertainty about how and when we'll get there.

"Everybody is working toward a million qubit machine," he said. "We're not sure which technology is really going to be the one that is going to actually make it."

See the original post:
Quantum Computing's Impact Could Come Sooner Than You Think - CNET

Are AI and Quantum Computing Infrastructure? The Feds Say Yes – MeriTalk

From the White House to the boathouse, infrastructure has traditionally been narrowly defined as the roads, bridges, waterways, and other projects that allowed a post-industrial America to flourish.

Not anymore.

In the latest sign that the technology revolution is moving in new directions, a six-line law with no name is helping to redefine the traditional notions of infrastructure to include artificial intelligence, quantum computing, and semiconductors.

The legislation, quietly signed by President Biden last month, amended a 2015 law widely known as a highway bill, as befitted its name: the Fixing America's Surface Transportation (FAST) Act.

A provision of the law provides for expedited Federal environmental and permitting review for covered infrastructure construction projects. Currently, those projects include some of the largest, most complex, and novel infrastructure projects in the U.S., such as massive pipelines and multibillion-dollar renewable energy projects, according to a Federal steering council overseeing them.

Now, with the recent change in the law, the projects potentially qualifying for speeded-up review also encompass "semiconductors, artificial intelligence and machine learning, high-performance computing and advanced computer hardware and software, quantum information science and technology, data storage and data management, (and) cybersecurity."

The amended law, titled only "An Act," added those computer-related projects.

The legislation's sponsor, Sen. Bill Hagerty, R-Tenn., says his intention is to boost national security, especially by fast-tracking permitting reviews of semiconductor plants expected to be built because of the Chips and Science Act. That law, also signed by Biden last month, provided funding incentives to establish such plants.

"I came to Washington to create jobs for the American people and bolster our national security to beat the Chinese Communist Party in the competition that will define the century," Hagerty said after Biden signed the FAST Act law on Aug. 16. His office called the FAST Act legislation a "watershed bill that enacts regulatory reform that benefits private-sector companies building products that are essential to American national and economic security."

A technology industry expert familiar with the legislation downplayed its effects, saying that Hagerty's bill "does not represent a collective movement to recast what critical infrastructure looks like. I think that smartly, what you're starting to see is more the ability to leverage technology as components of broader infrastructure projects. It doesn't make the components themselves infrastructure."

But the official summary of the bill by the respected Congressional Research Service calls AI, semiconductors and the other new technology projects now covered by the FAST Act "infrastructure projects."

And infrastructure experts say that redefinition has the potential to fast-track a variety of tech projects beyond the scope of what has long been considered critical infrastructure.

Anthony Lamanna, a professor at the Del E. Webb School of Construction at Arizona State University, says infrastructure has traditionally been viewed as the built environment for civilization: "your water, your sewage, your electric."

When he first read the FAST Act revision, Lamanna says, "I have a background in concrete and construction, so my gut was that the tech stuff doesn't really fit."

On further reflection, he says, "Maybe we start looking at this as the chip manufacturers are part of this future cyber infrastructure ... I think somebody coming up with this stuff seems to be thinking far into the future. By fast-tracking these projects, we're saying this is important to civilization in the future."

Adie Tomer, a senior fellow and infrastructure expert at the Brookings Institution, likened the language in Hagerty's bill to last year's high-profile Infrastructure Investment and Jobs Act, which he says "makes it explicit that the way infrastructure is construed is that broadband and digital technology is considered infrastructure."

"Clearly, we are modernizing our definition of physical infrastructure to include digital tech," says Tomer, who supports the change but says it also bears further scrutiny, because data storage facilities and other projects potentially covered by Hagerty's bill are privately owned.

"What should be the Federal relationship with the private owners of those kinds of facilities?" Tomer asked. "I don't think it's necessarily clear yet ... It's a critical area to watch."

On the day Biden signed the Infrastructure Investment and Jobs Act last year, a White House blog post hailing the legislation focused almost exclusively on projects such as roads, bridges, and rail, along with broadband.

But a MeriTalk review of the legislation shows that the word "digital" appears 144 times, including a Federal requirement to adopt digital management systems on construction sites using "state-of-the-art automated and connected machinery and optimized routing software."

The bill also requires the administration to report to Congress on using "digital tools and platforms as climate solutions," including AI and blockchain technologies.

The move towards redefining infrastructure for the tech age echoes recent developments in Europe, where the European Union adopted tougher cybersecurity rules for network and information systems. The European Commission, which proposed the measures, defined them as "critical infrastructure protection" that would make Europe "fit for the digital age."

In Washington, the FAST Act legislation was introduced in the Senate by Hagerty and several co-sponsors on Jan. 10 and passed the same day by unanimous consent. After a brief floor debate, it cleared the House in July by a vote of 303-89.

During the debate, Rep. Jim Costa, D-Calif., called the legislation "a commonsense bill that will build on the progress we are already making today with the CHIPS and Science Act."

"The bill here simply adds key national security-related technologies, like semiconductors, to the types of projects that are eligible for an existing Federal program that improves the coordination between Federal departments on permitting," Costa added.

That environmental review and permitting process, Hagerty has said, should be much speedier for the tech projects now covered by his bill, "dramatically (reducing) the time required to stand up new manufacturing capacity in strategically critical sectors, such as semiconductor fabrication."

The Federal Permitting Improvement Steering Council oversees that expedited permitting process. When it added another industry to the eligible projects last year, it chose one decidedly more traditional than high tech: mining.

"Mining is an important infrastructure sector," the body wrote.

Continued here:
Are AI and Quantum Computing Infrastructure? The Feds Say Yes - MeriTalk