Archive for the ‘Quantum Computer’ Category

D-Wave is the third quantum startup to SPAC in less than a year – Fast Company

D-Wave completed a planned merger on Monday with DPCM Capital (the latter of which was already listed on the New York Stock Exchange), making the Canada-based firm the third quantum player to go public via a SPAC, that is, a special purpose acquisition company, within the last year. (The other companies? Rigetti and IonQ.)

It's an interesting trend, but perhaps not a surprising one: According to D-Wave CEO Alan Baratz, the until-recently-obscure financial quirk offers his company, one that's in a still-budding sector, faster access to capital.

"In some sense, SPACs are ideal for a company that has huge potential but is going to take some time to mature," he tells Fast Company. "With a SPAC, you're able to tap into the funding sources in the public markets to accelerate your growth and do it based on the future potential."

"A traditional IPO, on the other hand, is all about today," he adds.

SPACs can also save companies money (though this point is subject to some debate). "I don't think all SPACs should be discounted," says Patrick Moorhead of Moor Insights & Strategy, a consulting firm. "It's a much less expensive way to go public and takes less time and effort."

So far, D-Wave's post-SPAC stock is holding its own. It opened at $9.98 Monday and closed at $11.86 on Thursday. But Rigetti and IonQ haven't fared as well. Rigetti has seen its shares drop in value by roughly half since its listing on the NASDAQ in March. IonQ's shares have lost about 40% of their value since its listing in October 2021.

In the young field of quantum computing, D-Wave has emerged as a major character. Back in 2011, the company became the first to actually sell a quantum computer; it now counts NASA, Google, and Lockheed Martin as customers.

Building and operating a quantum computer is an extraordinary feat of science and engineering. Instead of the bits used in traditional computers (which can be set to zero or one), quantum computers use qubits, often built from subatomic particles, which can represent zero, one, or a blend of both at the same time (a superposition). Qubits can also entangle to represent values in extremely complex problems. In order to take advantage of these properties, the computer has to control the state of the qubits, whose behavior is governed by quantum physics, not classical physics. This is very hard, and usually involves supercooling the hardware to suppress thermal noise, then using lasers, microwaves, or electrical pulses to control the qubits' states.
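The superposition idea above can be sketched in a few lines of plain Python. This is a toy model, not real quantum hardware: a single qubit's state is just two complex amplitudes, and the squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math

# A single qubit's state is a pair of complex amplitudes (alpha, beta)
# normalized so |alpha|^2 + |beta|^2 = 1. Measuring the qubit yields 0
# with probability |alpha|^2 and 1 with probability |beta|^2.
def measurement_probabilities(alpha: complex, beta: complex):
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0, abs_tol=1e-9), "amplitudes must be normalized"
    return p0, p1

# An equal superposition: the qubit is "zero and one at the same time"
# until measured, when it collapses to either outcome, 50/50.
alpha = beta = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(alpha, beta)
print(round(p0, 6), round(p1, 6))  # 0.5 0.5
```

A classical bit would be the special case where one amplitude is 1 and the other is 0; everything in between is what the article means by "many values between zero and one."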

D-Wave was able to get to market with a quantum computer because it adopted a unique approach to working with the qubits, one that asks far less of them. "What it's looking for is the minimum energy level within a qubit, and by finding the minimum energy level, then they're able to find the most optimized solution to a problem," says Heather West, research manager at research firm IDC. "And that's why D-Wave is able to say they have 5,000 to 7,000 qubits in their system versus an IBM, which is still down around 127."

Even though that approach, called quantum annealing, doesn't try to exert a lot of control over the states of the qubits, it's still very useful for solving optimization problems, that is, problems where the goal is to find the best solution among a huge number of possibilities. An optimization problem might be finding the optimal routes and cargoes for a large fleet of delivery trucks, or finding the optimal number of employees to schedule on a given day. It's a common type of business puzzle, and annealers are especially good at solving them.
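To make "finding the minimum energy level" concrete: annealers are typically fed problems in QUBO form (quadratic unconstrained binary optimization), and the machine physically settles into a low-energy assignment. The sketch below uses made-up coefficients and classical brute force in place of real annealing hardware, which is feasible only because the toy problem has just three variables.

```python
from itertools import product

# A toy QUBO: energy(x) = sum of Q[i][j] * x_i * x_j over the entries
# below, with x_i in {0, 1}. Diagonal entries are linear terms.
# These coefficients are invented purely for illustration.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,   # linear terms
    (0, 1): 2.0, (0, 2): -0.5, (1, 2): -0.5,   # pairwise couplings
}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# An annealer searches this energy landscape physically; classically we
# can brute-force all 2^3 = 8 assignments because the problem is tiny.
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))  # a ground state: (0,1,0) or (1,0,0), energy -1.0
```

Real scheduling or routing problems are encoded the same way, just with thousands of variables, which is exactly where brute force fails and an annealer's hardware search becomes interesting.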

"Some of these industries really gravitated toward D-Wave because of those optimization problems, and being able to pull in all sorts of data to find these optimized solutions and solving problems faster was really appealing," West says.

That application is a good example of the way companies are using quantum services like D-Wave today. They're looking for problem types where classical computers struggle and quantum computers excel.

"They [D-Wave] are really more of an accelerator," says Ashish Nadkarni, group VP and general manager at IDC. "We are not at the point where you can completely run all kinds of jobs on a quantum computer."

But D-Wave's annealer may eventually be seen as a forerunner to a more robust kind of quantum computing, called gate model, in which the quantum computer takes full advantage of the quantum properties of the qubits: their many possible states, their capacity for superposition, and the compute power enabled by multiple qubits entangling with each other.
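The gate-model approach mentioned here manipulates qubits with a sequence of gates, much as classical circuits apply logic gates to bits. A minimal sketch using numpy (a simulation, not real hardware) shows the canonical two-gate circuit, Hadamard then CNOT, producing an entangled pair:

```python
import numpy as np

# Gate-model sketch: apply a Hadamard gate then a CNOT to two qubits
# starting in |00>, yielding the entangled Bell state (|00> + |11>)/sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # flips qubit 1 iff qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                  # |00>
state = CNOT @ np.kron(H, I) @ state            # H on qubit 0, then CNOT

# Only outcomes 00 and 11 ever appear, with perfectly correlated qubits:
# that correlation is the entanglement gate-model machines exploit.
print(np.round(np.abs(state) ** 2, 3))          # [0.5 0.  0.  0.5]
```

An annealer never composes circuits like this; having to apply arbitrary gates while keeping qubits coherent is precisely why the gate model "asks far more" of the hardware.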

Controlling and leveraging these properties opens the possibility of solving problems that are far beyond the reach of classical supercomputers (and annealers). These are large probabilistic problems where the qubits are asked to model huge and complex data sets. It could be modeling all the receptors in the brain to explore how they'll react to a drug, or a huge array of stock market conditions to predict their effect on the price of a certain commodity.

Realizing that much of the upside and excitement around quantum computing comes from the possibility of solving such problems, D-Wave announced last year that it had begun building gate-model quantum computers more like the ones built by Google, IBM, and IonQ. D-Wave will need years to develop its gate-model machines, but Baratz believes offering both annealing and gate-model quantum computing will eventually put his company at an advantage.

"By doing both, and being the only company that's doing both, we're the only company in the world that will be able to address the full market for quantum, and the full set of use cases," he says. D-Wave's customers typically tap into these computing services via a dedicated cloud service.

Because quantum is considered a nascent technology, many potential customers (such as companies in the financial services and pharmaceutical industries) are experimenting with running certain types of algorithms on quantum systems to look for some advantage over classical computing. But they're not necessarily paying customers.

Baratz says that it's the gate-model quantum services that are nascent technology, not D-Wave's annealers, which he says are ready to deliver real value today. He believes the gate-model quantum computers are still as many as seven years away from being able to run general business applications in a way that beats classical computers.

Baratz believes that D-Wave is now challenged to make sure customers differentiate between gate-model computing, which he says could be as many as seven years away from running real business applications, and D-Wave's quantum annealing service, which is mature and ready to deliver value today. While his gate-model competitors are out telling customers it's okay to dip their toes into the water and experiment, D-Wave must counter that narrative in the marketplace with the message that customers can be doing real optimization work using quantum annealing now.

"We truly are commercial, so when our competitors talk about revenue, they talk about government research grants as revenue, and they talk about national labs and academic institutions as customers," Baratz says. "When we talk about our customers, we talk about our recently announced deal with MasterCard, or Deloitte or Johnson & Johnson or Volkswagen."

Baratz says over 65% of D-Wave's quantum cloud revenue last year came from more than 50 commercial customers, which include over two dozen members of the Forbes Global 2000.

Baratz says D-Wave is now entering a phase in which it can leverage its annealers to start customer relationships.

"We do have a significant head start, but we think now is the time to really make the investment to grow that loyal customer base and get the market share," Baratz says. "And then, as we bring new generations of annealing to market, it's just an upsell to more complex applications as we bring gate [model] to market."

Read more from the original source:
D-Wave is the third quantum startup to SPAC in less than a year - Fast Company

Open hybrid cloud and quantum computing shape future for Red Hat thought leaders – SiliconANGLE News

This year's Red Hat Summit gathering in early May provided an opportunity to step back from the enterprise computing treadmill and assess the long-term implications of where network innovation is headed.

Along with news surrounding an edge platform opportunity with General Motors Corp. and the latest release of Red Hat Enterprise Linux, this year's gathering in Boston offered a glimpse into the computing future.

Through SiliconANGLE's exclusive onsite coverage of the Summit and a closer analysis of multiple interviews with Red Hat Inc. thought leaders over the past two years on theCUBE, a clearer picture emerges. Three major areas are high on the priority list for the company's top executives: building platforms for managed services, the open hybrid cloud and quantum computing.

Matt Hicks was recently named Red Hat CEO. Photo: SiliconANGLE

These areas are being driven by Red Hat's longtime commitment to the open-source community. Open source remains the company's wellspring, and Red Hat pays close attention to which way the water flows.

"For us, when you see open-source projects, they definitely get to a critical mass where you have so much contribution, so much innovation there, they're going to be able to follow the trends pretty well," Matt Hicks, the newly appointed chief executive of Red Hat, said in an interview with theCUBE analysts. "That's been our model, though; it's to find those projects, be influential in them, be able to drive value in lifecycles."

One project that Red Hat fully supports involves providing services for the managed cloud. The company announced a number of offerings in this area during its Summit event in 2021, and it has continued to build on those with OpenShift as the foundation.

The managed cloud is generally defined as a suite of services with partial or complete management of cloud resources. The new managed cloud services portfolio that Red Hat announced last year meant that OpenShift was now available on all of the major cloud provider platforms.

To further its managed cloud vision, Red Hat launched OpenShift Streams for Apache Kafka, an add-on for OpenShift Dedicated called API Management Service, and OpenShift Data Science. The key behind the companys managed cloud strategy is to provide enterprises with an ability to control the flow of data across different environments.

Clayton Coleman envisions an open hybrid cloud future. Photo: SiliconANGLE

A visible proponent of the managed cloud has been Clayton Coleman (pictured) who, until recently, was Red Hats chief technical officer for hybrid cloud. Coleman took a new position in June as a distinguished engineer at Google LLC.

"We're trying to continue to deliver the best experience, the best operational reliability that we can so that the choice of where to run your cloud or where you run your applications matches the decisions you've already made and where your future investments are going to be," Coleman said in an interview with theCUBE at the time of the OpenShift releases. "We want to be where customers are but also want to give you that consistency that has been the hallmark of OpenShift since the beginning."

This quest for consistency has infused much of Red Hat's strategic focus over the years, as it has pursued open-source innovation from the data center to the cloud and edge. Red Hat's vision of the open hybrid cloud is guided by partnerships and technology advances, and one top executive sees this coming from the processor side.

Paul Cormier speaks at Red Hat Summit in Boston, May 2022. Photo: m.albertson

At this year's Summit, former CEO and now Red Hat Chairman Paul Cormier (pictured) declared that the open hybrid cloud would be defined by hardware innovation at the edge. Cormier's point was that innovation from processor firms such as Nvidia Corp., Arm Ltd. and Intel Corp. would play a central role in the future of open hybrid cloud. To underscore this point, senior executives from Intel and Nvidia made prominent appearances during the Summit keynote sessions in May, two months after Red Hat added support for OpenShift on Arm processors.

SiliconANGLEs analysis of market data from Enterprise Technology Research last fall pointed to the ascendance of hybrid cloud as an enterprise information technology force. Red Hat is betting that open-source code can provide the foundation for creating systems and environments that seamlessly cross a multitude of platforms.

The company has also backed up its bet with a significant research investment. In April 2021, Red Hat announced the donation of software subscriptions valued at half a billion dollars to Boston University for open hybrid cloud research. The collaboration will focus on operations and systems research using upstream and production environment code.

"It's really giving you that secure, flexible, fast innovation backbone for cloud-native computing," Hicks said in an interview about open hybrid cloud in 2021. "I hope we'll see an explosion of innovation that comes out, and I hope customers see the benefits of doing that on an open hybrid cloud model."

Thought leaders within the Red Hat community are also looking beyond managed services and the open hybrid cloud for the next wave of innovation. Computing's future will likely be impacted by current research in the quantum field.

Quantum computing's potential lies in its ability to hold multiple states at once, a feature known as superposition. While classical computing models are based on bits with a value of 1 or 0, a qubit can be 1, 0 or a superposition of both. This sets the stage for a significant boost in computing power and a future tied to quantum supremacy as both Red Hat and IBM pursue research initiatives in this field.

Parul Singh guides Red Hat's work toward quantum supremacy. Photo: SiliconANGLE

"Quantum supremacy plays a very important role in the roadmap that we've been working on," said Parul Singh (pictured), senior software engineer at Red Hat, in an interview with theCUBE. "Let's say that you have any program that you run or any problem that you solve on a classical computer. A quantum computer would give you the results faster, so that's how we define quantum supremacy."

The cloud offers a potential platform to host quantum services. In partnership with IBM, Red Hat has demonstrated how to make quantum systems work through the use of an OpenShift 4 cluster and Qiskit, an open-source software development kit for quantum computing.

IBM has made quantum a significant strategic priority. The company announced plans to deliver a 4,000-plus-qubit system by 2025, and IBM CEO Arvind Krishna spent much of his press briefing during the firm's annual conference in May describing progress toward a quantum-fueled future.

There are still challenges ahead for researchers in scaling quantum technology and bringing 4,000 qubits from a simulator to the physical core of a computer. Meanwhile, Red Hat is laying the groundwork to bridge the classical and quantum worlds and democratize the technology for wider use.

"Quantum computers are there, but it is not easily accessible for everyone to consume because it is a very new area," Singh said. "You have a classical world and a quantum world, and that's where a lot of thought process has been. What we are trying to do is establish best practices so you can have classical components exchanging data with quantum."

See the original post here:
Open hybrid cloud and quantum computing shape future for Red Hat thought leaders - SiliconANGLE News

The truth about quantum risk cryptography and being quantum safe – VentureBeat

The creation of classical computing may have paved the way for the modern enterprise, but it's also barely scratched the surface of the limits of data processing potential. In the future, quantum computers will amplify the resources that organizations have available to process their data.

While quantum computing will unlock powerful analytics and artificial intelligence (AI) processing capabilities, it also opens the door to serious security vulnerabilities, due to the ability of these computers to decrypt public-key algorithms.

This would give cybercriminals and nation-states the ability to openly decrypt information protected by public-key algorithms not just in the future, but also retrospectively by collecting encrypted data today to decrypt when quantum computers finally reach maturity.

Researchers estimate that quantum computers could be able to do this as soon as 2030. But with the Biden administration's CHIPS and Science Act [subscription required] approved by Congress last week, setting aside $52 billion in subsidies to support semiconductor manufacturers and $200 billion to aid research in AI, robotics and quantum computing, this development could happen much sooner.

The idea of quantum risk dates back to 1994, when mathematician and researcher Peter Shor created Shor's algorithm, showing that it was theoretically possible for a quantum computer to break cryptographic algorithms through efficient number factorization.

This first highlighted the vulnerability of public-key algorithms whose security rests on the difficulty of factoring large numbers. However, not all forms of public-key encryption are as susceptible to exploitation as others, so it's important not to panic about quantum risk.
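Why does factoring matter so much? In RSA, the public modulus n is the product of two secret primes, and anyone who factors n can rebuild the private key. The toy below uses textbook-sized numbers and classical trial division as a stand-in for Shor's algorithm; the threat is that a quantum computer would make the factoring step fast even for the 2048-bit moduli in real use.

```python
# Toy illustration of why efficient factoring breaks RSA.
# Classical trial division stands in here for Shor's quantum speedup;
# it only works because the numbers are deliberately tiny.
def trial_division(n):
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime")

n, e = 3233, 17            # public key (textbook example: n = 53 * 61)
p, q = trial_division(n)   # the step Shor's algorithm would make fast
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private exponent recovered from the factors

msg = 65
ciphertext = pow(msg, e, n)            # encrypted under the public key
assert pow(ciphertext, d, n) == msg    # attacker now decrypts at will
print(p, q, d)  # 53 61 2753
```

This is also the logic behind "harvest now, decrypt later": ciphertext captured today stays readable forever once the factoring step becomes cheap.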

"Quantum computers cracking crypto sounds scary and will get people reading, but the reality is much more nuanced. Will some types of QC eventually be able to decode some of today's best crypto? Almost certainly. Will we have time to put measures in place before that happens? Signs point to yes," said Brian Hopkins, Forrester analyst.

Hopkins explains that, on the one hand, asymmetric key encryption algorithms such as those underpinning PKI are the most vulnerable, while symmetric key encryption is much less vulnerable, and one-time pads would remain pretty much unbreakable.

For Hopkins, the main risk posed by quantum computers lies in the fact that small advances in their infrastructure can outstrip classical systems and rapidly change the threat landscape.

"If one of these firms [IBM, HPE, IonQ, Rigetti] figures out how to scale high-quality qubits more easily, we could see machines that double or triple in qubit number and quality every year to 18 months," Hopkins said. "That means we could go from nothing to 'oh no' in a few months."

Although it's unclear when quantum computers will have the ability to decrypt public-key algorithms, many commentators are concerned that threat actors and nation-states are in the process of stockpiling data that's encrypted today, which they will then decrypt when quantum computing advances.

"One of the biggest risks at present is what's known as an HNDL attack. This is an acronym for 'harvest now, decrypt later,' where encrypted data is captured, stored and held onto until a quantum computer is able to unlock it," said Vikram Sharma, founder and CEO of QuintessenceLabs.

"While this intercepted data is encrypted, this is a false sense of security; it will easily be decrypted by a threat actor with access to a quantum computer," Sharma said. "Above all, new investments in quantum tech and geopolitical motivations mean the quantum risk threat has shifted from a question of if to a question of when."

One of the challenges around reacting to post-quantum threats is the lack of certainty around the future threat landscape, and what technologies are required to defend against them. Together, these factors make it difficult to justify investment in preventative and defensive post-quantum technologies.

Fortunately, post-quantum cryptography (PQC) solutions, essentially encryption schemes designed to withstand attack by quantum computers, offer a strong answer to these next-generation threats.

The key to being prepared for the evolving threat landscape is to act quickly. As Sharma said, "By the time companies start feeling risk from a quantum computer, it will be much too late, because data that was stolen years ago will have been decrypted."

A simple first step is for organizations to start identifying data assets that could be vulnerable to the decryption of public-key algorithms. Conducting a quantum risk assessment can help them identify the impact a post-quantum incident could have on the organization as a whole.

With this information, security leaders can start to build a business case to justify spending on quantum resilience, identifying the potential financial impact of such an event and putting forward a proposed timeline to adopt defensive solutions like PQC, quantum key distribution (QKD) or quantum random number generation (QRNG).

Just a month ago, NIST announced the first four post-quantum algorithms it has chosen for its new post-quantum cryptographic standards.

"This means those organizations facing advanced persistent threats (from nation-states, in particular) now have guidance on how to select quantum-resistant encryption for their highest-secrecy data moving forward," said Kayne McGladrey, IEEE senior member.

As part of the announcement, NIST selected some core algorithms for enterprise use cases. These include the CRYSTALS-Kyber algorithm for general encryption, and CRYSTALS-Dilithium, FALCON and SPHINCS+ for digital signatures (although it recommended Dilithium as the primary digital signature algorithm).

Vadim Lyubashevsky, a cryptography research scientist at IBM who worked on Kyber and Dilithium, explains that the CRYSTALS-Kyber algorithm is extremely fast, with short public-key and ciphertext sizes, while Dilithium is advantageous over FALCON because it's easier to implement and less error-prone.

Though these solutions are effective, Lyubashevsky warns that organizations should expect to adopt quantum-safe encryption alongside traditional public-key algorithms rather than replace them outright.

"Realistically, what organizations should expect to implement are hybrid strategies that blend both quantum-safe protocols with existing cryptographic standards to ensure data is secure and protected against threats that exist now and that will arise in the near future," Lyubashevsky said.
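One common way such hybrid strategies are realized is to derive the session key from both a classical shared secret and a post-quantum one, so the session stays secure as long as either scheme holds up. The sketch below is a simplified stand-in, not any protocol's actual key schedule: the two "shared secrets" are random placeholders for real ECDH and KEM handshake outputs, and the combiner is an HKDF-extract-style HMAC.

```python
import hashlib
import hmac
import secrets

# Placeholders, NOT real handshake output: stand-ins for an ECDH shared
# secret and a post-quantum (Kyber-style) KEM shared secret.
classical_ss = secrets.token_bytes(32)
pq_ss = secrets.token_bytes(32)

def hybrid_key(classical: bytes, post_quantum: bytes,
               label: bytes = b"hybrid-kex") -> bytes:
    # HKDF-extract-style combiner: an attacker must recover BOTH inputs
    # to learn the session key, so breaking only RSA/ECDH is not enough.
    return hmac.new(label, classical + post_quantum, hashlib.sha256).digest()

session_key = hybrid_key(classical_ss, pq_ss)
print(len(session_key))  # 32
```

The design choice worth noting is that the hybrid key is never weaker than the stronger of its two inputs, which is why standards bodies favor this transition path over an immediate, wholesale swap to PQC.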

"As the era of quantum computing may arrive very soon, it is worth starting early on the journey to move from safe to quantum safe. The first step to get there is education: Understand quantum-safe cryptography and what its implications are for your organization. Partner with cryptographic experts to future-proof data encryption and make decisions that will protect your systems well into the future," Lyubashevsky said.

Read the original here:
The truth about quantum risk cryptography and being quantum safe - VentureBeat

Researchers Find Breakthrough on Quantum Computing With Silicon Chips – TechAcute

Researchers from Simon Fraser University were successful in making a breakthrough in the field of quantum technology development. Their study paves the way for creating silicon-based quantum computing processors compatible with the existing semiconductor manufacturing technology.

The researchers light up tiny defects in silicon chips with intense beams of light. Stephanie Simmons, the principal investigator of the research, explains that these imperfections in the chips serve as information carriers. The investigators point out that the tiny defects reflect the transmitted light.

Some naturally occurring silicon imperfections may act as quantum bits, or qubits. Scientists consider these defects spin qubits. Previous research has also shown that silicon can produce long-lived and stable qubits.

Daniel Higginbottom, the study's lead author, considers this breakthrough promising. He explains that the researchers were able to combine silicon defects with quantum physics in a way previously considered impossible.

Furthermore, he notes that while silicon defects were studied extensively from the 1970s to the 1990s and quantum physics research has been underway for decades, it's only now that the two fields have come together. He says that by utilizing optical technology in silicon defects, "[they've] found something with applications in quantum technology that's certainly remarkable."

Simmons acknowledges that quantum computing is the future of computers, with its capability to solve both simple and complex problems; however, it's still in its early stages. But with the use of silicon chips, the process can become more streamlined and bring quantum computing to the public faster than expected.

This study demonstrates the possibility of making quantum computers with enough power and scale to manage significant computation. It gives an opportunity for advancements in the fields of cybersecurity, chemistry, medicine, and other fields.

Photo credit: The feature image is symbolic and has been taken by Solar Seven. Sources: Chat News Today / Quantum Inspire

Originally posted here:
Researchers Find Breakthrough on Quantum Computing With Silicon Chips - TechAcute

Quantum Technologies Market Research Report 2022 – Global Forecast to 2030: Strategic Collaboration, Mergers, and Technology Partnerships -…

DUBLIN--(BUSINESS WIRE)--The "Quantum Technologies Global Market - Forecast to 2030" report has been added to ResearchAndMarkets.com's offering.

The quantum technologies global market is expected to grow at a high double-digit CAGR from 2021 to 2030 to reach $3,518.3 million by 2030.

Factors such as growing government and private venture funding for quantum technologies, increasing R&D expenditure by major technology companies to develop quantum technologies, and strategic collaborations, partnerships, and mergers are driving the quantum technologies global market.

Meanwhile, the emergence of mobile and convenient quantum processors and the development of advanced quantum technologies provide immense growth opportunities for the market. The lack of skilled professionals, the high cost and complexity associated with developing quantum technologies, and the cryptographic risk associated with quantum communications are hindering market growth.

The market for quantum technologies is segmented based on technology, products, end-user, and geography. Based on technology, the market is segmented into Quantum Computing, Quantum Sensing, and Quantum Communication. Among these, the Quantum Sensing segment accounted for the highest revenue in 2021 and is expected to grow at an early-teen CAGR from 2021 to 2030.

Quantum Computing is expected to grow at a high double-digit CAGR from 2021 to 2030. Based on the types of sensors, the Quantum Sensing global market is further segmented into Atomic Clocks, Magnetic Sensors, PAR Sensors, and Others. Among the sensors, the Atomic Clocks segment accounted for the highest revenue in 2021 and is expected to grow at an early-teen CAGR from 2021 to 2030.

Magnetic Sensors are expected to grow at a mid-teen CAGR from 2021 to 2030. Quantum computing is further segmented based on application and deployment. Based on application, the quantum computing global market is segmented into Machine Learning, Optimization, and Simulations. Among these, the Optimization segment accounted for the highest revenue in 2021 and is expected to grow at a high double-digit CAGR from 2021 to 2030.

The Simulations segment is expected to grow at a high double-digit CAGR from 2021 to 2030. Based on deployment, the quantum computing global market is sub-segmented into on-premise and cloud-based. Among these, cloud-based deployment accounted for the highest revenue in 2021 and is expected to grow at a high double-digit CAGR from 2021 to 2030.

Based on product, the quantum technologies global market is divided into hardware, software, and services. Among these, the Hardware segment accounted for the highest revenue in 2021 and is expected to grow at a mid-teen CAGR from 2021 to 2030. The Services segment is expected to grow at a high double-digit CAGR from 2021 to 2030.

Based on end-users, the quantum technologies global market is segmented into Healthcare; Banking, Financial Services and Insurance (BFSI); Energy, Oil and Gas; Chemical & Material Science; Logistics and Distribution; Aerospace; Defense; and Others. Among these, the Aerospace segment accounted for the highest revenue in 2021 and is expected to grow at a high double-digit CAGR from 2021 to 2030. The Healthcare segment is also expected to grow at a high double-digit CAGR from 2021 to 2030.

North America accounted for the largest revenue in 2021 and is expected to grow at a high double-digit CAGR from 2021 to 2030. Factors driving the quantum technologies market in the region include increasing R&D expenditure, growing industries, the establishment of quantum research centers, the development of a national strategy by the government, the presence of major technology companies, increasing collaboration between quantum technology companies and industries, and a growing number of quantum computing start-ups.

Europe is expected to grow at a high double-digit CAGR from 2021 to 2030. Drivers in the region include growing industries; the launch of government-funded quantum research programs; the formation of consortiums that bring together large industrial partners, small and medium-sized enterprises (SMEs), start-ups, and research organizations to turn quantum computers into usable industrial applications; increasing private venture funding; increasing collaboration between quantum technology companies and industries, as well as with other countries; mergers between quantum technology companies to explore different application areas; and growing start-up activity.

The quantum technologies global market is competitive, and the players in this market are involved in strategic collaborations, partnerships, mergers, and new product launches to expand their product portfolios and maintain their market shares.

Factors Influencing Market

Drivers and Opportunities

Restraints and Threats

Porter's Five Force Analysis

Patent Analysis

Funding Analysis

Deal Analysis

Quantum Technologies (New Product Launch)

Quantum Technology Partnerships

Matrix of Quantum Technologies Companies

Market Share Analysis Based on Major Players

The key players in the quantum technologies global market include

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/ib6vc3

More here:
Quantum Technologies Market Research Report 2022 - Global Forecast to 2030: Strategic Collaboration, Mergers, and Technology Partnerships -...