Archive for the ‘Quantum Computing’ Category

Zapata Computing and KAUST Partner to Bring Quantum Computing to the Middle East for the Advancement of Computational Fluid Dynamics – Yahoo Finance

Using Zapata's quantum workflows platform, Orquestra, KAUST will explore how quantum computing can simulate and optimize the aerodynamic design process for vehicles

BOSTON, March 23, 2021 (GLOBE NEWSWIRE) -- Zapata Computing, Inc., the leading enterprise software company for NISQ-based quantum applications, today announced a new partnership with Middle East-based King Abdullah University of Science and Technology (KAUST) to be a licensed user of Zapata's Orquestra, the modular, workflow-based platform for applied quantum computing. KAUST is examining various lines of research to determine how quantum technologies could represent an advantage over classical compute tools in a variety of Computational Fluid Dynamics (CFD) use cases for airplane and automobile aerodynamic design.

Currently, CFD computations are extremely time-consuming and expensive to run. The simulation process is inefficient, and a great deal of time is spent trying to model air flow around wings and engines more efficiently. Improving those design workflows could allow manufacturers to build more energy-efficient airplanes and lower the carbon emissions of air travel, which would have an enormous positive impact on the environment. Air transportation is responsible for roughly 2% of greenhouse gas emissions overall. For airlines and plane manufacturers, this could drive meaningful financial and environmental results, all supported by new quantum technology.

Home to the KAUST Research and Technology Park (KRTP) where R&D centers, corporates and start-ups choose to locate themselves, the university has a track record of collaborating with industry partners at national and international levels to transfer research-based technology into the market to achieve public benefit.

"We are delighted to be the catalyst for bringing quantum capabilities to CFD research in the Kingdom of Saudi Arabia and the Middle East," said Kevin Cullen, Vice President of Innovation and Economic Development at KAUST. "This partnership establishes Zapata as one of the first quantum computing companies active in the region and will enable KAUST researchers to explore the future of aerospace fluid dynamics. KAUST is a leader in data analysis and AI, and we welcome the addition of Zapata's Orquestra technology to our capabilities to accelerate discovery and innovation in these fields."


Zapata's Orquestra platform improves data analytics performance, empowering companies and research organizations to build quantum-enabled workflows, execute them across the full range of quantum and classical devices, and then collect and analyze the resulting data. With Orquestra, organizations can leverage quantum capabilities to generate augmented data sets, speed up data analysis, and construct better data models for a range of applications. Importantly, it provides organizations with the most flexible, applied toolset in quantum computing, so users can build quantum capabilities without getting locked in to a single vendor or architecture in the coming years.

"We are always looking to expand quantum computing use cases through Orquestra, and our work with KAUST will give us a head start in exploring new opportunities for more efficient CFD," said Christopher Savoie, co-founder and CEO of Zapata. "The collaboration with KAUST will benefit the aerospace industry as a whole by using quantum to bring efficiency to what has historically been a slow and difficult process."

About Zapata Computing
Zapata Computing, Inc. builds quantum-ready applications for enterprise deployment through its flagship product, Orquestra, the only workflow-based toolset for enterprise quantum computing. Zapata has pioneered a new quantum-classical development and deployment paradigm that focuses on a range of use cases, including ML, optimization and simulation. Orquestra integrates best-in-class classical and quantum technologies, including Zapata's leading-edge algorithms, open-source libraries in Python and Julia, and more. Zapata partners closely with hardware providers across the quantum ecosystem such as Amazon, Google, Honeywell, IBM, IonQ, Microsoft and Rigetti. Investors include BASF Venture Capital, Honeywell Ventures, Itochu Corporation and Merck Global Health. Enterprise customers include Merck, BP, BBVA, KAUST and Coca-Cola Bottlers Japan Inc., among others.

For more information visit http://www.ZapataComputing.com and http://www.Orquestra.io.

About KAUST
Established in 2009, King Abdullah University of Science and Technology (KAUST) is a graduate research university devoted to finding solutions for some of the world's most pressing scientific and technological challenges in the areas of food, water, energy and the environment. With 19 research areas related to these themes and state-of-the-art labs, KAUST has created a collaborative and interdisciplinary problem-solving environment, which has resulted in over 11,000 published papers to date.

With over 100 different nationalities living, working and studying on campus, KAUST has brought together the best minds and ideas from around the world with the goal of advancing science and technology through distinctive and collaborative research. KAUST is a catalyst for innovation, economic development and social prosperity in Saudi Arabia and the world. For additional information, visit: http://www.Kaust.edu.sa

Media Contact: Anya Nelson, Scratch Marketing + Media for Zapata Computing, anyan@scratchmm.com, 617.817.6559


Zapata and KAUST to bring quantum computing to the region – Construction Business News

Saudi-based King Abdullah University of Science and Technology (KAUST) will be a licensed user of Zapata Computing's Orquestra, the modular, workflow-based platform for applied quantum computing.



The Pillars of Future Cryptography at IBM – InfoQ.com

In a recent webinar, IBM summarized the latest advances in the cryptographic technologies the company has been working on, including confidential computing, quantum-safe cryptography, and fully homomorphic encryption.

According to Gosia Steinder, IBM Hybrid Cloud Research CTO, each of those technologies is solving a different piece of the security equation.

Confidential computing is IBM's moniker for secure enclave-based cryptography in the cloud:

Confidential computing provides hardware-level privacy assurance by encrypting data within a secure enclave that not even the cloud provider can view or access.

"This enables users to run workloads in the cloud or on-premises with maximum privacy and control, even when they don't own the infrastructure they are using," says Hillery Hunter, VP and CTO of IBM Cloud.

Confidential computing is not only relevant to guaranteeing data privacy in the cloud but also to ensuring data integrity and preventing anyone from tampering with the data, says Samuel Brack, CTO of open-source financial platform DIA. The alternative to using confidential computing would be a decentralized approach with increased costs and reduced performance, he adds.

Looking to the future, quantum computing is known to pose a serious challenge to cryptography, says IBM cryptography researcher Vadim Lyubashevsky. As he explains, some of today's cryptography is based on factoring, a problem that is considered hard for classical computers but that quantum computers can solve efficiently. For example, says Lyubashevsky, an integer with a thousand digits could require billions of years to factor on classical hardware, while a quantum computer could do it in a couple of hours.

A particularly worrisome dimension of this is highlighted by Dustin Moody, a mathematician at NIST who is working on defining standards for post-quantum cryptography. While quantum hardware is not there yet, the mere possibility of its existence means encrypted data is potentially under threat of attack now: somebody could take hold of that data and wait for quantum hardware to become available to decrypt it. As a consequence, he says, you may not be protecting your data for as long as you hope.

As Moody recounts, NIST is running an open process to select the best cryptosystems, based on security and performance. Currently, seven encryption schemes have advanced to round 2 of the selection process, out of 69 initial competitors. The expectation is to have a draft standard for the first quantum-resistant algorithms at the beginning of 2022, with standardization completed by 2024 after a period of public comment.

Transition will not be easy, though, says Moody:

We're dealing with algorithms that are a lot more complex in terms of the math they use and some of the characteristics that they have. They also have things like larger key sizes, so we're trying to prepare as much as we can and encourage others to do so.

Four of the quantum-safe algorithms that made it to phase 2 were initially proposed by IBM, highlights Lyubashevsky, and they are available through the open-source Cryptographic Suite for Algebraic Lattices (CRYSTALS).

These schemes derive their security from the presumed algorithmic hardness of so-called lattice problems.

In other words, unlike integer factoring, lattice problems are thought to be hard even for quantum computers. To understand what lattice problems look like, Lyubashevsky suggests a simple example. Say you have a public list of six numbers. You pick three of them and calculate their sum. The problem consists in recovering which three numbers you chose, given only their sum. When you deal with thousands of thousand-digit numbers, this problem appears to be hard even for quantum computers. Lattice problems are just one possible approach to post-quantum cryptography.
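Lyubashevsky's toy puzzle is easy to sketch in code. The following Python snippet is an illustration only (the numbers are ours, and real lattice schemes use vastly larger instances); it brute-forces the tiny version to show what "recover the choice from the sum" means:

```python
from itertools import combinations

def find_subset(public_numbers, target_sum, k=3):
    """Brute-force search: which k of the public numbers sum to target_sum?

    With six small numbers this is trivial, but the search space grows
    combinatorially with the list size and the width of the numbers,
    which is what makes the scaled-up relatives of this problem hard.
    """
    for combo in combinations(public_numbers, k):
        if sum(combo) == target_sum:
            return combo
    return None

public = [14, 27, 33, 48, 52, 61]
secret = (27, 48, 61)       # the three numbers "you" picked
target = sum(secret)        # only this sum is made public: 136

print(find_subset(public, target))  # recovers the hidden triple
```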

As mentioned, IBM provides an implementation of CRYSTALS, which makes it possible to carry out experiments to assess the schemes' performance.

We've noticed that the efficiency of the schemes is such that the end user won't notice any difference. In fact, sometimes the new scheme is even faster. So, the quantum threat is not an existential one for cryptography. We will have security.

According to Lyubashevsky, there is no reason to wait any longer before switching to lattice cryptography using CRYSTALS. The critical point is not to hard-code the scheme you use, but to make it replaceable as a black box. That way, you are prepared for when standardized quantum-safe schemes eventually become available.
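That advice, often called crypto-agility, can be sketched in a few lines of Python. The snippet below is a hypothetical illustration, not a real CRYSTALS integration: the XOR "scheme" is a placeholder, not real cryptography, and the point is only that application code names its scheme in one place so a quantum-safe one can be dropped in later:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Scheme:
    encrypt: Callable[[bytes, bytes], bytes]
    decrypt: Callable[[bytes, bytes], bytes]

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Placeholder transform, NOT secure; stands in for a real cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Registry of interchangeable schemes. Real code would register a
# classical scheme today and, say, a lattice-based one later.
SCHEMES: Dict[str, Scheme] = {
    "toy-classical": Scheme(encrypt=xor_bytes, decrypt=xor_bytes),
    # "lattice-based": Scheme(encrypt=..., decrypt=...),  # added later
}

ACTIVE = "toy-classical"   # chosen by configuration, never inline

def seal(msg: bytes, key: bytes) -> bytes:
    return SCHEMES[ACTIVE].encrypt(msg, key)

def open_(ct: bytes, key: bytes) -> bytes:
    return SCHEMES[ACTIVE].decrypt(ct, key)

ct = seal(b"ledger entry", b"k3y")
assert open_(ct, b"k3y") == b"ledger entry"
```

Swapping schemes then means changing one registry entry and one configuration value, rather than hunting for hard-coded calls throughout the codebase.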

The final front on which IBM is working in cryptography is fully homomorphic encryption (FHE), which promises to enable computation on data while it remains in encrypted form. This does away with the need to decrypt data before processing it, a step that otherwise leaves it vulnerable and exposed.
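A full FHE scheme is far beyond a snippet, but a classical cousin of the idea is easy to demonstrate: unpadded "textbook" RSA is multiplicatively homomorphic, meaning the product of two ciphertexts decrypts to the product of the plaintexts. The toy parameters below are insecure and purely illustrative; fully homomorphic schemes like those IBM works on support arbitrary computations, not just multiplication:

```python
# Tiny textbook-RSA parameters (insecure, for demonstration only).
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent via modular inverse

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 6
# Multiply the ciphertexts only -- the plaintexts are never exposed.
c_product = (enc(a) * enc(b)) % n
assert dec(c_product) == a * b   # decrypts to 42
```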

IBM's FHE work has made great advances since its inception: from the initial implementation in 2011, which was painfully slow, to 2015, when it became possible to compare two fully encrypted genomes with FHE in less than an hour. FHE is today ready to be used by companies of any size, from small to large, says IBM.

Eric Maass, director of strategy and emerging technology at IBM, explains that FHE is made possible by some of the same lattice encryption techniques and mathematics used in CRYSTALS.

Adopting FHE in a widespread manner has historically been complex, and not just in terms of the calculations performed on the data: it also requires a lot of computing power, and the skills and learning curve have typically been very steep.

While confidential computing is a rather mature technology, homomorphic encryption and post-quantum cryptography are research fields that still attract a great deal of effort. IBM is not the only company investing in homomorphic encryption: Microsoft, for instance, released SEAL (Simple Encrypted Arithmetic Library), and Google recently unveiled its Private Join and Compute tool. Similarly, a number of efforts toward quantum-safe computing are ongoing at several other companies, including Google, which selected NewHope, and Microsoft, with PICNIC.


Are quantum computers good at picking stocks? This project tried to find out – ZDNet

The researchers ran a model for portfolio optimization on Canadian company D-Wave's 2,000-qubit quantum annealing processor.

Consultancy firm KPMG, together with a team of researchers from the Technical University of Denmark (DTU) and a yet-to-be-named European bank, has been piloting the use of quantum computing to determine which stocks to buy and sell for maximum return, an age-old banking operation known as portfolio optimization.

The researchers ran a model for portfolio optimization on Canadian company D-Wave's 2,000-qubit quantum annealing processor, comparing the results to those obtained with classical means. They found that the quantum annealer performed better and faster than other methods while being capable of resolving larger problems, although the study also indicated that D-Wave's technology still comes with some issues around ease of programming and scalability.

The smart distribution of portfolio assets is a problem that stands at the very heart of banking. Theorized by economist Harry Markowitz as early as 1952, it consists of allocating a fixed budget to a collection of financial assets in a way that will produce as much return as possible over time. In other words, it is an optimization problem: an investor should look to maximize gain and minimize risk for a given financial portfolio.


As the number of assets in the portfolio multiplies, the difficulty of the calculation increases exponentially, and the problem can quickly become intractable, even for the world's largest supercomputers. Quantum computing, on the other hand, offers the possibility of running multiple calculations at once thanks to superposition, a special quantum state adopted by quantum bits, or qubits.

Quantum systems, for now, cannot support enough qubits to have a real-world impact. But in principle, large-scale quantum computers could one day solve complex portfolio optimization problems in a matter of minutes, which is why the world's largest banks are already putting their research teams to work on developing quantum algorithms.

To translate Markowitz's classical model of the portfolio selection problem into a quantum algorithm, DTU's researchers formulated the equation as a quadratic unconstrained binary optimization (QUBO) problem, based on the usual criteria for the operation, such as budget and expected return.
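As a rough illustration of what that translation looks like (with made-up numbers, not the study's actual model), a binary "hold it or don't" portfolio choice can be packed into a QUBO matrix, where expected returns and a budget constraint land on the diagonal and risk and budget cross-terms land off-diagonal:

```python
import numpy as np

# Hypothetical data for four assets: x_i = 1 means "hold asset i".
# Objective: minimize  -mu.x + g * x.Sigma.x + P * (sum(x) - B)^2
mu = np.array([0.10, 0.07, 0.12, 0.05])      # expected returns
Sigma = np.array([[0.05, 0.01, 0.02, 0.00],  # covariance (risk)
                  [0.01, 0.04, 0.00, 0.01],
                  [0.02, 0.00, 0.06, 0.01],
                  [0.00, 0.01, 0.01, 0.03]])
g, P, B = 1.0, 1.0, 2   # risk weight, budget penalty, budget

n = len(mu)
Q = g * Sigma.copy()
# Since x_i^2 = x_i for binary variables, linear terms go on the
# diagonal; the constant P*B^2 is dropped (it shifts all energies).
Q[np.diag_indices(n)] += -mu + P * (1 - 2 * B)
Q += P * (1 - np.eye(n))          # budget cross-terms, off-diagonal

def energy(x):
    return x @ Q @ x

# Brute force over all 2^n bitstrings -- feasible only for small n,
# which is exactly why the study turns to annealing.
best = min((tuple(map(int, f"{i:0{n}b}")) for i in range(2 ** n)),
           key=lambda x: energy(np.array(x)))
print(best)   # the lowest-energy selection of B assets
```

A quantum annealer (or simulated annealing) then searches for the bitstring minimizing the same energy, instead of enumerating all of them.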

When deciding which quantum hardware to pick to test their model, the team was faced with a number of options: IBM and Google are both working on a superconducting quantum computer, while Honeywell and IonQ are building trapped-ion devices; Xanadu is looking at photonic quantum technologies, and Microsoft is creating a topological quantum system.

D-Wave's quantum annealing processor is yet another approach to quantum computing. Unlike other systems, which are gate-based quantum computers, it is not possible to control the qubits in a quantum annealer; instead, D-Wave's technology consists of manipulating the environment surrounding the system, and letting the device find a "ground state". In this case, the ground state corresponds to the most optimal portfolio selection.

This approach, while limiting the scope of the problems that can be resolved by a quantum annealer, also enables D-Wave to work with many more qubits than other devices. The company's latest device counts 5,000 qubits, while IBM's quantum computer, for example, supports fewer than 100.

The researchers explained that the maturity of D-Wave's technology prompted them to pick quantum annealing to trial the algorithm; and equipped with the processor, they were able to embed and run the problem for up to 65 assets.

To benchmark the performance of the processor, they also ran the Markowitz equation by classical means, using brute force. With the computational resources at their disposal, brute force could only be used for up to 25 assets, after which the problem became intractable for the method.

Comparing the two methods, the scientists found that the quality of the results provided by D-Wave's processor was equal to that delivered by brute force, showing that quantum annealing can reliably be used to solve the problem. In addition, as the number of assets grew, the quantum processor overtook brute force as the fastest method.

From 15 assets onwards, D-Wave's processor effectively started showing significant speed-up over brute force, as the problem got closer to becoming intractable for the classical computer.

To benchmark the performance of the quantum annealer for more than 25 assets, which is beyond the capability of brute force, the researchers compared the results obtained with D-Wave's processor to those obtained with a method called simulated annealing. There again, the study shows, the quantum processor provided high-quality results.
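Simulated annealing itself is a simple classical heuristic. The sketch below is generic (not the study's implementation): flip one bit at a time, always accept improvements, and occasionally accept worse moves with a probability that shrinks as the "temperature" cools:

```python
import math
import random

def simulated_annealing(energy, n, steps=20000, t0=1.0, t1=1e-3, seed=0):
    """Generic simulated annealing over n-bit strings."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = energy(x)
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)   # geometric cooling
        i = rng.randrange(n)
        x[i] ^= 1                              # propose a bit flip
        e_new = energy(x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                          # accept the move
        else:
            x[i] ^= 1                          # reject: flip back
    return x, e

# Tiny stand-in objective: energy is zero when exactly two of four
# bits are set (a crude budget constraint).
energy = lambda x: (sum(x) - 2) ** 2
best, e = simulated_annealing(energy, n=4)
assert e == 0 and sum(best) == 2
```

In the study's setting, the energy function would be the QUBO objective, and the quantum annealer plays the role of this loop in hardware.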

Although the experiment suggests that quantum annealing might show a computational advantage over classical devices, Ulrich Busk Hoff, a researcher at DTU who participated in the research, warns against hasty conclusions.

"For small-sized problems, the D-Wave quantum annealer is indeed competitive, as it offers a speed-up and solutions of high quality," he tells ZDNet. "That said, I believe that the study is premature for making any claims about an actual quantum advantage, and I would refrain from doing that. That would require a more rigorous comparison between D-Wave and classical methods and using the best possible classical computational resources, which was far beyond the scope of the project."

DTU's team also flagged some scalability issues, highlighting that as the portfolio size increased, there was a need to fine-tune the quantum model's parameters in order to prevent a drop in results quality. "As the portfolio size was increased, a degradation in the quality of the solutions found by quantum annealing was indeed observed," says Hoff. "But after optimization, the solutions were still competitive and were more often than not able to beat simulated annealing."


In addition, with the quantum industry still largely in its infancy, the researchers pointed to the technical difficulties that still come with using quantum technologies. Implementing quantum models, they explained, requires a new way of thinking; translating classical problems into quantum algorithms is not straightforward, and even D-Wave's fairly accessible software development kit cannot be described yet as "plug-and-play".

The Canadian company's quantum processor nevertheless shows a lot of promise for solving problems such as portfolio optimization. Although the researchers shared doubts that quantum annealing would have as much of an impact as large-scale gate-based quantum computers, they pledged to continue to explore the capabilities of the technology in other fields.

"I think it's fair to say that D-Wave is a competitive candidate for solving this type of problem and it is certainly worthwhile further investigation," says Hoff.

KPMG, DTU's researchers and large banks are far from alone in experimenting with D-Wave's technology for near-term applications of quantum computing. For example, researchers from pharmaceutical company GlaxoSmithKline (GSK) recently trialed the use of different quantum methods to sequence gene expression, and found that quantum annealing could already compete against classical computers in starting to address life-sized problems.


The key to making AI green is quantum computing – The Next Web

We've painted ourselves into another corner with artificial intelligence. We're finally starting to break through the usefulness barrier, but we're butting up against the limits of our ability to responsibly meet our machines' massive energy requirements.

At the current rate of growth, it appears we'll have to turn Earth into Coruscant if we want to keep spending unfathomable amounts of energy training systems such as GPT-3.

The problem: Simply put, AI takes too much time and energy to train. A layperson might imagine a bunch of code on a laptop screen when they think about AI development, but the truth is that many of the systems we use today were trained on massive GPU networks, supercomputers, or both. We're talking incredible amounts of power. And, worse, it takes a long time to train AI.

The reason AI is so good at the things it's good at, such as image recognition or natural language processing, is that it basically just does the same thing over and over again, making tiny changes each time, until it gets things right. And we're not talking about running a few simulations: it can take hundreds or even thousands of hours to train up a robust AI system.
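That "tiny changes, over and over" loop is, at heart, gradient descent. A minimal sketch with toy data shows the shape of it; real training runs repeat this across billions of parameters, which is where the energy bill comes from:

```python
# Fit y = w * x to toy data by making one tiny correction per pass.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # roughly y = 2x

w = 0.0               # start deliberately wrong
lr = 0.02             # learning rate: how "tiny" each change is
for epoch in range(500):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad    # one tiny correction

print(round(w, 2))    # settles near 2.0
```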

One expert estimated that GPT-3, a natural language processing system created by OpenAI, would cost about $4.6 million to train. But that assumes one-shot training, and very, very few powerful AI systems are trained in one fell swoop. Realistically, the total expenses involved in getting GPT-3 to spit out impressively coherent gibberish are probably in the hundreds of millions of dollars.

GPT-3 is among the high-end offenders, but there are countless AI systems out there sucking up hugely disproportionate amounts of energy compared to standard computational models.

The problem? If AI is the future, then under the current power-sucking paradigm the future won't be green. And that may mean we simply won't have a future.

The solution: Quantum computing.

An international team of researchers, including scientists from the University of Vienna, MIT, and institutions in Austria and New York, recently published research demonstrating quantum speed-up in a hybrid artificial intelligence system.

In other words: they managed to exploit quantum mechanics to allow AI to explore more than one solution at the same time. This, of course, speeds up the training process.

Per the team's paper:

The crucial question for practical applications is how fast agents learn. Although various studies have made use of quantum mechanics to speed up the agent's decision-making process, a reduction in learning time has not yet been demonstrated.

Here we present a reinforcement learning experiment in which the learning process of an agent is sped up by using a quantum communication channel with the environment. We further show that combining this scenario with classical communication enables the evaluation of this improvement and allows optimal control of the learning progress.

How?

This is the cool part. They ran 10,000 models through 165 experiments to determine how they functioned using classical AI and how they functioned when augmented with special quantum chips.

And by special, that is to say: you know how classical CPUs process information via the manipulation of electricity? The quantum chips the team used were nanophotonic, meaning they use light instead of electricity.

The gist of the operation is that in circumstances where classical AI bogs down solving very difficult problems (think: supercomputer problems), they found the hybrid quantum system outperformed standard models.

Interestingly, when presented with less difficult challenges, the researchers didn't observe any performance boost. It seems you need to get into fifth gear before the quantum turbocharger kicks in.

There's still a lot to be done before we can roll out the old mission-accomplished banner. The team's work wasn't the solution we're eventually aiming for, but more of a small-scale model of how it could work once we figure out how to apply these techniques to larger, real problems.

You can read the whole paper here, in Nature.

H/t: Shelly Fan, Singularity Hub

Published March 17, 2021 19:41 UTC
