Archive for the ‘Quantum Computer’ Category

Honeywell Takes Quantum Leap. The Apple of Quantum Computing Is Here. – Barron’s


Honeywell International and Cambridge Quantum Computing are merging their fledgling quantum-computing businesses into a stand-alone company, signaling that quantum computing is just about ready for prime time.

The deal, essentially, combines Honeywell's (ticker: HON) quantum hardware expertise with privately held Cambridge's software and algorithms. It is as if the two had formed the Apple (AAPL) of the quantum-computing world, in that Apple makes hardware, operating systems, and software applications.

"This is an inflection point company that will drive the future of quantum computing," said Tony Uttley, currently the president of Honeywell's quantum business. He will be president of the new company.

Honeywell says quantum computing can be a trillion-dollar-a-year industry someday, just like smartphones, although for now the smartphone market is some 2,000 times bigger. Moving now, before the gap begins to close, could be a win.

"We are at a [industry] phase where people are looking to hear more about practical quantum use cases and investors want to know if this is investible," said Daniel Newman, founder of Futurum, a research and advisory firm focused on digital innovation and market-disrupting technologies.

This deal will speed the process of investor education. The new business is targeting $1 billion in annual revenue in the next two to four years. "We'd be disappointed if we were only at a billion in a few years," said Ilyas Khan, Cambridge's CEO and founder. He will be CEO of the new company, which he said will decide whether to pursue an initial public offering by the end of the year.

A name for the business has yet to be chosen.

The new company plans to have commercial products as soon as late 2021. The initial offerings will be in web security, with products such as "unhackable" passwords. Down the road, there are commercial applications in chemicals and drug development.

In terms of sheer brainpower, the new enterprise is impressive. It will have about 350 employees, including 200 scientists, 120 of them with doctorate degrees.

The company will start off with a cash injection of about $300 million from Honeywell. The industrial giant will own about 54% of the new company for contributing its cash and technology.

Honeywell stock isn't reacting to the news. Quantum computing is still too small to move the needle for a $160 billion conglomerate. Shares were down slightly in early Tuesday trading, similar to moves in the S&P 500 and Dow Jones Industrial Average.

Year to date, Honeywell stock has gained 7%.

Write to Al Root at allen.root@dowjones.com

Continued here:
Honeywell Takes Quantum Leap. The Apple of Quantum Computing Is Here. - Barron's

BBVA and Zapata Computing Release Study Showing the Potential to Speed Up Monte Carlo Calculations for – GlobeNewswire

The research proposes novel circuit designs that significantly reduce the resources needed to gain a quantum advantage in derivative pricing calculations

BOSTON, June 09, 2021 (GLOBE NEWSWIRE) -- Zapata Computing, a leading enterprise software company for quantum-classical applications, today announced the results of a research project conducted with the global bank BBVA. The project's aim was to identify challenges and opportunities for quantum algorithms to speed up Monte Carlo simulations in finance. Monte Carlo simulations are commonly used for credit valuation adjustment (CVA) and derivative pricing. The research proposes novel circuit designs that significantly reduce the resources needed to gain a practical quantum advantage in derivative calculations, taking years off the projected timeline for the day when financial institutions can generate real value from quantum computers.

Fueled by regulatory pressure to minimize systemic financial risk since the global financial crisis of 2008, banks and other financial institutions have been increasingly focused on accounting for credit risk in derivative pricing. In the U.S., similar regulation exists to stress-test financial scenarios for Comprehensive Capital Analysis and Review (CCAR) and Dodd-Frank compliance. Monte Carlo simulation is the standard approach for this type of risk analysis, but the calculations required, which must account for all possible credit default scenarios, are immensely complex and prohibitively time-consuming for classical computers. Zapata and BBVA's research reveals practical ways for quantum algorithms to speed up the Monte Carlo simulation process.
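For readers unfamiliar with the underlying workload, here is a minimal classical sketch: pricing a European call option by Monte Carlo under geometric Brownian motion. All parameters are illustrative; a real CVA run prices whole portfolios across credit default scenarios, which is what makes it so expensive.

```python
import numpy as np

# Minimal classical Monte Carlo pricer for a European call option
# under geometric Brownian motion. All parameters are illustrative.
def mc_call_price(s0=100.0, strike=105.0, rate=0.01, vol=0.2,
                  maturity=1.0, n_paths=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Terminal asset price under risk-neutral GBM dynamics.
    st = s0 * np.exp((rate - 0.5 * vol**2) * maturity
                     + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(st - strike, 0.0)
    # Discounted expected payoff; the standard error shrinks as 1/sqrt(n_paths).
    price = np.exp(-rate * maturity) * payoff.mean()
    stderr = np.exp(-rate * maturity) * payoff.std() / np.sqrt(n_paths)
    return price, stderr

price, err = mc_call_price()
print(f"price = {price:.4f} +/- {err:.4f}")
```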

"Our innovative approach to quantum-accelerated Monte Carlo methods uses a novel form of amplitude estimation, combined with additional improvements that make the quantum circuit much shallower, in some cases hundreds of times shallower than the well-known alternatives in the literature," said Yudong Cao, CTO and founder of Zapata Computing. "This approach reduces the time needed for a quantum computer to complete the CVA calculation by orders of magnitude, and also dramatically reduces the number of qubits needed to gain a quantum advantage over classical methods." Zapata says that in its enterprise customer collaborations it performs in-depth studies of how much quantum computing resource will be required to obtain practical benefit for business operations. This type of research can directly inform the hardware specifications needed for quantum advantage in specific use cases.
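The headline promise of quantum-accelerated Monte Carlo rests on a well-known scaling argument: classical sampling error shrinks as 1/sqrt(N), so hitting precision epsilon takes on the order of 1/epsilon² samples, whereas quantum amplitude estimation needs on the order of 1/epsilon oracle queries. The sketch below only illustrates that generic quadratic gap, with all constants omitted; it is not Zapata's specific algorithm.

```python
# Generic resource scaling behind quantum-accelerated Monte Carlo:
# classical sampling error shrinks as 1/sqrt(N), so N ~ 1/eps^2 samples,
# while quantum amplitude estimation needs ~1/eps oracle queries.
# Constants are omitted; this only illustrates the quadratic gap.
for eps in (1e-2, 1e-3, 1e-4):
    classical_samples = int(1 / eps**2)
    quantum_queries = int(1 / eps)
    print(f"eps={eps:.0e}: classical ~{classical_samples:,} samples, "
          f"quantum ~{quantum_queries:,} queries")
```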

"Improving the performance of these calculations in realistic settings will have a direct impact on the technological resources and costs required for financial risk management," said Andrea Cadarso, BBVA Mexico's Team Lead for Quantitative & Business Solutions. "The implications of this research are not limited to CVA calculations. We intend to extend our approach to other applications in quantitative finance, where Monte Carlo simulations are widely used for everything from policy making and risk assessment to financial product pricing calculations."

The BBVA-Zapata Computing joint publication is the result of one in a series of research initiatives that BBVA Research & Patents launched in 2019. These projects, conducted in partnership with leading institutions and companies including the Spanish National Research Council, Multiverse, Fujitsu and Accenture, explore the potential advantages of applying quantum computing in the financial sector.

Escolástico Sánchez, leader of the Research & Patents discipline at BBVA, emphasized BBVA's intention to continue exploring this cutting-edge technology: "BBVA is fully committed to its work in the quantum area. The bank has assembled a quantum team and is getting professionals from different areas involved in the development of a set of quantum solutions that meet the bank's needs."

About Zapata Computing

Zapata Computing, Inc. builds quantum-ready applications for enterprise deployment using our flagship product Orquestra. Zapata has pioneered a new quantum-classical development and deployment paradigm that focuses on a range of use cases, including ML, optimization and simulation. Orquestra integrates best-in-class quantum and classical technologies, including Zapata's leading-edge algorithms, open-source libraries in Python, and more. Zapata partners closely with hardware providers across the quantum ecosystem such as Amazon, Google, Honeywell, IBM, IonQ, Microsoft and Rigetti. Investors in Zapata include Comcast Ventures, BASF Venture Capital, Honeywell Ventures, Itochu Corporation, Merck Global Health and Robert Bosch Venture Capital.

Media Contact: Anya Nelson, Scratch Marketing + Media for Zapata Computing, anyan@scratchmm.com, 617.817.6559

Originally posted here:
BBVA and Zapata Computing Release Study Showing the Potential to Speed Up Monte Carlo Calculations for - GlobeNewswire

Why Is Quantum Computing So Hard to Explain? – Quanta Magazine

Quantum computers, you might have heard, are magical uber-machines that will soon cure cancer and global warming by trying all possible answers in different parallel universes. For 15 years, on my blog and elsewhere, I've railed against this cartoonish vision, trying to explain what I see as the subtler but ironically even more fascinating truth. I approach this as a public service and almost my moral duty as a quantum computing researcher. Alas, the work feels Sisyphean: The cringeworthy hype about quantum computers has only increased over the years, as corporations and governments have invested billions, and as the technology has progressed to programmable 50-qubit devices that (on certain contrived benchmarks) really can give the world's biggest supercomputers a run for their money. And just as in cryptocurrency, machine learning and other trendy fields, with money have come hucksters.

In reflective moments, though, I get it. The reality is that even if you removed all the bad incentives and the greed, quantum computing would still be hard to explain briefly and honestly without math. As the quantum computing pioneer Richard Feynman once said about the quantum electrodynamics work that won him the Nobel Prize: if it were possible to describe it in a few sentences, it wouldn't have been worth a Nobel Prize.

Not that that's stopped people from trying. Ever since Peter Shor discovered in 1994 that a quantum computer could break most of the encryption that protects transactions on the internet, excitement about the technology has been driven by more than just intellectual curiosity. Indeed, developments in the field typically get covered as business or technology stories rather than as science ones.

That would be fine if a business or technology reporter could truthfully tell readers, "Look, there's all this deep quantum stuff under the hood, but all you need to understand is the bottom line: Physicists are on the verge of building faster computers that will revolutionize everything."

The trouble is that quantum computers will not revolutionize everything.

Yes, they might someday solve a few specific problems in minutes that (we think) would take longer than the age of the universe on classical computers. But there are many other important problems for which most experts think quantum computers will help only modestly, if at all. Also, while Google and others recently made credible claims that they had achieved contrived quantum speedups, this was only for specific, esoteric benchmarks (ones that I helped develop). A quantum computer thats big and reliable enough to outperform classical computers at practical applications like breaking cryptographic codes and simulating chemistry is likely still a long way off.

But how could a programmable computer be faster for only some problems? Do we know which ones? And what does a "big and reliable" quantum computer even mean in this context? To answer these questions we have to get into the deep stuff.

Let's start with quantum mechanics. (What could be deeper?) The concept of superposition is infamously hard to render in everyday words. So, not surprisingly, many writers opt for an easy way out: They say that superposition means "both at once," so that a quantum bit, or qubit, is just a bit that can be "both 0 and 1 at the same time," while a classical bit can be only one or the other. They go on to say that a quantum computer would achieve its speed by using qubits to try all possible solutions in superposition, that is, at the same time, or in parallel.

This is what I've come to think of as the fundamental misstep of quantum computing popularization, the one that leads to all the rest. From here it's just a short hop to quantum computers quickly solving something like the traveling salesperson problem by trying all possible answers at once, something almost all experts believe they won't be able to do.

The thing is, for a computer to be useful, at some point you need to look at it and read an output. But if you look at an equal superposition of all possible answers, the rules of quantum mechanics say you'll just see and read a random answer. And if that's all you wanted, you could've picked one yourself.

What superposition really means is "complex linear combination." Here, we mean "complex" not in the sense of "complicated" but in the sense of a real plus an imaginary number, while "linear combination" means we add together different multiples of states. So a qubit is a bit that has a complex number called an amplitude attached to the possibility that it's 0, and a different amplitude attached to the possibility that it's 1. These amplitudes are closely related to probabilities, in that the further some outcome's amplitude is from zero, the larger the chance of seeing that outcome; more precisely, the probability equals the distance squared.
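A few lines of code make the amplitude-to-probability rule concrete. This is a minimal sketch with illustrative amplitude values; the only point it demonstrates is that probabilities come from squared magnitudes and must sum to 1.

```python
import numpy as np

# A single qubit state: complex amplitudes attached to outcomes 0 and 1.
# Example amplitudes (any valid state satisfies |a0|^2 + |a1|^2 = 1).
a0 = (1 + 1j) / 2          # amplitude for outcome 0
a1 = 1 / np.sqrt(2)        # amplitude for outcome 1

# The probability of each outcome is the squared distance (modulus)
# of its amplitude from zero.
p0 = abs(a0) ** 2
p1 = abs(a1) ** 2
print(p0, p1, p0 + p1)     # 0.5 0.5 1.0
```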

But amplitudes are not probabilities. They follow different rules. For example, if some contributions to an amplitude are positive and others are negative, then the contributions can interfere destructively and cancel each other out, so that the amplitude is zero and the corresponding outcome is never observed; likewise, they can interfere constructively and increase the likelihood of a given outcome. The goal in devising an algorithm for a quantum computer is to choreograph a pattern of constructive and destructive interference so that for each wrong answer the contributions to its amplitude cancel each other out, whereas for the right answer the contributions reinforce each other. If, and only if, you can arrange that, you'll see the right answer with a large probability when you look. The tricky part is to do this without knowing the answer in advance, and faster than you could do it with a classical computer.
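The smallest worked example of such choreography is applying the Hadamard gate twice, which the sketch below simulates with plain linear algebra: the two computational paths into outcome 1 carry amplitudes +1/2 and -1/2 and cancel exactly, while the paths into outcome 0 reinforce, so the measurement outcome becomes certain.

```python
import numpy as np

# Interference in the simplest possible circuit: two Hadamard gates.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = np.array([1, 0])   # start in |0>: amplitude 1 for 0, 0 for 1

once = H @ state           # equal superposition: amplitudes (1/sqrt(2), 1/sqrt(2))
twice = H @ once           # back to |0>: the two paths into outcome 1 carry
                           # amplitudes +1/2 and -1/2 and cancel (destructive),
                           # while the paths into outcome 0 reinforce.
print(once)                # ~ [0.707 0.707]
print(twice)               # ~ [1. 0.] (up to floating-point rounding)
```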

Twenty-seven years ago, Shor showed how to do all this for the problem of factoring integers, which breaks the widely used cryptographic codes underlying much of online commerce. We now know how to do it for some other problems, too, but only by exploiting the special mathematical structures in those problems. It's not just a matter of trying all possible answers at once.

Compounding the difficulty is that, if you want to talk honestly about quantum computing, then you also need the conceptual vocabulary of theoretical computer science. I'm often asked how many times faster a quantum computer will be than today's computers. A million times? A billion?

This question misses the point of quantum computers, which is to achieve better "scaling behavior," or running time as a function of n, the number of bits of input data. This could mean taking a problem where the best classical algorithm needs a number of steps that grows exponentially with n, and solving it using a number of steps that grows only as n². In such cases, for small n, solving the problem with a quantum computer will actually be slower and more expensive than solving it classically. It's only as n grows that the quantum speedup first appears and then eventually comes to dominate.
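To see why "how many times faster" is the wrong question, compare the two growth curves directly. In the sketch below, the 1,000x per-step penalty for the quantum machine is an arbitrary assumption; whatever constant you pick, the exponential curve eventually dwarfs the polynomial one.

```python
# Toy illustration of scaling crossover: a classical algorithm taking 2^n
# steps versus a quantum algorithm taking n^2 steps, where each quantum
# step is assumed (arbitrarily) to cost 1,000x more than a classical one.
QUANTUM_OVERHEAD = 1_000   # hypothetical per-step cost penalty

for n in (10, 20, 30, 40, 50, 60):
    classical = 2 ** n
    quantum = QUANTUM_OVERHEAD * n ** 2
    faster = "quantum" if quantum < classical else "classical"
    print(f"n={n:2d}: classical {classical:.2e} vs quantum {quantum:.2e} -> {faster}")
```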

But how can we know that there's no classical shortcut, a conventional algorithm with scaling behavior similar to the quantum algorithm's? Though typically ignored in popular accounts, this question is central to quantum algorithms research, where often the difficulty is not so much proving that a quantum computer can do something quickly, but convincingly arguing that a classical computer can't. Alas, it turns out to be staggeringly hard to prove that problems are hard, as illustrated by the famous P versus NP problem (which asks, roughly, whether every problem with quickly checkable solutions can also be quickly solved). This is not just an academic issue, a matter of dotting i's: Over the past few decades, conjectured quantum speedups have repeatedly gone away when classical algorithms were found with similar performance.

Note that, after explaining all this, I still haven't said a word about the practical difficulty of building quantum computers. The problem, in a word, is decoherence, which means unwanted interaction between a quantum computer and its environment: nearby electric fields, warm objects, and other things that can record information about the qubits. This can result in premature measurement of the qubits, which collapses them down to classical bits that are either definitely 0 or definitely 1. The only known solution to this problem is quantum error correction: a scheme, proposed in the mid-1990s, that cleverly encodes each qubit of the quantum computation into the collective state of dozens or even thousands of physical qubits. But researchers are only now starting to make such error correction work in the real world, and actually putting it to use will take much longer. When you read about the latest experiment with 50 or 60 physical qubits, it's important to understand that the qubits aren't error-corrected. Until they are, we don't expect to be able to scale beyond a few hundred qubits.
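The redundancy idea can be caricatured classically with a three-bit repetition code and a majority vote, as in the sketch below. The analogy is loose: real quantum error correction must also suppress phase errors and cannot simply copy a state (the no-cloning theorem), which is part of why so many physical qubits are needed per logical qubit.

```python
import random

# Classical caricature of the redundancy behind quantum error correction:
# encode one logical bit into three physical bits, decode by majority vote.
def encode(bit):
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob=0.05):
    # Each physical bit independently flips with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)   # majority vote

random.seed(1)
trials = 100_000
errors = sum(decode(noisy_channel(encode(0))) != 0 for _ in range(trials))
# With p = 0.05, the logical error rate is 3p^2 - 2p^3, roughly 0.7%,
# far below the 5% physical error rate.
print(f"logical error rate: {errors / trials:.4f}")
```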

Once someone understands these concepts, I'd say they're ready to start reading, or possibly even writing, an article on the latest claimed advance in quantum computing. They'll know which questions to ask in the constant struggle to distinguish reality from hype. Understanding this stuff really is possible; after all, it isn't rocket science, it's just quantum computing!

See the article here:
Why Is Quantum Computing So Hard to Explain? - Quanta Magazine

With cyberattacks on the rise, organizations are already bracing for devastating quantum hacks – CNBC

Photo caption: Amidst the houses and the car parks sits GCHQ, the Government Communications Headquarters, in this aerial photo taken on October 10, 2005. (David Goddard | Getty Images)

LONDON – A little-known U.K. company called Arqit is quietly preparing businesses and governments for what it sees as the next big threat to their cyber defenses: quantum computers.

It's still an incredibly young field of research; however, some in the tech industry, including the likes of Google, Microsoft and IBM, believe quantum computing will become a reality in the next decade. And that could be worrying news for organizations' cyber security.

David Williams, co-founder and chairman of Arqit, says quantum computers will be several million times faster than classical computers, and would be able to break into one of the most widely used methods of cryptography.

"The legacy encryption that we all use to keep our secrets safe is called PKI," or public-key infrastructure, Williams told CNBC in an interview. "It was invented in the 70s."

"PKI was originally designed to secure the communications of two computers," Williams added. "It wasn't designed for a hyper-connected world where there are a billion devices all over the world communicating in a complex round of interactions."

Arqit, which is planning to go public via a merger with a blank-check company, counts the likes of BT, Sumitomo Corporation, the British government and the European Space Agency as customers. Some of its team previously worked for GCHQ, the U.K. intelligence agency. The firm only recently came out of "stealth mode," a temporary state of secrecy, and its stock market listing couldn't be more timely.

The past month has seen a spate of devastating ransomware attacks on organizations from Colonial Pipeline, the largest fuel pipeline in the U.S., to JBS, the world's largest meatpacker.

Microsoft and several U.S. government agencies, meanwhile, were among those affected by an attack on IT firm SolarWinds. President Joe Biden recently signed an executive order aimed at ramping up U.S. cyber defenses.

Quantum computing aims to apply the principles of quantum physics, a body of science that seeks to describe the world at the level of atoms and subatomic particles, to computers.

Whereas today's computers use ones and zeroes to store information, a quantum computer relies on quantum bits, or qubits, which can consist of a combination of ones and zeroes simultaneously, something that's known in the field as superposition. These qubits can also be linked together through a phenomenon called entanglement.

Put simply, it means quantum computers are far more powerful than today's machines and are able to solve complex calculations much faster.

Kasper Rasmussen, associate professor of computer science at the University of Oxford, told CNBC that quantum computers are designed to do "certain very specific operations much faster than classical computers."

That is not to say they'll be able to solve every task. "This is not a case of: 'This is a quantum computer, so it just runs whatever application you put on there much faster.' That's not the idea," Rasmussen said.

This could be a problem for modern encryption standards, according to experts.

"When you and I use PKI encryption, we do halves of a difficult math problem: prime factorisation," Williams told CNBC. "You give me a number and I work out what are the prime numbers to work out the new number. A classic computer can't break that but a quantum computer will."

Williams believes his company has found the solution. Instead of relying on public-key cryptography, Arqit sends out symmetric encryption keys (long, random numbers) via satellites, something it calls "quantum key distribution." Virgin Orbit, which invested in Arqit as part of its SPAC deal, plans to launch the satellites from Cornwall, England, by 2023.
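For readers unfamiliar with the distinction, symmetric encryption is the primitive such key-distribution schemes aim to provision: both parties hold the same secret key, so no hard math problem stands between an eavesdropper and the plaintext, only the key itself. The sketch below is a generic one-time pad in Python, not Arqit's proprietary protocol, and a real pad must never be reused.

```python
import secrets

# Generic symmetric encryption with a shared random key: a one-time pad.
# Illustrative only; this is NOT Arqit's protocol, and reusing a pad
# completely breaks its security.
def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

message = b"wire transfer approved"
key = secrets.token_bytes(len(message))   # shared out of band

ciphertext = xor_bytes(message, key)
plaintext = xor_bytes(ciphertext, key)    # the same key decrypts
assert plaintext == message
print(ciphertext.hex())
```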

Some experts say it will take some time before quantum computers finally arrive in a way that could pose a threat to existing cyber defenses. Rasmussen doesn't expect them to exist in any meaningful way for at least another 10 years. But he's not complacent.

"If we accept the fact that quantum computers will exist in 10 years, anyone with the foresight to record important conversations now might be in a position to decrypt them when quantum computers come about," Rasmussen said.

"Public-key cryptography is literally everywhere in our digitized world, from your bank card, to the way you connect to the internet, to your car key, to IOT (internet of things) devices," Ali Kaafarani, CEO and founder of cybersecurity start-up PQShield, told CNBC.

The U.S. Commerce Department's National Institute of Standards and Technology is looking to update its standards on cryptography to include what's known as post-quantum cryptography, algorithms that could be secure against an attack from a quantum computer.

Kaafarani expects NIST will decide on new standards by the end of 2021. But, he warns: "For me, the challenge is not the quantum threat and how can we build encryption methods that are secure. We solved that."

"The challenge now is how businesses need to prepare for the transition to the new standards," Kaafarani said. "Lessons from the past prove that it's too slow and takes years and decades to switch from one algorithm to another."

Williams thinks firms need to be ready now, adding that post-quantum algorithms that take public-key cryptography and make it "even more complex" are not the solution. He alluded to a report from NIST which noted challenges with post-quantum cryptographic solutions.

Read more here:
With cyberattacks on the rise, organizations are already bracing for devastating quantum hacks - CNBC

The ‘second quantum revolution’ is almost here. We need to make sure it benefits the many, not the few – The Conversation AU

Over the past six years, quantum science has noticeably shifted, from the domain of physicists concerned with learning about the universe on extremely small scales, to a source of new technologies we all might use for practical purposes. These technologies make use of quantum properties of single atoms or particles of light. They include sensors, communication networks, and computers.

Quantum technologies are expected to impact many aspects of our society, including health care, financial services, defence, weather modelling, and cyber security. Clearly, they promise exciting benefits. Yet the history of technology development shows we cannot simply assume new tools and systems will automatically be in the public interest.

We must look ahead to what a quantum society might entail and how the quantum design choices made today might impact how we live in the near future. The deployment of artificial intelligence and machine learning over the past few years provides a compelling example of why this is necessary.

Let's consider an example. Quantum computers are perhaps the best-known quantum technology, with companies like Google and IBM competing to achieve quantum computation. The advantage of quantum computers lies in their ability to tackle incredibly complex tasks that would take a normal computer millions of years. One such task is simulating molecules' behaviour to improve predictions about the properties of prospective new drugs and accelerate their development.

One conundrum posed by quantum computing is the sheer expense of investing in the physical infrastructure of the technology. This means ownership will likely be concentrated among the wealthiest countries and corporations. In turn, this could worsen uneven power distribution enabled by technology.

Other considerations for this particular type of quantum technology include concerns about reduced online privacy.

How do we stop ourselves blundering into a quantum age without due forethought? How do we tackle the societal problems posed by quantum technologies, while nations and companies race to develop them?

Last year, CSIRO released a roadmap that included a call for quantum stakeholders to explore and address social risks. An example of how we might proceed with this has begun at the World Economic Forum (WEF). The WEF is convening experts from industry, policy-making, and research to promote safe and secure quantum technologies by establishing an agreed set of ethical principles for quantum computing.

Australia should draw on such initiatives to ensure the quantum technologies we develop work for the public good. We need to diversify the people involved in quantum technologies, in terms of the types of expertise employed and the social contexts we work from, so we don't reproduce and amplify existing problems or create new ones.

Read more: Scientists want to build trust in science and technology. The alternative is too risky to contemplate

While we work to shape the impacts of individual quantum technologies, we should also review the language used to describe this "second quantum revolution".

The rationale most commonly used to advocate for the field narrowly imagines the public benefit of quantum technologies in terms of economic gain and competition between nations and corporations. But framing this as a "race" to develop quantum technologies means prioritising urgency, commercial interests and national security at the expense of more civic-minded concerns.

It's still early enough to do something about the challenges posed by quantum technologies. It's also not all doom and gloom, with a variety of initiatives and national research and development policies setting out to tackle these problems before they are set in stone.

We need discussions involving a cross-section of society on the potential impacts of quantum technologies on society. This process should clarify societal expectations for the emerging quantum technology sector and inform any national quantum initiative in Australia.

Read more: Why are scientists so excited about a recently claimed quantum computing milestone?

Continue reading here:
The 'second quantum revolution' is almost here. We need to make sure it benefits the many, not the few - The Conversation AU