Archive for the ‘Quantum Computing’ Category

Quantum material research connecting physicists in Hong Kong, Beijing and Shanghai facilitates discovery of better materials that benefit our society…

A joint research team from the University of Hong Kong (HKU), the Institute of Physics at the Chinese Academy of Sciences, Songshan Lake Materials Laboratory, Beihang University in Beijing and Fudan University in Shanghai has provided a successful example of modern-era quantum material research. By means of state-of-the-art quantum many-body simulations, performed on some of the world's fastest supercomputers (Tianhe-I and the Tianhe-III prototype at the National Supercomputer Center in Tianjin, and Tianhe-II at the National Supercomputer Center in Guangzhou), they achieved accurate model calculations for the rare-earth magnet TmMgGaO4 (TMGO). They found that the material, in the right temperature regime, could realise the long-sought-after two-dimensional topological Kosterlitz-Thouless (KT) phase, concluding the half-century pursuit of KT physics in quantum magnetic materials. The research work has been published in Nature Communications.

Quantum materials are becoming a cornerstone of the continued prosperity of human society. From next-generation AI computing chips that go beyond Moore's law, to high-speed Maglev trains and topological units for quantum computers, investigations along all these directions belong to the arena of quantum material research. (Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years; our PCs and smartphones are all built on its success. Nevertheless, as transistors shrink to the nanometre scale, the behaviour of their electrons becomes subject to quantum mechanics, and Moore's law is expected to break down very soon.)

However, such research is by no means easy. The difficulty lies in the fact that scientists have to solve for millions upon millions of electrons in a material in a quantum-mechanical way (hence quantum materials are also called quantum many-body systems). This is far beyond the reach of paper-and-pencil calculation and instead requires modern quantum many-body computational techniques and advanced analysis. Thanks to the fast development of supercomputing platforms all over the world, scientists and engineers are now making great use of these computation facilities and advanced mathematical tools to discover better materials to benefit our society.

The research is inspired by the KT phase theory advanced by J Michael Kosterlitz, David J Thouless and F Duncan M Haldane, laureates of the Nobel Prize in Physics 2016, who were recognised for their theoretical discoveries of topological phases and phase transitions of matter. Topology offers a new way of classifying and predicting the properties of materials in condensed matter physics, and is now becoming the mainstream of quantum material research and industry, with broad potential applications in quantum computing, lossless transmission of signals for information technology, and so on. Back in the 1970s, Kosterlitz and Thouless predicted the existence of a topological phase, hence named the KT phase after them, in quantum magnetic materials. However, although such phenomena had been found in superfluids and superconductors, the KT phase had yet to be realised in a bulk magnetic material.

The joint team is led by Dr Zi Yang Meng from HKU, Dr Wei Li from Beihang University and Professor Yang Qi from Fudan University. Their joint effort has revealed the comprehensive properties of the material TMGO. For example, as shown in Figure 2, using self-adjustable tensor network calculations they computed the properties of the model system at different temperatures and magnetic fields, and by comparing with the corresponding experimental results for the material they identified the correct microscopic model parameters. With the correct microscopic model in hand, they then performed quantum Monte Carlo simulations and obtained the neutron scattering magnetic spectra at different temperatures (neutron scattering is the established detection method for material structure and magnetic properties; the closest such facility to Hong Kong is the China Spallation Neutron Source in Dongguan, Guangdong). As shown in Figure 3, the magnetic spectrum, with its unique signature at the M point, is the dynamical fingerprint of the topological KT phase proposed more than half a century ago.

"This research work provides the missing piece of topological KT phenomena in bulk magnetic materials, and completes the half-century pursuit that eventually led to the Nobel Prize in Physics in 2016. Since the topological phase of matter is the main theme of condensed matter and quantum material research nowadays, it is expected that this work will inspire many follow-up theoretical and experimental studies, and in fact promising results for further identification of topological properties in quantum magnets have already been obtained by the joint team and our collaborators," said Dr Meng.

Dr Meng added: "The joint team research across Hong Kong, Beijing and Shanghai also sets up a protocol for modern quantum material research; such a protocol will certainly lead to more profound and impactful discoveries in quantum materials. The computation power of our smartphones nowadays exceeds that of the supercomputers of 20 years ago. One can optimistically foresee that, with the correct quantum materials as building blocks, personal devices in 20 years' time could be more powerful than the fastest supercomputers of today, with the minimal energy cost of an everyday battery."

Read more here:
Quantum material research connecting physicists in Hong Kong, Beijing and Shanghai facilitates discovery of better materials that benefit our society...

Archer touts performing early-stage validation of quantum computing chip – ZDNet

Archer staff operating the specialised conductive atomic force microscopy instrumentation required to perform the measurements.

Archer Materials has announced a milestone in its race to build a room-temperature quantum bit (qubit) processor for quantum computing, revealing it has successfully performed its first measurement on a single qubit component.

"We have successfully performed our first measurement on a single qubit component, which is the most important component, marking a significant period moving forward in the development of Archer's 12CQ quantum computing chip technology," CEO Dr Mohammad Choucair said.

"Building and operating the 12CQ chip requires measurements to be successfully performed at the very limits of what can be achieved technologically in the world today."

See also: Australia's ambitious plan to win the quantum race

Choucair said directly proving room-temperature conductivity of the 12CQ chip qubit component advances Archer's development towards a working chip prototype.

Archer said conductivity measurements on single qubit components were carried out using conductive atomic force microscopy that was configured using "state-of-the-art instrumentation systems", housed in a semiconductor prototype foundry cleanroom.

"The measurements directly and unambiguously proved, with nanometre-scale precision, the conductivity of single qubits at room-temperature in ambient environmental conditions (e.g. in the presence of air, moisture, and at normal atmospheric pressures," Archer said in a statement.

It said the measurements progress its technological development towards controlling the quantum information that resides on individual qubits, which is a key componentry requirement for a working quantum computing qubit processor.

Another key component is readout.

"Control must be performed prior to readout, as these subsequent steps represent a logical series in the 12CQ quantum computing chip function," Archer wrote.

See also: What is quantum computing? Understanding the how, why and when of quantum computers

In announcing last week it was progressing work on its graphene-based biosensor technology, Archer said it was focusing on establishing commercial partnerships to bring its work out of the lab and convert it into viable products.

Archer on Monday said it intends to develop the 12CQ chip to be sold directly and to have the intellectual property rights to the chip technology licensed.

"The technological significance of the work is inherently tied to the commercial viability of the 12CQ technology. The room-temperature conductivity potentially enables direct access to the quantum information stored in the qubits by means of electrical current signals on-board portable devices, which require conducting materials to operate, for both control and readout," Choucair added.

He said the intrinsic materials feature of conductivity in Archer's qubit material down to the single qubit level represents a "significant commercial advantage" over competing qubit proposals that rely on insulating materials, such as diamond-based materials or photonic qubit architectures.

More here:
Archer touts performing early-stage validation of quantum computing chip - ZDNet

The technical realities of functional quantum computers – is Google's ten-year plan for Quantum Computing viable? – Diginomica

In March, I explored the enterprise readiness of quantum computing in "Quantum computing is right around the corner, but cooling is a problem. What are the options?" I also detailed potential industry use cases, from supply chain to banking and finance. But what are the industry giants pursuing?

Recently, I listened to two somewhat different perspectives on quantum computing. One is Google's (public) ten-year plan.

Google plans to search for commercially viable applications in the short term, but it doesn't think there will be many for another ten years - a time frame I've heard referred to as "bound but loose": no more than ten years, maybe sooner. In the industry, the term for the current state of the art is NISQ - Noisy Intermediate-Scale Quantum computing.

The largest quantum computers are in the 50-70 qubit range, and Google feels NISQ has a ceiling of maybe two hundred. The "noisy" part of NISQ comes from the fact that the qubits need to interact and be near one another, which generates noise: the more qubits, the more noise, and the more challenging that noise is to control.

But Google suggests the real unsolved problems in fields like optimization, materials science, chemistry, drug discovery, finance, and electronics will take machines with thousands of qubits, and it even envisions one million on a planar array etched in aluminum. Major problems still need solving, such as noise elimination, coherence, and qubit lifetime (a qubit holds its state for only a tiny slice of time).

In the meantime, Google is seeking customers to work alongside its researchers to find applications. Quantum computing needs algorithms as much as it needs qubits. Google requires customers with a strong in-house science team and a commitment of three years; whatever is discovered will be published as open source.

In summary, Google does not see commercial value in NISQ. They are using NISQ to discover what quantum computing can do that has any commercial capability.

First of all, if you have a picture in your mind of a quantum computer, chances are you are not including an essential element: a conventional computer. According to Quantum Computing: Progress and Prospects:

Although reports in the popular press tend to focus on the development of qubits and the number of qubits in the current prototypical quantum computing chip, any quantum computer requires an integrated hardware approach using significant conventional hardware to enable qubits to be controlled, programmed, and read out.

The author is undoubtedly correct. Most material about quantum computers never mentions this, and it raises quite a few issues that can potentially dilute the gee-whiz aspect. I'd heard this first from Itamar Sivan, Ph.D., CEO of Quantum Machines. He followed with the quip that technically, quantum computers aren't computers. It's that simple. They are not Turing machines. File this under the category of "You're Not Too Old to Learn Something New."

From (Hindi) Theory of Computation - Turing Machine:

A Turing machine is a mathematical model of computation that defines an abstract machine, which manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, given any computer algorithm, a Turing machine capable of simulating that algorithm's logic can be constructed.

Dr. Sivan clarified this as follows:

Any computer to ever be used, from the early-days computers to massive HPCs, are all Turing machines, and are therefore equivalent to one another. All computers developed and manufactured in the last decades are all merely bigger and more compact variations of one another. A quantum computer however is not MERELY a more advanced Turing machine, it is a different type of machine, and classical Turing machines are not equivalent to quantum computers as they are equivalent to one another.

Therefore, the complexity of running particular algorithms on quantum computers is different from the complexity of running them on classical machines. Just to make it clear, a quantum computer can be degenerated to behave like a classical computer, but NOT vice-versa.

There is a lot more to this concept, but most computers you've ever seen or heard of are Turing Machines, except Quantum computers. This should come as no surprise because anything about quantum mechanics is weird and counter-intuitive, so why would a quantum computer be any different?
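To make the abstraction concrete, here is a minimal sketch of a Turing machine simulator in Python. The machine, its transition table, and the tape are toy examples invented for illustration; a real machine differs only in having a much bigger rule table.

    # A minimal Turing machine: a finite-state head reads and writes
    # symbols on a tape according to a fixed transition table. This toy
    # machine flips every bit on the tape and halts at the first blank.

    def run_turing_machine(tape, rules, state="start", blank="_"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape).rstrip(blank)

    # Transition table: (state, read) -> (next state, write, head move)
    flip_rules = {
        ("start", "0"): ("start", "1", "R"),  # flip 0 to 1, move right
        ("start", "1"): ("start", "0", "R"),  # flip 0 to 1, move right
        ("start", "_"): ("halt", "_", "R"),   # blank cell: stop
    }

    print(run_turing_machine("10110", flip_rules))  # prints 01001

Every classical computer, however large, reduces to this model; Sivan's point is that a quantum computer does not.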

According to Sivan, a quantum computer needs three elements to perform: the quantum processor itself, plus an orchestration platform of (conventional) hardware and software. There is no software in a quantum computer. The platform manages the progress of the algorithm, mostly through laser-beam pulses. The logic needed to operate the quantum computer resides with, and is controlled by, the orchestration platform.

The crucial difference between Google's and Quantum Machines' strategies is that Google views the current NISQ state of affairs as a testbed for finding algorithms and applications for future development, while Sivan and his company have produced an orchestration platform to put the current technology into play. Their platform is quantum-computer agnostic: it can operate with any of them. Sivan feels that the number of qubits is just part of the equation. According to Dr. Sivan:

While today's most advanced quantum computers only have a relatively small number of available qubits (53 for IBM's latest generation and 54 for Google's Sycamore processor), we cannot maximize the potential of even this relatively small count. We are leaving a lot on the table with regards to what we can already accomplish with the computing power we already have. While we should continue to scale up the number of qubits, we also need to focus on maximizing what we already have.

I've asked a few quantum computer scientists whether quantum computers can solve the Halting Problem. From Wikipedia:

The halting problem is determining, from a description of an arbitrary computer program and an input, whether the program will finish running or continue to run forever. Alan Turing proved in 1936 that a general algorithm to solve the halting problem for all possible program-input pairs could not exist.

That puts it in a class of problems that are undecidable. Oddly, opinion was split on the question, despite Turing's proof. As Simplicio said to Galileo in Dialogues Concerning Two New Sciences, "If Aristotle had not said otherwise I would have believed it."
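Turing's proof is short enough to sketch in code. The following is a minimal illustration of the diagonal argument, built around a hypothetical halts() oracle that cannot actually be written:

    # Suppose halts(program, argument) could always decide whether
    # program(argument) eventually stops. The function below then
    # halts if and only if it does not halt -- a contradiction, so no
    # such oracle can exist.

    def halts(program, argument):    # hypothetical oracle (impossible)
        ...

    def paradox(program):
        if halts(program, program):  # if the oracle says "it halts"...
            while True:              # ...loop forever instead,
                pass
        return                       # otherwise halt immediately.

    # Consider paradox(paradox): it halts exactly when halts() says it
    # doesn't. Quantum computers change running times, not this logic.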

There are so many undecidable problems in math that I wondered if some of these might fall out. For example, straight from current AI problems, planning in a partially observable Markov decision process is considered undecidable. A million qubits? Maybe not. After all, Dr. Sivan pointed out that to replicate in a classical processor the information in just a 300-qubit quantum processor would require more transistors than there are atoms in the universe.
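The arithmetic behind that last claim is easy to check; the figure of roughly 10^80 atoms in the observable universe is the commonly quoted estimate:

    # An n-qubit state is described classically by 2**n complex
    # amplitudes; compare 2**300 with ~10**80 atoms in the universe.
    amplitudes = 2 ** 300
    atoms_in_universe = 10 ** 80

    print(len(str(amplitudes)) - 1)        # 90, i.e. 2^300 ~ 10^90
    print(amplitudes > atoms_in_universe)  # True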

I've always believed that actions speak louder than words. While Google is taking the long view, Quantum Machines provides the platform to see how far we can go with current technology. Google's tactics are familiar: every time you use TensorFlow, it gets better; every time you play with their autonomous car, it gets better. Their collaboration with a dozen or so technically advanced companies makes their quantum technology better.

Originally posted here:
The technical realities of functional quantum computers - is Google's ten-year plan for Quantum Computing viable? - Diginomica

European quantum computing startup takes its funding to €32M with fresh raise – TechCrunch

IQM Finland Oy (IQM), a European startup which makes hardware for quantum computers, has raised a €15M equity investment round from the EIC Accelerator program for the development of quantum computers. This is in addition to a raise of €3.3M from the Business Finland government agency, and takes the company's funding to over €32M. The company previously raised an €11.4M seed round.

IQM has hired a lot of engineers in its short life, and now says it plans to hire one quantum engineer per week on the pathway to commercializing its technology through the collaborative design of quantum-computing hardware and applications.

"Quantum computers will be funded by European governments, supporting IQM's expansion strategy to build quantum computers in Germany," Dr. Jan Goetz, CEO and co-founder of IQM, said in a statement.

The news comes as the Finnish government announced only last week that it would acquire a quantum computer, at a cost of €20.7M, for the Finnish state research centre VTT.

"It has been a mind-blowing forty-million-euro week for quantum computers in Finland. IQM staff is excited to work together with VTT, Aalto University, and CSC in this ecosystem," rejoices Prof. Mikko Möttönen, Chief Scientist and co-founder of IQM.

Previously, the German government said it would put €2bn into commissioning at least two quantum computers.

IQM thus now plans to expand its operations in Germany via its team in Munich.

IQM will build co-design quantum computers for commercial applications and install testing facilities for quantum processors, said Prof. Enrique Solano, CEO of IQM Germany.

The company is focusing on superconducting quantum processors, which are streamlined for commercial applications in a co-design approach. This works by providing the full hardware stack for a quantum computer, integrating different technologies, and then inviting collaborations with quantum software companies.

IQM was one of 72 companies to succeed in the EIC's selection process; altogether, 3,969 companies applied for this funding.

See the original post:
European quantum computing startup takes its funding to €32M with fresh raise - TechCrunch

Quantum Computing And The End Of Encryption – Hackaday

Quantum computers stand a good chance of changing the face of computing, and that goes double for encryption. For encryption methods that rely on the fact that brute-forcing the key takes too long with classical computers, quantum computing seems like its logical nemesis.

For instance, the mathematical problem that lies at the heart of RSA and other public-key encryption schemes is factoring a product of two prime numbers. Searching for the right pair using classical methods takes approximately forever, but Shor's algorithm can be used on a suitable quantum computer to do the required factorization of integers in almost no time.
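The quantum speed-up enters in only one place: finding the period of a^x mod N. Everything around it is classical number theory, as this sketch shows; here the period is found by brute force, which stands in for the quantum step (and is exactly the part that takes exponential time classically):

    from math import gcd
    from random import randrange

    def find_period(a, n):
        # Smallest r > 0 with a**r % n == 1 (the quantum subroutine).
        r, value = 1, a % n
        while value != 1:
            value = (value * a) % n
            r += 1
        return r

    def shor_factor(n):
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:                 # lucky guess shares a factor
                return d
            r = find_period(a, n)
            if r % 2 == 0:
                x = pow(a, r // 2, n)
                if x != n - 1:        # avoid the trivial square root
                    f = gcd(x - 1, n)
                    if 1 < f < n:
                        return f

    print(shor_factor(15))            # 3 or 5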

When quantum computers become capable enough, the threat to a lot of our encrypted communication is a real one. If one can no longer rely on simply making the brute-forcing of a decryption computationally heavy, all of today's public-key encryption algorithms are essentially useless. This is the doomsday scenario, but how close are we to it actually happening, and what can be done?

To ascertain the real threat, one has to look at the classical encryption algorithms in use today to see which parts of them would be susceptible to a quantum algorithm that runs in significantly less time than any classical approach. In particular, we should make the distinction between symmetric and asymmetric encryption.

Symmetric algorithms encode and decode with the same secret key, which has to be shared between communication partners through a secure channel beforehand. Asymmetric encryption uses two keys: a private key for decryption and a public key for encryption. A message encrypted with the public key can only be decrypted with the private key. This enables public-key cryptography: the public key can be shared freely without fear of impersonation because it can only be used to encrypt, not decrypt.
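Both flavours take only a few lines with the third-party Python cryptography package; the package choice and the messages are just for illustration:

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Symmetric: one shared secret both encrypts and decrypts.
    secret = Fernet.generate_key()       # must reach both parties safely
    token = Fernet(secret).encrypt(b"meet at noon")
    assert Fernet(secret).decrypt(token) == b"meet at noon"

    # Asymmetric: the public key encrypts, only the private key decrypts.
    private_key = rsa.generate_private_key(public_exponent=65537,
                                           key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ciphertext = private_key.public_key().encrypt(b"meet at noon", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"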

As mentioned earlier, RSA is one cryptosystem which is vulnerable to quantum algorithms, on account of its reliance on integer factorization. RSA is an asymmetric encryption algorithm involving a public and a private key, which gives rise to the so-called RSA problem: performing a private-key operation when only the public key is known, which requires finding the e-th roots of an arbitrary number modulo N. Solving this classically is currently unrealistic for RSA key sizes above 1024 bits.
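A toy version with deliberately tiny primes shows the mechanics and where the RSA problem sits; real keys use primes hundreds of digits long, which is what makes the factoring infeasible:

    # Textbook RSA with tiny primes, for illustration only.
    p, q = 61, 53
    n = p * q                  # 3233, the public modulus
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent, coprime with phi
    d = pow(e, -1, phi)        # 2753; computing d requires knowing p, q

    message = 65
    cipher = pow(message, e, n)          # encrypt: m^e mod n -> 2790
    assert pow(cipher, d, n) == message  # decrypt: c^d mod n -> 65

    # The "RSA problem": recover message from cipher knowing only
    # (n, e), i.e. take an e-th root modulo n without the factors.

(pow(e, -1, phi) computes the modular inverse and needs Python 3.8 or later.)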

Here we see again the thing that makes quantum computing so fascinating: the ability to quickly solve certain non-deterministic polynomial (NP) problems. Some NP problems can be handled quickly by classical computers, but only by approximating a solution; the hardest ones, the NP-complete problems, resist even that in general. An example is the Travelling Salesman Problem (TSP), which asks for the shortest possible route through a list of cities that visits each city once and returns to the origin city.

Even though TSP can be solved exactly with classical computing for smaller numbers of cities (tens of thousands), larger instances require approximation to get within 1% of the optimum, as solving them exactly would require excessively long running times.
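The blow-up is easy to see in code: brute-forcing TSP examines (n-1)! tours, which becomes hopeless long before "tens of thousands" of cities. The distance matrix below is made up for illustration:

    from itertools import permutations

    # Hypothetical symmetric distances between 4 cities.
    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]

    def tour_length(tour):
        legs = zip(tour, tour[1:] + tour[:1])   # close the loop
        return sum(dist[a][b] for a, b in legs)

    cities = range(1, len(dist))                # fix city 0 as start
    best = min(((0,) + p for p in permutations(cities)), key=tour_length)
    print(best, tour_length(best))              # (0, 1, 3, 2) 18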

Symmetric encryption algorithms are commonly used for live traffic, with only the handshake and the initial establishment of a connection done using (slower) asymmetric encryption, which provides a secure channel for exchanging the symmetric keys. Although symmetric encryption tends to be faster than asymmetric encryption, it relies on both parties having access to the shared secret, instead of being able to use a public key.

Symmetric encryption is used with forward secrecy (also known as perfect forward secrecy). The idea behind FS is that instead of relying only on the security provided by the initial encrypted channel, one also encrypts the messages before they are sent. This way, even if the keys for the encryption channel were compromised, all an attacker would end up with are more encrypted messages, each encrypted using a different ephemeral key.

FS tends to use Diffie-Hellman key exchange or similar, resulting in a system comparable to a One-Time Pad (OTP) type of encryption that uses each encryption key only once. With traditional methods, this means that even after obtaining the private key and cracking a single message, one has to spend the same effort on every other message as on that first one in order to read the entire conversation. This is the reason why many secure chat programs like Signal, as well as increasingly many HTTPS-enabled servers, use FS.
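A toy Diffie-Hellman exchange shows the shape of the protocol; the tiny prime and the secret exponents are illustrative only, while real deployments use groups of 2048 bits and up:

    # Public parameters: a prime modulus and a generator.
    p, g = 23, 5

    a = 6                    # Alice's secret exponent
    b = 15                   # Bob's secret exponent

    A = pow(g, a, p)         # Alice sends 8 over the open channel
    B = pow(g, b, p)         # Bob sends 19 over the open channel

    # Each side raises the other's public value to its own secret:
    assert pow(B, a, p) == pow(A, b, p) == 2   # shared session key

    # Forward secrecy comes from picking fresh a and b per session,
    # so cracking one session key reveals nothing about the others.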

It was already back in 1996 that Lov Grover came up with Grover's algorithm, which allows for a roughly quadratic speed-up as a black-box search algorithm. Specifically, it finds with high probability the likely input to a black box (like an encryption algorithm) that produced the known output (the encrypted message).

As noted by Daniel J. Bernstein, the creation of quantum computers that can effectively execute Grover's algorithm would necessitate at least doubling today's symmetric key lengths. This is in addition to quantum computers breaking RSA, DSA, ECDSA and many other cryptographic systems.
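In key-search terms the quadratic speed-up works out as follows: a brute-force search over N = 2^bits keys takes about N classical trials but only about sqrt(N) = 2^(bits/2) Grover queries, halving the effective key strength:

    for bits in (128, 256):
        print(f"{bits}-bit key: ~2^{bits} classical trials, "
              f"~2^{bits // 2} Grover queries")

    # 128-bit key: ~2^128 classical trials, ~2^64 Grover queries
    # 256-bit key: ~2^256 classical trials, ~2^128 Grover queries

Hence the doubling: a 128-bit key drops to roughly 64-bit security against a Grover-capable attacker, while a 256-bit key still offers roughly 128 bits.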

The observant among us may have noticed that despite some spurious marketing claims over the past years, we are rather short on actual quantum computers today. When it comes to quantum computers that have actually made it out of the laboratory and into a commercial setting, we have quantum annealing systems, with D-Wave being a well-known manufacturer of such systems.

Quantum annealing systems can only solve a subset of NP-complete problems with a discrete search space, the travelling salesman problem being one example. It would, for example, not be possible to run Shor's algorithm on a quantum annealing system. Adiabatic quantum computation is closely related to quantum annealing and is therefore equally unsuitable for a general-purpose quantum computing system.

This thus leaves today's quantum computing research mostly in the realm of simulations, and classical encryption mostly secure (for now).

When can we expect to see quantum computers that can decrypt every single one of our communications with nary an effort? This is a tricky question. Much of it depends on when we can get a significant number of quantum bits, or qubits, together into something like a quantum circuit model with sufficient error correction to make the results anywhere near as reliable as those of classical computers.

At this point in time one could say that we are still trying to figure out what the basic elements of a quantum computer will look like. This has led to the following quantum computing models:

- The quantum circuit (gate) model
- Adiabatic quantum computation / quantum annealing
- Measurement-based (one-way) quantum computing
- Topological quantum computing

Of these four models, quantum annealing has been implemented and commercialized. The others have seen many physical realizations in laboratory settings, but aren't up to scale yet. In many ways it isn't dissimilar to the situation classical computers found themselves in throughout the 19th and early 20th century, when successive computers moved from mechanical systems to relays and valves, followed by discrete transistors and ultimately (for now) countless transistors integrated into single chips.

It was the discovery of semiconducting materials and new production processes that allowed classical computers to flourish. For quantum computing, the question appears to be mostly a matter of when we'll manage to do the same there.

Even if, a decade or more from now, the quantum computing revolution suddenly makes our triple-strength, military-grade encryption look as robust as DES does today, we can always comfort ourselves with the knowledge that along with quantum computing we are also increasingly learning more about quantum cryptography.

In many ways quantum cryptography is even more exciting than classical cryptography, as it can exploit quantum mechanical properties. Best known is quantum key distribution (QKD), which uses quantum communication to establish a shared key between two parties. The fascinating property of QKD is that the mere act of listening in on this communication causes measurable changes. Essentially this provides unconditional security in distributing symmetric key material, and symmetric encryption is significantly more quantum-resistant.
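The best-known QKD scheme, BB84, can be sketched in a few lines; the simulation below is an idealized toy with no eavesdropper and no error checking:

    import random

    n = 16
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases   = [random.choice("+x") for _ in range(n)]

    # Bob gets Alice's bit when his basis matches hers, and a coin
    # flip when it doesn't (measuring in the wrong basis randomizes).
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # They publicly compare bases (never bits) and keep the matches.
    key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
           if ab == bb]
    print("shared key bits:", key)

An eavesdropper has to guess a basis for each intercepted qubit and so disturbs a detectable fraction of them; Alice and Bob can reveal and compare a sample of their key bits to check for that disturbance.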

All of this means that even if the coming decades are likely to bring some form of upheaval that may or may not mean the end of classical computing and cryptography with it, not all is lost. As usual, science will progress, and technology with it, and future generations will look back on today's primitive technology with some level of puzzlement.

For now, using TLS 1.3 and any other protocols that support forward secrecy, and symmetric encryption in general, is your best bet.

See the original post here:
Quantum Computing And The End Of Encryption - Hackaday