Archive for the ‘Quantum Computer’ Category

What is cloud-based quantum computing and How does it work? – Medium

credit: cloud.report

Quantum computers really do represent the next generation of computing. Cloud-based quantum computing is harder to pull off than AI, so the ramp-up will be slower and the learning curve steeper; given the rather nebulous science behind it, a practical, working quantum computer remains out of reach for most organizations. Bits are the fundamental computing units, but they can store only two values, 0 and 1. Developers using quantum computing instead encode problems as qubits, which work out multiple combinations of variables at once rather than exploring each possibility discretely. Deploying quantum circuits, and the support systems necessary for their operation, is an expensive and difficult process. Within the scope of their research, firms that already operate these systems enable cloud-based quantum computing via the platforms they build.

Many startups and technology giants, including Microsoft, IBM, and Google, recognize the value of making progress in this field, as it is the next major step in technology and computing. Quantum computers are lightning-fast compared with a typical Windows 10 or macOS machine, and for certain problems they are even quicker than the most powerful supercomputers we have today. Once users are allowed to access quantum-physics-powered computers via the web, that is quantum computing in the cloud.

Rigetti Computing is a startup that has developed an operational quantum processor and is working toward 128 qubits. It recently announced Quantum Cloud Services, built on its existing quantum-computing-in-the-cloud programming toolkit. The service brings traditional and quantum computers together on a single cloud platform to help users build applications that exploit the power of qubit technology.

Bill Gates: "It isn't clear when it will work or become mainstream. There is a chance that within 6-10 years cloud computing will offer super-computation by using quantum. It could help us solve some very important science problems, including materials and catalyst design."

It will make a difference in several areas as implementation and error correction improve. This new technology will reach a useful point with the participation and collaboration of many more people. Cloud-based quantum computing offers a direct interface to quantum circuits and quantum chips, enabling final testing of quantum algorithms, and provides a way for individuals to make improvements in quantum computing. Businesses and other domains can experiment by using QC in the cloud, without having to wait for quantum computing technology to become mature and widespread.

See original here:
What is cloud-based quantum computing and How does it work? - Medium

Will Quantum Computers Break Bitcoin and the Internet? Heres the Outlook From Quantum Physicist Anastasia Marchenkova – The Daily Hodl

A quantum physicist is revealing that while quantum computers pose no risk to Bitcoin mining, they do threaten the algorithms that keep Bitcoin and the internet secure.

In a recent video, Anastasia Marchenkova argues Bitcoin has a built-in design that protects it against entities using quantum algorithms to mine BTC at a rapid rate.

Let's say one day we actually did discover a quantum algorithm that could solve this faster. Bitcoin is designed to adjust the difficulty if we mine blocks too fast. So even if we found this quantum algorithm, the difficulty would just get harder.
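The difficulty adjustment Marchenkova describes can be sketched in a few lines. This is a simplified model: the retarget interval and ten-minute block time are the protocol's real constants, but actual Bitcoin nodes adjust a compact "target" value in consensus code, not a float like this.

```python
# Toy sketch of Bitcoin's difficulty retargeting (simplified; real nodes
# adjust a compact target representation, not a floating-point difficulty).
# Every 2016 blocks, difficulty scales by (expected time / actual time),
# clamped to a factor of 4 in either direction.

RETARGET_INTERVAL_BLOCKS = 2016
TARGET_BLOCK_TIME_SECONDS = 600  # ten minutes per block

def retarget(old_difficulty: float, actual_seconds: float) -> float:
    expected = RETARGET_INTERVAL_BLOCKS * TARGET_BLOCK_TIME_SECONDS
    ratio = expected / actual_seconds
    ratio = max(0.25, min(4.0, ratio))  # clamp, as the protocol does
    return old_difficulty * ratio

# If a quantum miner found blocks 10x too fast, difficulty rises (capped at 4x):
print(retarget(1.0, (2016 * 600) / 10))  # -> 4.0
```

This is exactly the feedback loop she points to: faster mining raises the difficulty, so a quantum speedup in hashing would be absorbed rather than letting anyone drain the remaining supply.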

However, the quantum physicist warns that quantum computing poses a serious risk to cryptographic algorithms which keep cryptocurrencies and the internet at large secure.

There are two common cryptosystems, RSA and elliptic curve encryption, and these are affected by quantum computers. When you're online, information that you send is encrypted, often with these two. Both of these are vulnerable to attacks by quantum computers, which means a large enough quantum computer will be a problem for anyone online.

There actually is a quantum algorithm to break RSA and elliptic curve encryption. Bitcoin does use elliptic curve cryptography (ECC) to generate the public key, which is created from the private key that authorizes transactions.

That means that someone with a large enough and coherent enough quantum computer, with coherence meaning the length of time the quantum information can be stored, can actually get your private key from your public key, and that's a very serious problem. That private key can then be used to authorize transactions that the owner doesn't want to happen. So as quantum computers become better and better, the security of RSA and elliptic curve is no longer effective.
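The public-key/private-key asymmetry she describes can be illustrated with a toy discrete-logarithm example. This is a sketch, not real cryptography: a small multiplicative group modulo a prime stands in for the elliptic curve group, and all the parameters are made up for illustration (Bitcoin actually uses point multiplication on the secp256k1 curve).

```python
# Toy sketch of the key relationship behind ECC (illustrative parameters;
# a small multiplicative group stands in for the elliptic curve group).
P = 101  # small prime modulus (toy parameter; 2 is a primitive root mod 101)
G = 2    # generator (toy parameter)

private_key = 57
public_key = pow(G, private_key, P)  # easy direction: key generation

# The hard direction is the discrete logarithm. Classically we can only
# brute-force it; Shor's algorithm would solve it in polynomial time.
def brute_force_dlog(pub: int) -> int:
    for k in range(P):
        if pow(G, k, P) == pub:
            return k
    raise ValueError("no logarithm found")

print(brute_force_dlog(public_key) == private_key)  # -> True
```

With realistic 256-bit keys the brute-force loop is hopeless for classical machines, which is the whole security assumption; a large, coherent quantum computer running Shor's algorithm would collapse that assumption, exactly as quoted above.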

Crypto sleuths continue to track the advancement of quantum machines. They have the capability to crack complex mathematical problems using quantum bits, or qubits, which can maintain a superposition by being in two states at the same time.

While the future of cryptocurrencies may be threatened, Marchenkova says digital assets can adopt developments that can effectively resist quantum-based attacks.

So we'll need to pick an algorithm that can actually stand up to quantum attacks. We call this post-quantum cryptography: classical algorithms, not based on quantum principles, that can stand up to quantum computing attacks. One of the current leading candidates is lattice-based cryptography.

Another approach is using symmetric cryptography like AES (Advanced Encryption Standard), which is weakened by quantum computers but not broken outright the way RSA and elliptic curve are.

There are also other coins already using hash-based cryptography. And so far, like I mentioned, hash-based cryptosystems actually resist quantum computing attacks. We don't know if that's going to hold true forever, but so far that seems to be the case.


Featured Image: Shutterstock/GrandeDuc

Read the original:
Will Quantum Computers Break Bitcoin and the Internet? Heres the Outlook From Quantum Physicist Anastasia Marchenkova - The Daily Hodl

Quantum computer | computer science | Britannica

Quantum computer, device that employs properties described by quantum mechanics to enhance computations.


As early as 1959 the American physicist and Nobel laureate Richard Feynman noted that, as electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which, he suggested, might be exploited in the design of more powerful computers. In particular, quantum researchers hope to harness a phenomenon known as superposition. In the quantum mechanical world, objects do not necessarily have clearly defined states, as demonstrated by the famous experiment in which a single photon of light passing through a screen with two small slits will produce a wavelike interference pattern, or superposition of all available paths. (See wave-particle duality.) However, when one slit is closed, or a detector is used to determine which slit the photon passed through, the interference pattern disappears. In consequence, a quantum system exists in all possible states before a measurement collapses the system into one state. Harnessing this phenomenon in a computer promises to expand computational power greatly. A traditional digital computer employs binary digits, or bits, that can be in one of two states, represented as 0 and 1; thus, for example, a 4-bit computer register can hold any one of 16 (2^4) possible numbers. In contrast, a quantum bit (qubit) exists in a wavelike superposition of values from 0 to 1; thus, for example, a 4-qubit computer register can hold 16 different numbers simultaneously. In theory, a quantum computer can therefore operate on a great many values in parallel, so that a 30-qubit quantum computer would be comparable to a digital computer capable of performing 10 trillion floating-point operations per second (TFLOPS), comparable to the speed of the fastest supercomputers.
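The bit-versus-qubit register comparison can be made concrete with a tiny state-vector sketch. This is illustrative only, not a real quantum simulator: it just shows that an n-qubit register is described by 2^n amplitudes, so a 4-qubit register in equal superposition carries weight on all 16 basis values at once.

```python
# Minimal state-vector sketch (illustrative, not a real quantum simulator).
# An n-qubit register is described by 2**n complex amplitudes; an equal
# superposition spreads amplitude over all 2**n basis values at once.
import math

def uniform_superposition(n_qubits: int) -> list[float]:
    dim = 2 ** n_qubits         # 4 qubits -> 16 basis states
    amp = 1.0 / math.sqrt(dim)  # equal amplitude for every value
    return [amp] * dim

state = uniform_superposition(4)
print(len(state))                           # 16 basis states held at once
print(sum(a * a for a in state))            # squared amplitudes sum to 1
```

A classical 4-bit register, by contrast, would be a single index into those 16 values; the measurement step, which collapses the superposition to one outcome, is what makes exploiting this parallelism subtle.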

During the 1980s and 90s the theory of quantum computers advanced considerably beyond Feynman's early speculations. In 1985 David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer, and in 1994 Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits (although many more qubits would be necessary for factoring large numbers in a reasonable time). When a practical quantum computer is built, it will break current encryption schemes based on multiplying two large primes; in compensation, quantum mechanical effects offer a new method of secure communication known as quantum encryption. However, actually building a useful quantum computer has proved difficult. Although the potential of quantum computers is enormous, the requirements are equally stringent. A quantum computer must maintain coherence between its qubits (known as quantum entanglement) long enough to perform an algorithm; because of nearly inevitable interactions with the environment (decoherence), practical methods of detecting and correcting errors need to be devised; and, finally, since measuring a quantum system disturbs its state, reliable methods of extracting information must be developed.

Plans for building quantum computers have been proposed; although several demonstrate the fundamental principles, none is beyond the experimental stage. Three of the most promising approaches are presented below: nuclear magnetic resonance (NMR), ion traps, and quantum dots.

In 1998 Isaac Chuang of the Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley created the first quantum computer (2-qubit) that could be loaded with data and output a solution. Although their system was coherent for only a few nanoseconds and trivial from the perspective of solving meaningful problems, it demonstrated the principles of quantum computation. Rather than trying to isolate a few subatomic particles, they dissolved a large number of chloroform molecules (CHCl3) in water at room temperature and applied a magnetic field to orient the spins of the carbon and hydrogen nuclei in the chloroform. (Because ordinary carbon has no magnetic spin, their solution used an isotope, carbon-13.) A spin parallel to the external magnetic field could then be interpreted as a 1 and an antiparallel spin as 0, and the hydrogen nuclei and carbon-13 nuclei could be treated collectively as a 2-qubit system. In addition to the external magnetic field, radio frequency pulses were applied to cause spin states to flip, thereby creating superimposed parallel and antiparallel states. Further pulses were applied to execute a simple algorithm and to examine the system's final state. This type of quantum computer can be extended by using molecules with more individually addressable nuclei. In fact, in March 2000 Emanuel Knill, Raymond Laflamme, and Rudy Martinez of Los Alamos and Ching-Hua Tseng of MIT announced that they had created a 7-qubit quantum computer using trans-crotonic acid. However, many researchers are skeptical about extending magnetic techniques much beyond 10 to 15 qubits because of diminishing coherence among the nuclei.

Just one week before the announcement of a 7-qubit quantum computer, physicist David Wineland and colleagues at the U.S. National Institute of Standards and Technology (NIST) announced that they had created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. After confining the ions in a linear arrangement, a laser cooled the particles almost to absolute zero and synchronized their spin states. Finally, a laser was used to entangle the particles, creating a superposition of both spin-up and spin-down states simultaneously for all four ions. Again, this approach demonstrated basic principles of quantum computing, but scaling up the technique to practical dimensions remains problematic.

Quantum computers based on semiconductor technology are yet another possibility. In a common approach a discrete number of free electrons (qubits) reside within extremely small regions, known as quantum dots, and in one of two spin states, interpreted as 0 and 1. Although prone to decoherence, such quantum computers build on well-established, solid-state techniques and offer the prospect of readily applying integrated circuit scaling technology. In addition, large ensembles of identical quantum dots could potentially be manufactured on a single silicon chip. The chip operates in an external magnetic field that controls electron spin states, while neighbouring electrons are weakly coupled (entangled) through quantum mechanical effects. An array of superimposed wire electrodes allows individual quantum dots to be addressed, algorithms executed, and results deduced. Such a system necessarily must be operated at temperatures near absolute zero to minimize environmental decoherence, but it has the potential to incorporate very large numbers of qubits.

Read the original:
Quantum computer | computer science | Britannica

How and when quantum computers will improve machine learning? – Medium

The different strategies toward quantum machine learning

They say you should start an article with a cool fancy image. [Image: Google's 72-qubit chip Sycamore. Credit: Google]

There is strong hope (and hype) that quantum computers will help machine learning in many ways. Research in Quantum Machine Learning (QML) is a very active domain, and many small and noisy quantum computers are now available. Different approaches exist for both the long term and the short term, and we may wonder what their respective hopes and limitations are, both in theory and in practice.

It all started in 2009 with the publication of the HHL algorithm [1], proving an exponential acceleration for matrix multiplication and inversion, which triggered exciting applications in all linear-algebra-based science, hence machine learning. Since then, many algorithms have been proposed to speed up tasks such as classification [2], dimensionality reduction [3], clustering [4], recommendation systems [5], neural networks [6], kernel methods [7], SVM [8], reinforcement learning [9], and more generally optimization [10].

These algorithms are what I call Long Term or Algorithmic QML. They are usually carefully detailed, with guarantees proven as mathematical theorems. We can (theoretically) know the amount of speedup compared to the classical algorithms they reproduce, often polynomial or even exponential with respect to the number of input data points. They come with precise bounds on the result's probability, randomness, and accuracy, as usual in computer science research.

While they constitute theoretical proof that a universal and fault-tolerant quantum computer would provide impressive benefits in ML, early warnings [11] showed that some underlying assumptions were very constraining.

These algorithms often require loading the data with a Quantum Random Access Memory, or QRAM [12], a bottleneck part without which exponential speedups are much more complex to obtain. Besides, they sometimes need long quantum circuits and many logical qubits (which, due to error correction, are themselves composed of many more physical qubits), that might not be arriving soon enough.

When exactly? When we reach the universal fault-tolerant quantum computer, predicted by Google for 2029, or by IonQ in only five years. More conservative opinions claim this will not happen for 20+ years, and some even say we will never reach that point. The future will tell!

More recently, a mini-earthquake amplified by scientific media has cast doubt on the efficiency of Algorithmic QML: the so-called dequantization papers [13], which introduced classical algorithms inspired by the quantum ones that obtain similar exponential speedups, in the field of QML at least. This impressive result is tempered by the fact that the equivalent speedup only concerns the number of data points, and comes at the cost of a terrible polynomial slowdown with respect to other parameters, for now. This makes these quantum-inspired classical algorithms currently unusable in practice [14].

In the meantime, something very exciting happened: actual quantum computers were built and became accessible. You can play with noisy devices made of 5 to 20 qubits, and soon more. Quite recently, Google ran a quantum circuit on 53 qubits [15], the first that could not be efficiently simulated by a classical computer.

Researchers have then been looking at new models that these noisy intermediate scale quantum computers (NISQ) could actually perform [16]. They are all based on the same idea of variational quantum circuits (VQC), inspired by classical machine learning.

The main difference with Algorithmic QML is that the circuit does not implement a known classical ML algorithm. One simply hopes that the chosen circuit will converge to successfully classify data or predict values. For now, there are several types of circuits in the literature [17], and we start to see interesting patterns of success. The problem itself is often encoded in the loss function we try to decrease: we sum the errors made compared to the true values or labels, or compared to the quantum states we aim for, or to the energy levels, and so on, depending on the task. Active research tries to understand why some circuits work better than others on certain tasks, and why quantumness would help.
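That loss-minimization loop can be sketched with a hypothetical single-qubit "circuit". Everything here is illustrative: the circuit is reduced to one RY(theta) rotation simulated analytically (after RY(theta), the probability of measuring |1> is sin^2(theta/2)), the gradient is a crude finite difference, and the "label" is a made-up target.

```python
# Toy variational-training sketch (illustrative; a real VQC would run on
# hardware or a full simulator, with many parameterized gates).
import math

def predict(theta: float) -> float:
    return math.sin(theta / 2) ** 2       # P(measure |1>) after RY(theta)

def loss(theta: float, label: float) -> float:
    return (predict(theta) - label) ** 2  # squared error vs. true label

label, theta, lr = 1.0, 0.1, 1.0
for _ in range(200):                      # plain gradient descent on theta
    eps = 1e-5                            # crude finite-difference gradient
    grad = (loss(theta + eps, label) - loss(theta - eps, label)) / (2 * eps)
    theta -= lr * grad

print(round(predict(theta), 3))           # close to the target label 1.0
```

The structure, a parameterized circuit, a measured output, a loss against targets, and an outer classical optimizer, is the same whatever the circuit family; what changes in the literature is the circuit ansatz and how the gradient is obtained.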

Another core difference is that many providers [18, 19, 20] allow you to program these VQC so you can play and test them on actual quantum computers!

In recent years, researchers have tried to find use cases where Variational QML would succeed at classical problems, or even outperform the classical solutions [21, 22]. Some hope that the variational nature of the training confers some resilience to hardware noise. If this happens to be the case, it would be beneficial not to have to wait for error-correction schemes that require many qubits; one would only need error-mitigation techniques to post-process the measurements.

On the theoretical side, researchers hope that quantum superposition and entangling quantum gates would project data in a much bigger space (the Hilbert Space of n qubits has dimension 2^n) where some classically inaccessible correlations or separations can be done. Said differently, some believe that the quantum model will be more expressive.

It is important to note that research on Variational QML is less focused on proving computational speedups. The main interest is to reach a more expressive or complex state of information processing. The two approaches are related, but they represent two different strategies. Unfortunately, less is proven than for Algorithmic QML, and we are far from understanding the theoretical reasons that would prove the advantage of these quantum computations.

Of course, due to the limitations of current quantum devices, experiments are often made on a small number of qubits (e.g., 4 qubits) or on simulators, often ideal, or limited to around 30 qubits. It is hard to predict what will happen when the number of qubits grows.

Despite the excitement, VQCs also suffer from theoretical obstacles. It is proven that when the number of qubits or the number of gates becomes too big, the optimization landscape becomes flat, hindering the ability to optimize the circuit. Many efforts are made to circumvent this issue, called Barren Plateaus [23], by using specific circuits [24] or smart initialization of the parameters [25].

But Barren Plateaus are not the only caveat. In many optimization methods, one must compute the gradient of the cost function with respect to each parameter. Said differently, we want to know how much the model improves when each parameter is modified. In classical neural networks, computing the gradients is usually done using backpropagation, because we analytically understand the operations. With VQCs, the operations become too complex, and we cannot access intermediate quantum states (without measuring, and therefore destroying, them).

The current state-of-the-art solution is called the parameter-shift rule [27, 28] and requires applying the circuit and measuring its result twice for each parameter. By comparison, in classical deep learning the network is applied just once forward and once backward to obtain all of its thousands or millions of gradients. In principle, we could parallelize the parameter-shift rule over many simulators or quantum devices, but this could become limiting for a large number of parameters.
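A minimal sketch of the parameter-shift rule, under the assumption that the circuit's expectation value as a function of one rotation angle is f(theta) = sin(theta), the typical form for a gate generated by a Pauli operator. The point is that the gradient comes from two extra circuit evaluations per parameter, not from backpropagation:

```python
# Parameter-shift rule sketch (assumes f(theta) = sin(theta), the standard
# form for Pauli-generated rotation gates).
import math

def expectation(theta: float) -> float:
    return math.sin(theta)               # stand-in for running the circuit

def parameter_shift_grad(theta: float) -> float:
    shift = math.pi / 2
    # Two evaluations of the *same* circuit at shifted angles:
    return (expectation(theta + shift) - expectation(theta - shift)) / 2

theta = 0.7
exact = math.cos(theta)                  # analytic derivative of sin
print(abs(parameter_shift_grad(theta) - exact) < 1e-12)  # -> True: exact here
```

Unlike a finite difference, the rule is exact for this gate family, but the cost of two evaluations per parameter per step is precisely the scaling concern raised above.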

Finally, researchers tend to focus more and more on the importance of loading data into a quantum state [29], also called a feature map [30]. Without the ideal amplitude encoding obtained with a QRAM, there are doubts that we will be able to load and process high-dimensional classical data without losing an exponential or high polynomial factor. Some hope remains for data-independent tasks such as generative models [21, 31] or solving partial differential equations.

Note that the expression Quantum Neural Networks has been used to show the similarities with classical Neural Network (NN) training. However, they are not equivalent, since VQCs don't have the same hidden-layer architecture, nor natural non-linearities unless a measurement is performed. And there's no simple rule to convert any NN to a VQC or vice versa. Some now prefer to compare VQCs to kernel methods [30].

We now have a better understanding of the advantages and weaknesses of the two main strategies towards quantum machine learning. Current research is now focused on two aspects:

Finally, and most importantly, improve the quantum devices! We all hope for constant incremental improvements, or a paradigm shift, in the quality of the qubits, their number, and the error-correction process, to reach powerful enough machines. Please, physicists, can you hurry?

PS: let's not forget to use all this amazing science to do good things that will benefit everyone.

Jonas Landman is a Ph.D. student at the University of Paris under the supervision of Prof. Iordanis Kerenidis. He is Technical Advisor at QC Ware and member of QuantX. He has previously studied at Ecole Polytechnique and UC Berkeley.

Read more:
How and when quantum computers will improve machine learning? - Medium

Quantum computing: Honeywell just quadrupled the power of its computer – ZDNet

The System Model H1, a ten-qubit quantum computer, has reached a quantum volume of 512.

Honeywell's quantum scientists have quadrupled the capabilities of the company's quantum computer, with the device achieving record levels of performance less than a year after the first generation of the system was released.

The System Model H1, a ten-qubit quantum computer, effectively reached a quantum volume of 512, four times as much as was attained in the previous tweak of the system, which saw the H1 reach a quantum volume of 128.

Released commercially last June (at the same time as the System Model H0), the H1 makes use of trapped ions, unlike IBM's and Google's devices, which are built with superconducting qubits. Honeywell's new record is eight times as much as was achieved with the System Model H0, which launched with a quantum volume of 64.

Quantum volume is a concept that IBM developed in 2017 as a way of measuring various aspects of a quantum computer's performance; in simple terms, the higher the quantum volume, the higher the potential for resolving real-world problems across industry and research. Designed to be independent of the architecture of any given quantum computer, quantum volume can measure any system that runs quantum circuits.

SEE: Hiring Kit: Computer Hardware Engineer (TechRepublic Premium)

For example, one measurement that is indicative of a quantum computer's capabilities is qubit fidelity, which is critical to understanding how well a device can implement quantum code. According to Honeywell, the average single-qubit gate fidelity in the latest version of the H1 was 99.991%.
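A back-of-envelope sketch of why that fidelity figure matters, under the simplifying assumption that gate errors compound independently, so a circuit's success probability is roughly fidelity raised to the number of gates:

```python
# Rough error-compounding sketch (assumes independent per-gate errors,
# a simplification of real device noise).
fidelity = 0.99991  # Honeywell's reported average single-qubit gate fidelity

def circuit_success_probability(n_gates: int) -> float:
    return fidelity ** n_gates

print(round(circuit_success_probability(100), 3))     # ~0.991
print(round(circuit_success_probability(10_000), 3))  # ~0.407
```

Even at 99.991%, errors accumulate noticeably over long circuits, which is why fidelity, not just qubit count, dominates near-term performance.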

The final number that determines quantum volume is an aggregate of many other measurements and tests of a single quantum system's operations: they include the number of physical qubits in the quantum computer, but also the device's error rate, and connectivity, which reflects the extent to which qubits can be fully connected to each other within the device.
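How those ingredients roll up into one number can be sketched as follows. The benchmark runs random "square" circuits (n qubits, depth n) and checks that heavy outputs appear more than two-thirds of the time; quantum volume is then 2 to the power of the largest passing size. The pass/fail data below is made up for illustration:

```python
# Illustrative quantum-volume reporting sketch (the list of passing square
# sizes is hypothetical; the real benchmark measures heavy-output frequency
# on random circuits of width n and depth n).

def quantum_volume(passed_square_sizes: list[int]) -> int:
    # QV = 2**n for the largest n where an n-qubit, depth-n circuit passed.
    return 2 ** max(passed_square_sizes)

# A 10-qubit machine passing the benchmark up to 9-by-9 square circuits
# would report QV = 512, matching the H1 figure quoted above.
print(quantum_volume([2, 3, 4, 5, 6, 7, 8, 9]))  # -> 512
```

This is why error rates and connectivity matter as much as qubit count: a machine can only pass at depth n if its gates stay accurate across an n-layer circuit.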

This is why it is possible for a quantum system to reach a high quantum volume, even with few qubits. Despite having only ten qubits, for instance, Honeywell's System Model H1 performs well when it comes to error rates and connectivity, which has earned the device a top spot for its overall capabilities. In comparison, last year IBM's 27-qubit client-deployed system achieved a quantum volume of 64.

The new milestone, therefore, has prompted Honeywell's president of quantum solutions, Tony Uttley, to describe the System Model H1 as "the highest performing quantum computing system in the world."

Honeywell has made no secret of its strategy, which consists of focusing on qubit fidelity and connectedness, before attempting to scale up the number of qubits. "When you hear about fidelity and error, that's about the quality of the quantum operation," Uttley told ZDNet. "It's about knowing how often you get the right answer when you run these quantum algorithms."

"We have taken one approach that is very unique when it comes to how to get the most out of these near-term systems," he continued. "Nobody is talking about millions of qubits right now; we're talking about tens of qubits. To get the most out of these tens of qubits, you have to have super-high fidelity, fully-connected and highly-controlled systems. That's our approach."

SEE: The EU wants to build its first quantum computer. That plan might not be ambitious enough

Making these highly reliable systems available to Honeywell's customers now enables businesses to test and trial with small-scale applications while waiting for the company to design and build new generations of more capable quantum computers, according to Uttley.

Honeywell recently introduced the first subscription-based plan for usage of the H1, which grants paying customers monthly access to the machine.

With only ten qubits, there is little the device can achieve beyond proofs of concept, designed to be implemented at full scale once a larger computer is available; but high-profile customers are nevertheless flocking to Honeywell's services.

J.P. Morgan Chase, for example, is investigating how the company's quantum computer might improve operations in banking; and BMW is piloting the use of Honeywell's hardware to optimize supply chains for car manufacturing.

Read this article:
Quantum computing: Honeywell just quadrupled the power of its computer - ZDNet