Archive for the ‘Quantum Computer’ Category

Quantum computer | computer science | Britannica

Quantum computer, device that employs properties described by quantum mechanics to enhance computations.


As early as 1959 the American physicist and Nobel laureate Richard Feynman noted that, as electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which, he suggested, might be exploited in the design of more powerful computers. In particular, quantum researchers hope to harness a phenomenon known as superposition. In the quantum mechanical world, objects do not necessarily have clearly defined states, as demonstrated by the famous experiment in which a single photon of light passing through a screen with two small slits will produce a wavelike interference pattern, or superposition of all available paths. (See wave-particle duality.) However, when one slit is closed, or a detector is used to determine which slit the photon passed through, the interference pattern disappears. In consequence, a quantum system exists in all possible states before a measurement collapses the system into one state. Harnessing this phenomenon in a computer promises to expand computational power greatly. A traditional digital computer employs binary digits, or bits, that can be in one of two states, represented as 0 and 1; thus, for example, a 4-bit computer register can hold any one of 16 (2^4) possible numbers. In contrast, a quantum bit (qubit) exists in a wavelike superposition of values from 0 to 1; thus, for example, a 4-qubit computer register can hold 16 different numbers simultaneously. In theory, a quantum computer can therefore operate on a great many values in parallel, so that a 30-qubit quantum computer would be comparable to a digital computer capable of performing 10 trillion floating-point operations per second (TFLOPS), comparable to the speed of the fastest supercomputers.
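
To make the bit-versus-qubit contrast concrete, here is a minimal classical simulation in Python/NumPy (an illustration added for this archive, not part of the Britannica text): a 4-qubit register is described by 16 complex amplitudes, and applying a Hadamard gate to every qubit places all 16 values in an equal superposition.

```python
# A minimal sketch: classically simulating a 4-qubit register.
# A 4-bit register holds ONE of 16 values; a 4-qubit state is a
# vector of 16 complex amplitudes, one per value.
import numpy as np

n = 4
ket0 = np.array([1.0, 0.0])                  # single qubit in state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Start with all qubits in |0>.
state = ket0
for _ in range(n - 1):
    state = np.kron(state, ket0)             # 16-dimensional state vector

# Build H applied to every qubit, then apply it.
H_all = hadamard
for _ in range(n - 1):
    H_all = np.kron(H_all, hadamard)
state = H_all @ state

# Every one of the 16 basis states now carries equal probability 1/16.
probs = np.abs(state) ** 2
print(probs)                                  # [0.0625 0.0625 ... 0.0625]
```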

During the 1980s and 90s the theory of quantum computers advanced considerably beyond Feynman's early speculations. In 1985 David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer, and in 1994 Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits (although many more qubits would be necessary for factoring large numbers in a reasonable time). When a practical quantum computer is built, it will break current encryption schemes based on multiplying two large primes; in compensation, quantum mechanical effects offer a new method of secure communication known as quantum encryption. However, actually building a useful quantum computer has proved difficult. Although the potential of quantum computers is enormous, the requirements are equally stringent. A quantum computer must maintain coherence between its qubits (known as quantum entanglement) long enough to perform an algorithm; because of nearly inevitable interactions with the environment (decoherence), practical methods of detecting and correcting errors need to be devised; and, finally, since measuring a quantum system disturbs its state, reliable methods of extracting information must be developed.
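
A hedged sketch of the idea behind Shor's algorithm follows (Python; the period-finding step, which is where the quantum speedup lives, is brute-forced classically here purely for illustration): factoring N is reduced to finding the period r of f(x) = a^x mod N, after which gcd(a^(r/2) ± 1, N) usually yields a factor.

```python
# Classical sketch of the reduction Shor's algorithm exploits.
from math import gcd

def find_period(a: int, N: int) -> int:
    """Brute-force the smallest r > 0 with a^r = 1 (mod N).
    This is the step a quantum computer performs exponentially faster."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

N, a = 15, 7                     # toy example: factor 15 with base 7
r = find_period(a, N)            # r = 4
if r % 2 == 0:
    candidate = pow(a, r // 2, N)             # 7^2 mod 15 = 4
    p, q = gcd(candidate - 1, N), gcd(candidate + 1, N)
    print(r, p, q)               # 4 3 5  -> 15 = 3 * 5
```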

Plans for building quantum computers have been proposed; although several demonstrate the fundamental principles, none is beyond the experimental stage. Three of the most promising approaches are presented below: nuclear magnetic resonance (NMR), ion traps, and quantum dots.

In 1998 Isaac Chuang of the Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley created the first quantum computer (2-qubit) that could be loaded with data and output a solution. Although their system was coherent for only a few nanoseconds and trivial from the perspective of solving meaningful problems, it demonstrated the principles of quantum computation. Rather than trying to isolate a few subatomic particles, they dissolved a large number of chloroform molecules (CHCl3) in water at room temperature and applied a magnetic field to orient the spins of the carbon and hydrogen nuclei in the chloroform. (Because ordinary carbon has no magnetic spin, their solution used an isotope, carbon-13.) A spin parallel to the external magnetic field could then be interpreted as a 1 and an antiparallel spin as 0, and the hydrogen nuclei and carbon-13 nuclei could be treated collectively as a 2-qubit system. In addition to the external magnetic field, radio frequency pulses were applied to cause spin states to flip, thereby creating superimposed parallel and antiparallel states. Further pulses were applied to execute a simple algorithm and to examine the system's final state. This type of quantum computer can be extended by using molecules with more individually addressable nuclei. In fact, in March 2000 Emanuel Knill, Raymond Laflamme, and Rudy Martinez of Los Alamos and Ching-Hua Tseng of MIT announced that they had created a 7-qubit quantum computer using trans-crotonic acid. However, many researchers are skeptical about extending magnetic techniques much beyond 10 to 15 qubits because of diminishing coherence among the nuclei.
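
As a rough illustration of how such pulses act as logic operations, here is a sketch (Python/NumPy, under the simplifying assumption that a pulse is an ideal rotation of a single spin): a full "pi pulse" flips a parallel spin to antiparallel, while a half-duration "pi/2 pulse" leaves the spin in an equal superposition of the two.

```python
# Hedged sketch: an RF pulse modeled as an ideal rotation of a spin-1/2 state.
import numpy as np

def rx(theta: float) -> np.ndarray:
    """Rotation of a spin-1/2 state by angle theta about the x axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

spin_up = np.array([1.0, 0.0])                 # parallel to the field: "1" state

flipped = rx(np.pi) @ spin_up                  # pi pulse: now antiparallel
superposed = rx(np.pi / 2) @ spin_up           # pi/2 pulse: 50/50 superposition

print(np.abs(flipped) ** 2)                    # [0. 1.]
print(np.abs(superposed) ** 2)                 # [0.5 0.5]
```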

Just one week before the announcement of a 7-qubit quantum computer, physicist David Wineland and colleagues at the U.S. National Institute of Standards and Technology (NIST) announced that they had created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. After confining the ions in a linear arrangement, a laser cooled the particles almost to absolute zero and synchronized their spin states. Finally, a laser was used to entangle the particles, creating a superposition of both spin-up and spin-down states simultaneously for all four ions. Again, this approach demonstrated basic principles of quantum computing, but scaling up the technique to practical dimensions remains problematic.

Quantum computers based on semiconductor technology are yet another possibility. In a common approach a discrete number of free electrons (qubits) reside within extremely small regions, known as quantum dots, and in one of two spin states, interpreted as 0 and 1. Although prone to decoherence, such quantum computers build on well-established, solid-state techniques and offer the prospect of readily applying integrated circuit scaling technology. In addition, large ensembles of identical quantum dots could potentially be manufactured on a single silicon chip. The chip operates in an external magnetic field that controls electron spin states, while neighbouring electrons are weakly coupled (entangled) through quantum mechanical effects. An array of superimposed wire electrodes allows individual quantum dots to be addressed, algorithms executed, and results deduced. Such a system necessarily must be operated at temperatures near absolute zero to minimize environmental decoherence, but it has the potential to incorporate very large numbers of qubits.

Read the original:
Quantum computer | computer science | Britannica

How and when quantum computers will improve machine learning? – Medium

The different strategies toward quantum machine learning. They say you should start an article with a cool, fancy image: Google's 72-qubit chip, Sycamore. (Image: Google)

There is strong hope (and hype) that quantum computers will help machine learning in many ways. Research in Quantum Machine Learning (QML) is a very active domain, and many small and noisy quantum computers are now available. Different approaches exist, for both the long term and the short term, and we may wonder what their respective hopes and limitations are, both in theory and in practice.

It all started in 2009 with the publication of the HHL algorithm [1], proving an exponential acceleration for matrix multiplication and inversion, which triggered exciting applications in all linear-algebra-based science, hence machine learning. Since then, many algorithms have been proposed to speed up tasks such as classification [2], dimensionality reduction [3], clustering [4], recommendation systems [5], neural networks [6], kernel methods [7], SVMs [8], reinforcement learning [9], and, more generally, optimization [10].

These algorithms are what I call Long Term or Algorithmic QML. They are usually carefully detailed, with guarantees proven as mathematical theorems. We can (theoretically) quantify the speedup over the classical algorithms they reproduce, which is often polynomial or even exponential with respect to the number of input data points in most cases. They come with precise bounds on the result's probability, randomness, and accuracy, as is usual in computer science research.

While they constitute theoretical proof that a universal and fault-tolerant quantum computer would provide impressive benefits in ML, early warnings [11] showed that some underlying assumptions were very constraining.

These algorithms often require loading the data with a Quantum Random Access Memory, or QRAM [12], a bottleneck without which exponential speedups are much harder to obtain. Besides, they sometimes need long quantum circuits and many logical qubits (which, due to error correction, are themselves composed of many more physical qubits), and those might not arrive soon enough.

When exactly? When we reach the universal fault-tolerant quantum computer, predicted by Google for 2029, or by IonQ in only five years. More conservative opinions claim this will not happen for another 20+ years, and some even say we will never reach that point. The future will tell!

More recently, a mini earthquake amplified by scientific media has cast doubt on the efficiency of Algorithmic QML: the so-called dequantization papers [13] introduced classical algorithms, inspired by the quantum ones, that obtain similar exponential speedups, at least in the field of QML. This impressive result is tempered by the fact that the equivalent speedup only concerns the number of data points and, for now, comes at the cost of a terrible polynomial slowdown with respect to other parameters. This makes these quantum-inspired classical algorithms currently unusable in practice [14].

In the meantime, something very exciting happened: actual quantum computers were built and became accessible. You can play with noisy devices made of 5 to 20 qubits, and soon more. Quite recently, Google ran a quantum circuit with 53 qubits [15], the first that could not be efficiently simulated by a classical computer.

Researchers have since been looking at new models that these noisy intermediate-scale quantum (NISQ) computers could actually run [16]. They are all based on the same idea: variational quantum circuits (VQC), inspired by classical machine learning.

The main difference with algorithmic QML is that the circuit does not implement a known classical ML algorithm. One simply hopes that the chosen circuit will converge to classify data or predict values successfully. For now, there are several types of circuits in the literature [17], and we are starting to see interesting patterns in which ones succeed. The problem itself is often encoded in the loss function we try to decrease: depending on the task, we sum the error made relative to the true values or labels, to the quantum states we aim for, to the energy levels, and so on. Active research tries to understand why some circuits work better than others on certain tasks, and why quantumness would help.
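
As a concrete toy example (a sketch of my own; the one-parameter "circuit", the data, and the labels are all invented for illustration), here is a simulated single-qubit VQC whose measurement probability serves as a prediction, with the summed squared error as the loss described above:

```python
# Toy variational circuit, simulated classically.
import numpy as np

def predict(theta: float, x: np.ndarray) -> np.ndarray:
    """P(measuring |1>) after Ry(theta * x) is applied to |0>."""
    return np.sin(theta * x / 2) ** 2

xs = np.array([0.1, 0.9, 1.1, 2.5])       # toy inputs
labels = np.array([0.0, 0.0, 1.0, 1.0])   # toy binary labels

def loss(theta: float) -> float:
    """Sum of squared errors against the true labels."""
    return float(np.sum((predict(theta, xs) - labels) ** 2))

# Crude "training": scan for the theta with the smallest loss.
thetas = np.linspace(0, 2 * np.pi, 500)
best = min(thetas, key=loss)
print(best, loss(best))
```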

Another core difference is that many providers [18, 19, 20] let you program these VQCs, so you can play with and test them on actual quantum computers!

In recent years, researchers have tried to find use cases where variational QML succeeds at classical problems, or even outperforms the classical solutions [21, 22]. Some hope that the variational nature of the training confers some resilience to hardware noise. If that turns out to be the case, we would not need to wait for error-correction schemes that require many qubits; error-mitigation techniques to post-process the measurements would suffice.

On the theoretical side, researchers hope that quantum superposition and entangling quantum gates would project data into a much bigger space (the Hilbert space of n qubits has dimension 2^n) where some classically inaccessible correlations or separations become possible. Said differently, some believe that the quantum model will be more expressive.

It is important to note that research on variational QML is less focused on proving computational speedups. The main interest is to reach a more expressive or complex state of information processing. The two approaches are related, but they represent two different strategies. Unfortunately, less is proven than for Algorithmic QML, and we are far from understanding the theoretical reasons that would establish the advantage of these quantum computations.

Of course, due to the limitations of current quantum devices, experiments are often performed on a small number of qubits (four, in some of the studies cited above) or on simulators, which are usually ideal (noiseless) and limited to roughly 30 qubits. It is hard to predict what will happen as the number of qubits grows.

Despite the excitement, VQCs also face theoretical obstacles. It has been proven that when the number of qubits or the number of gates becomes too big, the optimization landscape becomes flat, hindering the ability to optimize the circuit. Many efforts are being made to circumvent this issue, called Barren Plateaus [23], by using specific circuits [24] or smart initialization of the parameters [25].
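
The flatness can be illustrated with a classical simulation (a sketch under the assumption that deep random circuits behave like random states, which is the standard intuition behind barren plateaus): the expectation of a single-qubit measurement on random n-qubit states concentrates around zero exponentially fast in n, so there is less and less gradient signal to follow.

```python
# Illustration of the concentration phenomenon behind barren plateaus.
import numpy as np

rng = np.random.default_rng(0)

def z_expectation_variance(n: int, samples: int = 2000) -> float:
    dim = 2 ** n
    half = dim // 2
    values = []
    for _ in range(samples):
        # Haar-like random state: normalized complex Gaussian vector.
        psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
        psi /= np.linalg.norm(psi)
        # <Z> on the first qubit (most significant bit):
        # P(bit is 0) - P(bit is 1).
        p = np.abs(psi) ** 2
        values.append(p[:half].sum() - p[half:].sum())
    return float(np.var(values))

for n in range(2, 9):
    print(n, z_expectation_variance(n))   # variance shrinks roughly as 2^-n
```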

But Barren Plateaus are not the only caveat. In many optimization methods, one must compute the gradient of the cost function with respect to each parameter. Said differently, we want to know how much the model improves when each parameter is modified. In classical neural networks, computing the gradients is usually done using backpropagation, because we understand the operations analytically. With VQCs, the operations become too complex, and we cannot access intermediate quantum states (without measuring, and therefore destroying, them).

The current state-of-the-art solution is called the parameter-shift rule [27, 28] and requires applying the circuit and measuring its result twice for each parameter. By comparison, a classical deep network is applied just once forward and once backward to obtain all of its thousands or millions of gradients. Fortunately, we can parallelize the parameter-shift rule over many simulators or quantum devices, but this may become limiting for a large number of parameters.
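
Here is a minimal simulated example of the parameter-shift rule (Python/NumPy; the one-qubit circuit is a toy choice of mine): for this circuit the expectation is E(theta) = cos(theta), and two shifted evaluations recover its exact gradient.

```python
# Parameter-shift rule on a simulated one-qubit circuit:
# dE/dtheta = [E(theta + pi/2) - E(theta - pi/2)] / 2.
import numpy as np

def expectation(theta: float) -> float:
    """Run the 'circuit': Ry(theta) on |0>, then measure Z."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    psi = np.array([c, s])                    # Ry(theta) |0>
    return float(psi[0] ** 2 - psi[1] ** 2)   # <Z> = cos(theta)

def parameter_shift_grad(theta: float) -> float:
    # Two circuit evaluations per parameter, exactly as described above.
    return (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2)) / 2

theta = 0.7
print(parameter_shift_grad(theta))   # -0.6442...
print(-np.sin(theta))                # analytic gradient, same value
```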

Finally, researchers are focusing more and more on the importance of loading data into a quantum state [29], also called a feature map [30]. Without the ideal amplitude encoding a QRAM would provide, there are doubts that we will be able to load and process high-dimensional classical data without giving up the exponential or high-polynomial speedup. Some hope remains for data-independent tasks such as generative models [21, 31] or solving partial differential equations.
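
One simple feature map can be sketched as follows (illustrative Python/NumPy of my own): "angle encoding" loads d classical features into d qubits without any QRAM, at the price of giving up the exponentially compact amplitude encoding discussed above.

```python
# Angle encoding: each classical feature sets the rotation of its own qubit.
import numpy as np

def angle_encode(features: np.ndarray) -> np.ndarray:
    """Map a d-dimensional vector to the 2^d amplitudes of a product state."""
    state = np.array([1.0])
    for x in features:
        qubit = np.array([np.cos(x / 2), np.sin(x / 2)])  # Ry(x) |0>
        state = np.kron(state, qubit)
    return state

x = np.array([0.3, 1.2, 2.0])           # toy 3-feature data point
psi = angle_encode(x)
print(psi.shape, np.linalg.norm(psi))   # (8,) 1.0 -- a valid 3-qubit state
```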

Note that the expression "quantum neural networks" has been used to highlight the similarities with classical neural network (NN) training. However, they are not equivalent: VQCs don't have the same hidden-layer architecture, nor natural nonlinearities unless a measurement is performed, and there is no simple rule to convert any NN into a VQC or vice versa. Some now prefer to compare VQCs to kernel methods [30].

We now have a better understanding of the advantages and weaknesses of the two main strategies towards quantum machine learning. Current research is now focused on two aspects:

Finally, and most importantly, improve the quantum devices! We all hope for constant incremental improvements, or a paradigm shift, in the quality of the qubits, their number, and the error-correction process, so that we reach powerful enough machines. Please, physicists, can you hurry?

PS: let's not forget to use all this amazing science to do good things that will benefit everyone.

Jonas Landman is a Ph.D. student at the University of Paris under the supervision of Prof. Iordanis Kerenidis. He is Technical Advisor at QC Ware and member of QuantX. He has previously studied at Ecole Polytechnique and UC Berkeley.

Read more:
How and when quantum computers will improve machine learning? - Medium

Quantum computing: Honeywell just quadrupled the power of its computer – ZDNet

The System Model H1, a ten-qubit quantum computer, has reached a quantum volume of 512.

Honeywell's quantum scientists have quadrupled the capabilities of the company's quantum computer, with the device achieving record levels of performance less than a year after the first generation of the system was released.

The System Model H1, a ten-qubit quantum computer, effectively reached a quantum volume of 512, four times as much as was attained in the previous iteration of the system, which saw the H1 reach a quantum volume of 128.

Released commercially last June (at the time, as the System Model H0), the H1 makes use of trapped ions, unlike IBM's and Google's devices, which are built with superconducting qubits. Honeywell's new record is eight times as much as was achieved with the System Model H0, which launched with a quantum volume of 64.

Quantum volume is a concept that IBM developed in 2017 as a way of measuring various aspects of a quantum computer's performance; in simple terms, the higher the quantum volume, the higher the potential for resolving real-world problems across industry and research. Designed to be independent of the architecture of any given quantum computer, quantum volume can measure any system that runs quantum circuits.

For example, one measurement that is indicative of a quantum computer's capabilities is qubit fidelity, which is critical to understanding how well a device can implement quantum code. According to Honeywell, the average single-qubit gate fidelity in the latest version of the H1 was 99.991%.
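
A back-of-the-envelope calculation (treating the reported 99.991% figure only as an input, and assuming errors simply compound multiplicatively) shows why such fidelities matter so much as circuits get deeper:

```python
# If each gate succeeds with probability f, a circuit of g gates
# succeeds with probability roughly f**g.
f = 0.99991
for g in (100, 1_000, 10_000):
    print(g, f ** g)   # ~0.991, ~0.914, ~0.407
```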

The final number that determines quantum volume is an aggregate of many other measurements and tests of a single quantum system's operations: they include the number of physical qubits in the quantum computer, but also the device's error rate, and connectivity, which reflects the extent to which qubits can be fully connected to each other within the device.

This is why it is possible for a quantum system to reach a high quantum volume, even with few qubits. Despite having only ten qubits, for instance, Honeywell's System Model H1 performs well when it comes to error rates and connectivity, which has earned the device a top spot for its overall capabilities. In comparison, last year IBM's 27-qubit client-deployed system achieved a quantum volume of 64.
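
The arithmetic connecting these numbers can be sketched as follows (illustrative Python; this is only the final scoring step, not the full heavy-output benchmark that IBM's protocol prescribes): the protocol runs random n-qubit, depth-n circuits, and for the largest size n that still passes the statistical test, reports a quantum volume of 2^n.

```python
# Quantum-volume bookkeeping, not the benchmark itself.
import math

def quantum_volume(largest_passing_size: int) -> int:
    return 2 ** largest_passing_size

print(quantum_volume(9))      # 512, the H1's reported result
print(int(math.log2(512)))    # i.e. the H1 passed at circuit size n = 9
# A 10-qubit machine can thus reach QV 512, while a 27-qubit machine
# with higher error rates or weaker connectivity may stall at QV 64 = 2**6.
```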

The new milestone, therefore, has prompted Honeywell's president of quantum solutions, Tony Uttley, to describe the System Model H1 as "the highest performing quantum computing system in the world."

Honeywell has made no secret of its strategy, which consists of focusing on qubit fidelity and connectedness, before attempting to scale up the number of qubits. "When you hear about fidelity and error, that's about the quality of the quantum operation," Uttley told ZDNet. "It's about knowing how often you get the right answer when you run these quantum algorithms."

"We have taken one approach that is very unique when it comes to how to get the most out of these near-term systems," he continued. "Nobody is talking about millions of qubits right now; we're talking about tens of qubits. To get the most out of these tens of qubits, you have to have super-high fidelity, fully-connected and highly-controlled systems. That's our approach."

Making these highly reliable systems available to Honeywell's customers now enables businesses to test and trial with small-scale applications while waiting for the company to design and build new generations of more capable quantum computers, according to Uttley.

Honeywell recently introduced the first subscription-based plan for usage of the H1, which grants paying customers monthly access to the machine.

With only ten qubits, there is little that the device can achieve beyond proofs of concept, designed to be implemented at full scale once a larger computer is available; but high-profile customers are nevertheless flocking to Honeywell's services.

J.P. Morgan Chase, for example, is investigating how the company's quantum computer might improve operations in banking; and BMW is piloting the use of Honeywell's hardware to optimize supply chains for car manufacturing.

Read this article:
Quantum computing: Honeywell just quadrupled the power of its computer - ZDNet

Five worthy reads: Understanding quantum computing and its impact on cybersecurity – Security Boulevard

Five worthy reads is a regular column on five noteworthy items we discovered while researching trending and timeless topics. In this week's edition, let's explore how quantum computing works and how it impacts cybersecurity.

Quantum physics describes the behavior of atoms and of fundamental particles like electrons and photons. A quantum computer operates by controlling the behavior of these particles. Bits are the smallest units of information in traditional computers; quantum computers instead use qubits, which can hold both of the two values simultaneously, providing superior computing power. To visualize the difference, think of flipping a coin versus spinning it. This unpredictability is called superposition, and it is measured through physical properties such as an electron's spin direction. Unlike bits, qubits are manipulated using quantum mechanics, for data transfer rather than for data storage.

Quantum entanglement is what makes it really exciting. Entangled qubits react to a change in a partner qubit's state instantaneously, no matter how far apart they are. This transmission of information from one location to another, without physically transmitting it, almost imitates teleportation: changing a property of one particle can impact the other across space and time, and that creates the channel for teleportation. Einstein once called this behavior "spukhafte Fernwirkung," which translates as "spooky action at a distance."

The fascinating thing about quantum tech is its uncertainty. This could be helpful for creating private keys to encrypt messages, making it impossible for hackers to copy the keys perfectly. However, a quantum computer also introduces other concerns: it could be used in codebreaking that potentially compromises IT security.

Here are five interesting reads on quantum computing and its impact on cybersecurity:

Quantum Computing May Be Closer Than You Think

Classical computers will not be replaced by quantum computers. Quantum computers are for solving problems that traditional computers cannot. They can help perform a large number of algorithms and calculations, and even run simulations. For example, vaccine development could be achieved in hours or days, where it might take several years with classical computers. Quantum technology is capable of opening up a whole new world of possibilities.

Quantum Computing and the evolving cybersecurity threat

Many underlying foundational technologies that rely on public-key encryption are potentially at risk with the advent of quantum technology. Quantum computers are a double-edged sword: they can break our current encryption algorithms, but they also open the door to more advanced systems. They can help various industries, including transportation (optimizing routes), finance (performing risk analysis), genetic engineering, chemical manufacturing, drug development, and weather forecasting.

Quantum computers could crack Bitcoin by 2022

Quantum computers excel at codebreaking, and their capabilities can potentially introduce IT security issues. Encrypting doesn't guarantee protection; it only makes the data harder to access. With a private key, one can easily create its corresponding public key, but not vice versa. It could take millions of years for classical computers to find a match, but a quantum computer could calculate the secret private key in minutes. This means that cryptocurrencies like Bitcoin, which depend on blockchain technology, are at greater risk of quantum attacks.
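
The asymmetry described here can be sketched with a toy one-way function (illustrative Python; real systems use far larger parameters and different constructions, and every constant below is made up):

```python
# Discrete exponentiation as a toy one-way function.
p, g = 2_147_483_647, 5          # public modulus and base (illustrative only)
private_key = 1_234_567

# Easy direction: private -> public is one fast modular exponentiation.
public_key = pow(g, private_key, p)

# Hard direction: public -> private means brute-forcing the discrete log,
# which scales hopelessly for classical machines at real key sizes. Shor's
# algorithm on a large quantum computer would solve this step quickly.
def brute_force_discrete_log(public: int) -> int:
    value = 1
    for exponent in range(1, p):
        value = (value * g) % p
        if value == public:
            return exponent
    raise ValueError("not found")

print(public_key)                          # instant
# brute_force_discrete_log(public_key)     # impractically slow even here
```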

A sufficiently powered quantum computer can make modern-day encryption look like a side quest in the hacker's main gameplay. Developing quantum-resistant cryptography to thwart quantum hacking is the need of the hour.

The quantum computing cybersecurity threat cannot be underestimated

Quantum computing opens up incredible advances in computing, such as the ability to factor large numbers into their prime components at incredible speeds. Unfortunately, that same prime factorization underlies the security systems we use to secure data in transit and in other information security arenas.

Building a quantum computer and achieving quantum supremacy is not child's play. It involves huge investments and carefully shielded, isolated environments operating at supercold temperatures. The quantum race is real, and many countries have been investing heavily in quantum computing. We should also be mindful of "harvest now, decrypt later" attacks, in which an adversary steals high-value encrypted data now and stores it to decrypt later, once they gain access to a powerful quantum computer.

Harvesting Attacks & the Quantum Revolution

The quantum revolution has already begun. Organizations should start thinking about best practices like crypto-agility, the capability that enables an organization to replace cryptographic algorithms without impacting any other process in the organization. They should consider quantum-resistant cryptography, as the existing encryption protocols will become obsolete in a few years. This may not seem like an immediate risk, but given the challenges and the potential need for mitigation surrounding new protocols, planning ahead is wise. It may take a few more years for the technology to be commercially available, but we should also remember that a few years back quantum computing itself seemed like a purely theoretical concept.

The post Five worthy reads: Understanding quantum computing and its impact on cybersecurity appeared first on ManageEngine Blog.

*** This is a Security Bloggers Network syndicated blog from ManageEngine Blog authored by Sree Ram. Read the original post at: https://blogs.manageengine.com/corporate/general/2021/03/12/five-worthy-reads-understanding-quantum-computing-and-its-impact-on-cybersecurity.html

Follow this link:
Five worthy reads: Understanding quantum computing and its impact on cybersecurity - Security Boulevard

After merger, College Park startup IonQ plans to go public with $2 billion valuation – The Diamondback

IonQ, a quantum computing startup born in College Park, announced Monday that it would likely soon become the first publicly traded company to specialize in commercialized quantum computing.

The company plans to file paperwork with the Securities and Exchange Commission in the next week, which will allow it to go public on the New York Stock Exchange through an acquisition deal that would set the valuation of the combined entity at nearly $2 billion.

"The ability to become a public company gives us access to a huge capital base, and that will allow us to spend more time building our systems, deploying them for useful applications," said Chris Monroe, IonQ's founder and a physics professor at the University of Maryland. "We can start to do our own research and development ... We can do more risky things."

Monroe and co-founder Jungsang Kim formed IonQ with the goal of taking quantum computing to market. They initially received $2 million in seed funding from New Enterprise Associates, along with a license to lab technology from the University of Maryland and Duke University. From there, they raised tens of millions of dollars in funding from companies like Samsung and Mubadala, and partnered with Amazon Web Services and Microsoft.

The company going public was made possible by a planned merger with a blank-check firm, dMY Technology Group Inc. III.

If it goes through, the merger will result in over $650 million in gross proceeds, including $350 million from private investors, according to a press release from IonQ. Combined with the $84 million the company has raised in venture capital funding, the deal would place IonQ's total capital raised at about $734 million.

The transition to quantum computing is unprecedented, Monroe said, and it will allow people to solve problems that a regular computer often can't.

Some problems, like optimizing a fleet of trucks or discovering medicines, have too many variables to solve with regular computing. But at the quantum level, more information can be handled, Monroe said, making it radically different from today's computing.

University President Darryll Pines, formerly the dean of the engineering school, explained that classical computing uses a stream of electrical pulses called bits, which represent 1s and 0s, to store information. On the quantum scale, however, information is stored in qubits, realized in subatomic particles, greatly increasing the speed of computing.

IonQ's approach to researching quantum computing has been rooted in university-led research. Quantum physics has strange rules that aren't always accepted in the engineering world, Monroe said, so many of these laws have become the domain of research at universities and national laboratories.

And this university especially, with its proximity to Washington, D.C., has one of the biggest communities of quantum scientists, Monroe said.

"We have students and postdocs and all kinds of researchers on Maryland's campus studying the field, and at IonQ, we've hired many of them," Monroe said. "And that's a huge advantage for us."

As a company with about 60 employees, some of whom attended this university, IonQ has become a pioneer in quantum computing. In October, Peter Chapman, IonQ's CEO and president, announced the company's newest 32-qubit computer, the most powerful quantum computer on the market.

And in November, Maryland Gov. Larry Hogan named IonQ one of the state's top 20 startup companies.

The biggest advantage for IonQ has been its technology, Monroe said. Companies like IBM, Google or Microsoft use silicon to build their computers, but IonQ uses individual atoms, which, unlike silicon, float over a chip in a vacuum chamber.

That technology has been perfected at this university, Monroe said, and IonQ has a concrete plan over the next five years to manufacture quantum computer modules and wire them together.

By 2030, 20 percent of global organizations whether in the public or private sector are expected to budget for quantum-computing projects, according to Gartner Inc., a global research and advisory firm. That number is up from less than 1 percent in 2018, according to Gartner.

Niccolo de Masi, CEO of dMY, said in IonQs press release that he expects the quantum computing industry to grow immensely in the next ten years, with a market opportunity of approximately $65 billion by 2030.

Pines expressed his excitement at seeing a university startup make strides in computing.

"We're happy for building the ecosystem from science, to translation, to startup, to possibly developing a product and adding value to society and growing jobs in the state of Maryland," Pines said.

The rest is here:
After merger, College Park startup IonQ plans to go public with $2 billion valuation - The Diamondback