Archive for the ‘Quantum Computing’ Category

Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access – Forbes

Zapata's quantum coders, ready for a hot & noisy ride.

We're on the road to quantum computing. But these massively powerful machines are still in somewhat embryonic prototype stages, and we still have several key challenges to overcome before we can start to build more of them.

As a quantum reminder: traditional computers compute on the basis of binary 1s and 0s, so all values and mathematical logic are essentially established from a base of those two values. Quantum superposition particles (known as qubits) can be 1 or 0, or anywhere in between, and the value expressed can differ depending on the angle from which the qubit is viewed. With that massively greater breadth, we can create a lot more algorithmic logic and computing power.
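
For readers who want that idea in something more concrete than prose, here is a minimal NumPy sketch (an illustration of the textbook picture, not Zapata's code): a qubit is a pair of complex amplitudes, and the squared magnitudes give the probabilities of reading 0 or 1.

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. |alpha|^2 is the probability of reading 0,
# |beta|^2 the probability of reading 1.
ket0 = np.array([1, 0], dtype=complex)   # behaves like a classical "0"
ket1 = np.array([0, 1], dtype=complex)   # behaves like a classical "1"
plus = (ket0 + ket1) / np.sqrt(2)        # equal superposition of both

def measurement_probabilities(state):
    """Return the probabilities of reading 0 and 1 from a qubit state."""
    return np.abs(state) ** 2

print(measurement_probabilities(plus))   # [0.5 0.5] -- neither 0 nor 1 until measured
```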

One of the main challenges associated with building quantum computing machines is the massive heat they generate. Scientists have been working with different semiconducting materials, such as so-called quantum dots, to help overcome the heat challenge. The issue is that qubits are special, qubits are powerful, but qubits are also fragile... and heat is one of their sworn enemies.

Another core challenge is noise.

As computations pass through the quantum gates that make up the quantum circuits in these new super quantum machines, they create a lot of noise and disturbance (think of an engine revving louder as it speeds up). This is why we have come to define and accept the term NISQ-based quantum applications, i.e. Noisy Intermediate-Scale Quantum (NISQ).

As beautifully clarified by theoretical physicist John Preskill in his 2018 paper: "Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing."
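
To put a rough number on why gate noise limits circuit size, here is a back-of-the-envelope sketch (my own illustration, not from Preskill's paper): if each gate succeeds independently with probability 1 - p, the chance that an entire circuit runs error-free decays exponentially with the gate count.

```python
import math

def circuit_success_probability(gate_error: float, num_gates: int) -> float:
    """Probability that every gate in a circuit executes without error,
    assuming independent errors of probability `gate_error` per gate."""
    return (1.0 - gate_error) ** num_gates

# With a 0.1% error per gate, roughly how deep can a circuit get before
# it is more likely than not to have failed somewhere?
p = 0.001
depth_at_half = math.log(0.5) / math.log(1 - p)
print(round(depth_at_half))                      # ~693 gates
print(circuit_success_probability(p, 1000))      # ~0.37 for a 1000-gate circuit
```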

The fact that we know about the heat and noise challenges hasn't stopped companies like Strangeworks, D-Wave Systems, ColdQuanta and others (including usual suspects Intel, IBM and Microsoft) forging on with development in the quantum space. Joining that list is Boston-headquartered Zapata Computing, Inc. The company describes itself as the quantum software company for near-term/NISQ-based quantum applications, empowering enterprise teams. Near-term in this case means, well, now, i.e. quantum software we can actually use on quantum devices of about 100-300 qubits.

Zapata's latest quantum leap in quantum (pun absolutely intended) is an early access program for Orquestra, its platform for quantum-enabled workflows. The company claims to have provided a software- and hardware-interoperable enterprise quantum toolset, i.e., again, quantum tools we can actually use in modern-day enterprise IT departments.

Using Zapata's unified Quantum Operating Environment, users can build, run and analyze quantum and quantum-inspired workflows. "This toolset will empower enterprises and institutions to make their quantum mark on the world, enabling them to develop quantum capabilities and foundational IP today while shoring up derivative IP for tomorrow," says CEO Christopher Savoie. "It is a new computing paradigm, built on a unified enterprise framework that spans quantum and classical programming and hardware tools. With Orquestra, we are accelerating quantum experiments at scale."

Zapata's Early Access Program for Orquestra is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory working on the most computationally complex problems.

Orquestra is agnostic across the entire software and hardware stack. It offers an extensible library of open source and Zapata-created components for writing, manipulating and optimizing quantum circuits and running them across quantum computers, quantum simulators and classical computing resources. It comes equipped with a versatile workflow system and Application Programming Interfaces (APIs) to connect all modes of quantum devices.
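
To make the "agnostic across the stack" idea concrete, here is a hypothetical sketch of what a hardware-agnostic workflow interface can look like in Python. The names (QuantumBackend, SimulatorBackend, run, workflow) are invented for illustration only and are not Orquestra's actual API.

```python
from abc import ABC, abstractmethod

class QuantumBackend(ABC):
    """Hypothetical hardware-agnostic backend interface (invented for this
    sketch, not Orquestra's real API): one workflow definition can target
    simulators or real devices interchangeably."""

    @abstractmethod
    def run(self, circuit: str, shots: int) -> dict:
        ...

class SimulatorBackend(QuantumBackend):
    def run(self, circuit: str, shots: int) -> dict:
        # A real implementation would simulate the circuit; here we fake it.
        return {"backend": "simulator", "circuit": circuit, "shots": shots}

def workflow(backend: QuantumBackend) -> dict:
    """One workflow step, many possible backends: the step does not care
    which vendor's hardware (or simulator) executes the circuit."""
    return backend.run(circuit="H 0; CNOT 0 1; MEASURE", shots=1024)

print(workflow(SimulatorBackend()))
```

The design point is the same one Zapata emphasizes: keep the workflow definition separate from the execution target, so swapping hardware means swapping a backend object, not rewriting the experiment.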

"We developed Orquestra to scale our own work for our customers and then realized the quantum community needs it, too. Orquestra is the only system for managing quantum workflows," said Zapata CTO Yudong Cao. "The way we design and deploy computing solutions is changing. Orquestra's interoperable nature enables extensible and modular implementations of algorithms and workflows across platforms and unlocks fast, fluid repeatability of experiments at scale."

So we're on a journey. The journey is the road from classical to quantum, and the best advice is to insist on an interoperable vehicle (as Zapata has provided here) and to take a modular and extensible approach. In car-analogy terms, that means breaking your journey into bite-size chunks and making sure you have enough gas for the long haul. The quantum software parallel is obvious enough not to need spelling out.

Even when quantum computing evolves to become more ubiquitously available, many people think it will still be delivered largely as a cloud-based Quantum-as-a-Service (QaaS) package, but understanding the noisy, overheated engine room in the meantime makes for a fascinating movie preview.

Excerpt from:
Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access - Forbes

Google’s Head of Quantum Computing Hardware Resigns – WIRED

In late October 2019, Google CEO Sundar Pichai likened the latest result from the company's quantum computing hardware lab in Santa Barbara, California, to the Wright brothers' first flight.

One of the lab's prototype processors had achieved quantum supremacy, evocative jargon for the moment a quantum computer harnesses quantum mechanics to do something seemingly impossible for a conventional computer. In a blog post, Pichai said the milestone affirmed his belief that quantum computers might one day tackle problems like climate change, and the CEO also name-checked John Martinis, who had established Google's quantum hardware group in 2014.

Here's what Pichai didn't mention: Soon after the team had first got its quantum supremacy experiment working a few months earlier, Martinis says, he had been reassigned from a leadership position to an advisory one. Martinis tells WIRED that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis resigned from Google early this month. "Since my professional goal is for someone to build a quantum computer, I think my resignation is the best course of action for everyone," he adds.

A Google spokesman did not dispute this account, and says that the company is grateful for Martinis's contributions and that Neven continues to head the company's quantum project. Parent company Alphabet has a second, smaller quantum computing group at its X Labs research unit. Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

Google's quantum computing project was founded in 2006 by Neven, who pioneered Google's image search technology, and initially focused on software. To start, the small group accessed quantum hardware from Canadian startup D-Wave Systems, including in collaboration with NASA.


The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Qubits are analogous to the bits of a conventional computer, but in addition to representing 1s and 0s, they can use quantum mechanical effects to attain a third state, dubbed a superposition, something like a combination of both. Qubits in superposition can work through some very complex problems, such as modeling the interactions of atoms and molecules, much more efficiently than conventional computer hardware.

How useful that is depends on the number and reliability of the qubits in your quantum computing processor. So far the best demonstrations have used only tens of qubits, a far cry from the hundreds or thousands of high-quality qubits experts believe will be needed to do useful work in chemistry or other fields. Google's supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer on the order of 10,000 years, but the task has no practical application.
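
A rough way to see why even 53 qubits strain classical machines: simulating the device naively means storing 2^53 complex amplitudes. A quick back-of-the-envelope estimate (my arithmetic, not Google's figure):

```python
# Memory needed to store the full state vector of an n-qubit system,
# assuming one 16-byte complex number (complex128) per amplitude.
def state_vector_bytes(num_qubits: int) -> int:
    return (2 ** num_qubits) * 16

for n in (30, 40, 53):
    petabytes = state_vector_bytes(n) / 1e15
    print(f"{n} qubits -> {petabytes:,.3f} PB")
# 30 qubits ->   0.000 PB  (about 17 GB, workstation territory)
# 40 qubits ->   0.018 PB
# 53 qubits -> 144.115 PB  -- far beyond any single machine's memory
```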

Martinis leaves Google as the company and its rivals working on quantum computing face crucial questions about the technology's path. Amazon, IBM, and Microsoft, as well as Google, offer their prototype technology to companies such as Daimler and JP Morgan so they can run experiments. But those processors are not large enough to work on practical problems, and it is not clear how quickly they can be scaled up.

When WIRED visited Google's quantum hardware lab in Santa Barbara last fall, Martinis responded optimistically when asked if his hardware team could see a path to making the technology practical. "I feel we know how to scale up to hundreds and maybe thousands of qubits," he said at the time. Google will now have to do it without him.


See more here:
Google's Head of Quantum Computing Hardware Resigns - WIRED

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, some big tech companies like IBM, Microsoft, Intel and Google have been working in relative silence on something that sounds great: quantum computing. The main problem is that it is difficult to know exactly what it is and what it can be useful for.

Some questions can be answered easily. For example, quantum computing is not going to give you more FPS on your graphics card any time soon. Nor will it be as easy as swapping your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but is divided into small packages, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. Over the years, other physicists developed it and came to surprising conclusions about matter, two of which will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and on which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
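
As a concrete baseline for the quantum comparison that follows, here is a short Python sketch of those classical limitations: n bits span 2^n possible states but hold only one of them at a time, and changing x bits costs at least x explicit operations.

```python
from itertools import product

n = 3
states = list(product([0, 1], repeat=n))
print(len(states))        # 2**3 = 8 possible states, but a register holds only one
print(states[:4])         # [(0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 1)]

# Flipping x bits requires at least x explicit operations (here, x = 2):
register = [0, 0, 0]
for index in (0, 2):
    register[index] ^= 1  # one NOT-like operation per bit changed
print(register)           # [1, 0, 1]
```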

Well, superposition and entanglement allow us to reduce these limitations: with superposition, n quantum bits (qubits) can hold much more information than just one of 2^n states, and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though n qubits can hold far more information than that, in practice we can only read out 2^n different values. As we saw in a Genbeta article on the foundations of quantum computing: a qubit is not simply 1 or 0 like a normal bit; it can be, say, 80% 1 and 20% 0. The problem is that when we read it we only obtain either a 1 or a 0, and the probabilities each value had are lost, because measuring the qubit modifies it.
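
A minimal sketch of that readout problem, with the quantum mechanics replaced by a weighted coin flip for illustration: a single measurement yields only a 0 or a 1, and the underlying 80/20 split can only be estimated by preparing and measuring the state many times.

```python
import random

def measure(prob_one: float) -> int:
    """Simulate reading a qubit that yields 1 with probability prob_one.
    A single readout gives only 0 or 1; the amplitudes themselves are lost."""
    return 1 if random.random() < prob_one else 0

print(measure(0.8))                                   # just a 0 or a 1

# Estimating the underlying 80/20 split requires many fresh preparations:
shots = [measure(0.8) for _ in range(10_000)]
print(sum(shots) / len(shots))                        # ~0.8
```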

This discrepancy between the information kept by the qubits and what we can actually read led Benioff and Feynman to argue that a classical computer could not simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that would be able to do that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allowed two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first finds minimum values of functions, which, put like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, the minimum value will tell us how to configure the neural network to be as efficient as possible.
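
For intuition about what "finding minimum values of functions" means in practice, here is a classical simulated-annealing sketch, the conventional cousin of quantum annealing rather than the quantum algorithm itself, minimizing a toy cost function that stands in for something like a neural network's error rate.

```python
import math
import random

def cost(x: float) -> float:
    # Toy stand-in for something like a neural network's error rate.
    return x ** 4 - 3 * x ** 2 + x

def simulated_annealing(steps: int = 20_000) -> float:
    x = random.uniform(-3, 3)
    temperature = 2.0
    for _ in range(steps):
        candidate = x + random.gauss(0, 0.1)
        delta = cost(candidate) - cost(x)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the "temperature" cools, to escape local minima.
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        temperature *= 0.9995
    return x

best = simulated_annealing()
print(best, cost(best))   # typically close to the global minimum near x ≈ -1.3
```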

The second, Shor's algorithm, decomposes a number into its prime factors much more efficiently than we can on a normal computer. Again, put like that, it doesn't sound interesting. But if I tell you that RSA, one of the most widely used algorithms to protect and encrypt data on the Internet, relies on the fact that factoring numbers is exponentially slow (adding a bit to the key roughly doubles the time a brute-force attack takes), then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.
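
To see why factoring matters for encryption, a toy sketch of the classical side of the problem: trial division cracks a tiny "RSA modulus" instantly, but the same approach is hopeless at real key sizes, and that gap is exactly what Shor's algorithm would close.

```python
def factor_semiprime(n: int) -> tuple[int, int]:
    """Naive trial division: fine for toy numbers, hopeless for the
    hundreds-of-digits semiprimes that real RSA keys use."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("no nontrivial factor found")

# A toy 'RSA modulus': the public key exposes p*q, and security rests on
# how hard it is to recover p and q from their product.
p, q = 2003, 2011
print(factor_semiprime(p * q))   # (2003, 2011) -- instant at this size
# Real RSA keys are hundreds of digits long; the best known classical
# factoring methods still take super-polynomial time at that size, while
# Shor's algorithm on a large, reliable quantum computer would not.
```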

So far, quantum computing is a field that hasn't been applied much in the real world. To give us an idea, with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers smaller than 1,048,576 (2^20), which as you can imagine is not very impressive.

Still, the field is evolving promisingly. In 1998 the first quantum computer was built: it had only two qubits, needed a nuclear magnetic resonance machine, and solved only a toy problem (the so-called Deutsch-Jozsa problem). In 2001, Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing with 16 qubits. This year, the same company announced a 2,000-qubit quantum annealing computer. The new IBM computers, on the other hand, although they have fewer qubits, are able to run generic algorithms and not only quantum annealing. In short, the push is strong, and quantum computing will be increasingly applicable to real problems.

What might those applications be? As we mentioned before, the quantum annealing algorithm is very well suited to machine learning problems, which makes the computers that implement it extremely useful, even though that single algorithm is all they can run. If systems that, for example, transcribe conversations or identify objects in images can be reformulated so that their training runs on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding optimal treatment methods for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which for now have fewer qubits, could run more algorithms. For example, they could be used to break much of the cryptography in use right now, as we discussed earlier (which explains why the NSA wanted a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they could be very useful as efficient simulators of quantum systems.
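
A small NumPy sketch of the idea behind Grover's search (a textbook toy simulation, not production code): amplitude amplification concentrates probability on the marked item in roughly sqrt(N) iterations, where a classical scan needs on the order of N lookups.

```python
import numpy as np

N = 64                 # search space of 64 items (6 qubits' worth)
marked = 42            # the index we are searching for

amplitudes = np.full(N, 1 / np.sqrt(N))              # uniform superposition
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) Grover steps

for _ in range(iterations):
    amplitudes[marked] *= -1                          # oracle: flip the marked sign
    amplitudes = 2 * amplitudes.mean() - amplitudes   # inversion about the mean

print(iterations)                                     # 6 steps instead of ~64 lookups
print(np.round(amplitudes[marked] ** 2, 3))           # probability of marked item, ~0.999
```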

Unfortunately, algorithms and code written for classical computers can't simply be run on quantum computers to magically get an improvement in speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, at first, greatly restricts the applications of quantum computers and will be a problem to overcome as those systems mature.

However, the main problem facing quantum computing is building the computers themselves. Compared with a normal computer, a quantum computer is an extremely complex machine: it operates at a temperature close to absolute zero (-273 °C), the qubits are built on superconducting materials, and the components needed to read and manipulate the qubits are not simple either.

What does a non-quantum "quantum computer" look like? As we explained before, the two relevant concepts of a quantum computer are superposition and entanglement, and without them there cannot be the speed improvements that quantum algorithms promise. If disturbances in the computer push superposed qubits into classical states too quickly, or break the entanglement between qubits, what we have is not a quantum computer but only an extremely expensive machine that runs a handful of algorithms at a speed equivalent to a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the more difficult to maintain and to prove exists. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to claim the qubits are entangled: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in its early days.

A priori, with the materials quantum computers are currently built from, it does not seem that miniaturization is very feasible. But there is already research into new materials that could be used to create more accessible quantum computers. Who knows whether, fifty years from now, we will be able to buy quantum CPUs to improve the speed of our computers.


Link:
Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine

Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now legendary quantum computing experiment to work, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis's contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Google's quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Read more here:
Google's top quantum computing brain may or may not have quit - Fudzilla

RMACC’s 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

BOULDER, Colo., April 22, 2020 - The Rocky Mountain Advanced Computing Consortium (RMACC) will hold its 10th annual High Performance Computing Symposium as a multi-track on-line event on May 20-21. Registration for the event will be free to all who would like to attend.

The on-line Symposium will include presentations by two keynote speakers and a full slate of tutorial sessions. Another longtime Symposium tradition, a poster competition for students to showcase their own research, also will be continued. Competition winners will receive an all-expenses-paid trip to SC20 in Atlanta.

Major sponsor support is being provided by Intel, Dell and HPE with additional support from ARM, IBM, Lenovo and Silicon Mechanics.

Links to the Symposium registration, its schedule, and how to enter the poster competition can be found at www.rmacc.org/hpcsymposium.

The keynote speakers are Dr. Nick Bronn, a Research Staff Member in IBM's Experimental Quantum Computing group, and Dr. Jason Dexter, a working group coordinator for the groundbreaking black hole imaging studies published by the Event Horizon Telescope.

Dr. Bronn serves at IBM's TJ Watson Research Center in Yorktown Heights, NY. He has been responsible for qubit (quantum bit) device design, packaging, and cryogenic measurement, working toward scaling up to larger numbers of qubits on a device and integrating them with novel implementations of microwave and cryogenic hardware. He will speak on the topic "Benchmarking and Enabling Noisy Near-term Quantum Hardware."

Dr. Dexter is a member of the astrophysical and planetary sciences faculty at the University of Colorado Boulder. He will speak on the role of high performance computing in understanding what we see in the first image of a black hole. Dr. Dexter is a member of both the Event Horizon Telescope and VLTI/GRAVITY collaborations, which can now image black holes.

Their appearances, along with the many tutorial sessions, continue the RMACC's annual tradition of showcasing cutting-edge HPC achievements in both education and industry.

The largest consortium of its kind, the RMACC is a collaboration among 30 academic and government research institutions in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The consortium's mission is to facilitate widespread effective use of high performance computing throughout the 9-state intermountain region.

More about the RMACC and its mission can be found at the website: www.rmacc.org.

About RMACC

Primarily a volunteer organization, the RMACC is a collaboration among 30 academic and research institutions located in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The RMACC's mission is to facilitate widespread effective use of high performance computing throughout this 9-state intermountain region.

Source: RMACC

See more here:
RMACC's 10th High Performance Computing Symposium to Be Held Free Online - HPCwire