Archive for the ‘Quantum Computer’ Category

Quantum Computing for the Future Grid – Transmission & Distribution World

The electric power grid is undergoing unprecedented change. This change is due to decarbonization efforts, increased reliance on renewable and variable generation resources, the integration of distributed energy resources, and transportation electrification. In turn, these changes have required electric utilities to expand their monitoring and measurement efforts through metering infrastructure and distribution automation initiatives. All these efforts have resulted in the collection of mountains of data from the electric grid. While this significant increase in data collection enables better monitoring of the grid and enhanced decision making, we still need a robust computational foundation that can convert all this collected big data into actionable information.

As mathematical challenges increase and data becomes core to modern utility decision-making, our industry needs to make progress and draw from emerging analytics and computing technologies. Quantum computing is a ground-breaking information processing technology that can support efforts to address power system challenges and enable the grid of the future. Given the promising applications to the power grid, this is an area of research that has really caught my attention lately. While quantum computing applications to the power grid have remained mostly unexamined, forward-looking utilities are exploring the next step to enhance these analytics by understanding how emerging quantum computing technologies can be leveraged to provide higher service levels.

Building the future grid will require an overall view of the quantum computing technology applications in power systems, such as the dynamic interaction of the transmission and distribution systems. According to a recent IEEE article by Rozhin Eskandarpour and a team of researchers from the University of Denver Electrical and Computing Engineering Department, current computational technologies might not be able to adequately address the needs of the future grid.

The most notable change is in the role of the distribution grid and customers in system design and management. Transmission and distribution systems have traditionally been operated as distinct systems but are becoming more of an integrated system. The underlying assumption was that the transmission system would supply a prescribed voltage at the substation, and the distribution system would deliver that energy to individual customers. However, as various types of distributed energy resources, including generation, storage, electric vehicles, and demand response, are integrated into the distribution network, there may be significant interactions between the transmission and distribution systems. The transient and small-signal stability problems introduced by distributed generation are one instance that changes the dynamic nature of the energy system. Therefore, developing more comprehensive models that capture the dynamic relationships between transmission and distribution systems, along with computational tools that can solve such models, will be essential in the future. Furthermore, better scheduling models are needed to plan the viable deployment and use of distributed energy resources.

Eskandarpour et al. describe other potential quantum computing applications for the power grid, including optimization, planning, and logistics; forecasting; weather prediction; wind turbine design; cybersecurity; grid security; and grid stability.

Given that I am both professionally embedded in covering the newest innovations within the power sector and nearing the end of a Ph.D. program at the University of Denver, it is not particularly surprising that a new university-industry research consortium has caught my attention. I am excited to share this ground-breaking initiative and its potential role in building the future grid.

The University of Denver, in collaboration with various utilities, has established a consortium devoted to envisioning the quantum-upgraded electric system of tomorrow; QUEST is the clever acronym that has been adopted for this university-industry consortium. The consortium aims to enhance university-industry collaboration to solve emerging challenges in building the future grid by utilizing quantum information and quantum computation. It will develop new quantum models, methodologies, and algorithms to solve a range of grid problems faster and more accurately. Topics of interest include:

Industry members financially support the QUEST consortium, and membership is voluntary and open to any public or private organization active in the power and energy industry. For more information, contact Dr. Amin Khodaei at the University of Denver, School of Engineering and Computer Science.

Link:
Quantum Computing for the Future Grid - Transmission & Distribution World

Researchers develop innovative platform capable of verifying quantum encryption technologies – Aju Business Daily

[Courtesy of the Electronics and Telecommunications Research Institute]

The Electronics and Telecommunications Research Institute (ETRI) said that the new platform, called "Q Crypton" and unveiled at an online conference on July 21, can verify the quantum safety of various cryptographic systems, such as RSA, a public-key cryptosystem widely used for securing data transmission, as well as next-generation quantum-resistant ciphers.

Q Crypton lays the foundation for verifying cryptographic algorithms and the performance of programs that will be used on quantum computers, ETRI said, adding that the platform would be released step by step through a web browser to help guard against hacking by quantum computers.

"The fear of incapacitating modern public key cryptography with quantum computers is coming to reality. Based on the world's best technology in cryptographic quantum safety, we will work hard to establish next-generation security infrastructure early," ETRI's cybersecurity research division head Kim Ik-kyun said in a statement on July 21.

Quantum cryptography is an essential security solution for safeguarding critical information. Data encoded in a quantum state is virtually unhackable without quantum keys, which are essentially random number tables used to decipher encrypted information. Conventional digital computers are based on transistors and capacitors, with data encoded into binary digits (bits), while quantum computation uses quantum bits, or qubits.
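To make the "random number table" idea concrete, here is a minimal one-time-pad sketch in Python. It is purely illustrative and not ETRI's implementation; the message contents and variable names are invented for the example. A shared random key is combined with the data to encrypt it, and the same key is needed to recover the plaintext.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Combine each data byte with the corresponding key byte (one-time pad).
    return bytes(d ^ k for d, k in zip(data, key))

message = b"example sensitive data"
key = secrets.token_bytes(len(message))   # the shared random "number table"

ciphertext = xor_bytes(message, key)      # encrypt
recovered = xor_bytes(ciphertext, key)    # decrypt with the same key

assert recovered == message
print(ciphertext.hex())
```

Without the key, the ciphertext reveals nothing about the message; the role of quantum key distribution is to deliver that key in a way that makes eavesdropping detectable.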

The Q Crypton platform can analyze and simulate the quantitative safety of ciphers more accurately because it can account for factors such as different qubit counts, quantum computer chip structures, and error rates. Because the platform is equipped with visualization programming technology and a library of key computations for encryption, quantum algorithms needed for encryption can be developed quickly and efficiently.

ETRI said the platform schematizes quantum circuits so that numerous complex formulas can be grasped intuitively at a glance rather than entered one by one. The platform also provides language processing of quantum algorithms, verification using virtual machines, and a function for analyzing the amount of quantum resources required.

Post-quantum cryptography (PQC) refers to cryptographic algorithms that are thought to be secure against an attack by a quantum computer. Even though current, publicly known, experimental quantum computers lack the processing power to break any real cryptographic algorithm, many cryptographers are designing new algorithms to prepare for a time when quantum computing becomes a threat.

Read more:
Researchers develop innovative platform capable of verifying quantum encryption technologies - Aju Business Daily

Red Hat embraces quantum supremacy as it looks to the future – SiliconANGLE News

Since its founding in 1993, Red Hat Inc. has seen significant growth and witnessed firsthand the transformation from an analog to a digital economy.

With years of experience under its belt, Red Hat is looking to the horizon to prepare for emerging technology through its partnership with IBM Corp., which gives it a front-row seat to technological progress. The software company employs a variety of experts across different departments to manage the massive overhead of running a large tech business.

"We typically organize our teams around horizontal technology sectors," said Stephen Watt (pictured, right), distinguished engineer and head of emerging technologies at Red Hat. "I have an edge team, cloud networking team, a cloud storage team, application platforms team. We've got different areas that we attack work and opportunities, but the good ideas can come from a variety of different places, so we try and leverage co-creation with our customers and our partners."

Watt, along with Parul Singh (pictured, left), senior software engineer at Red Hat, and Luke Hinds (pictured, middle), senior software engineer at Red Hat, spoke with John Furrier, host of theCUBE, SiliconANGLE Media's livestreaming studio, during the recent Red Hat Summit. They discussed quantum supremacy, how Red Hat manages its consumers' needs, signature server and more. (* Disclosure below.)

One of the many new technologies emerging is quantum computing, which uses qubits instead of bits and, for certain problems, can explore an exponentially larger space of possibilities than its classical counterpart.

"Quantum computers are evolving, and they have been around, but right now you see that they are going to be the next thing," Singh said. "We define quantum supremacy as, say you have any program that you run or any problem that you solve on a classical computer, a quantum computer would be giving you the results faster."

Because quantum computers are not as easily accessible as classical computers, Red Hat has sought out a solution that combines OpenShift's classical components with quantum computing, taking the results and integrating them into classical workloads.

Signature server, or sigstore, is an umbrella organization containing various open-source projects.

"Sigstore will enable developers to sign software artifacts, bills of materials, containers, binaries, all of these different artifacts that are part of a software supply chain," Hinds said. "It's very similar to a blockchain. It allows you to have cryptographic-proof auditing of our software supply chain, and we've made sigstore so that it's easy to adopt, because traditional cryptographic signing tools are a challenge for a lot of developers to implement in their open-source projects."
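As a rough illustration of what "signing an artifact" means under the hood, here is a generic sketch using Python's cryptography package; it is not sigstore's own API or workflow, and the artifact bytes are invented for the example. A developer signs the bytes of an artifact with a private key, and anyone holding the matching public key can verify that the artifact has not been tampered with.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical artifact: in practice this would be a container digest,
# a binary, or a software bill of materials.
artifact = b"example-container-image digest sha256:abc123..."

private_key = Ed25519PrivateKey.generate()   # signer's key
public_key = private_key.public_key()        # published for verifiers

signature = private_key.sign(artifact)       # produced at build or release time

try:
    public_key.verify(signature, artifact)   # consumers check before deploying
    print("artifact signature verified")
except InvalidSignature:
    print("artifact was tampered with")
```

Sigstore's contribution is making this kind of signing easy to adopt and recording signatures in a public, tamper-evident transparency log, which is the "very similar to a blockchain" auditing Hinds refers to.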

Open-source boasts the advantage of being transparent, allowing everyone to see the code with no hidden surprises or security issues lurking underneath. Another advantage of open-source software is agency, according to Watt.

"If you're waiting on a vendor to go do something, if it's proprietary software, you don't have much agency to get that vendor to go do that thing. Whereas the open source, if you're tired of waiting around, you can just submit the patch," he said. "So people can then go and take sigstore, run it as a smaller internal service. Maybe they discover a bug. They can fix that bug, contribute it back to the operationalizing piece, as well as the traditional package software, to make it a much more robust and open service. So you bring that transparency and the agency back to the software-as-a-service model as well."

Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's coverage of Red Hat Summit. (* Disclosure: TheCUBE is a paid media partner for Red Hat Summit. Neither Red Hat Inc., the sponsor for theCUBE's event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)


Link:
Red Hat embraces quantum supremacy as it looks to the future - SiliconANGLE News

Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

Digital computing has limitations with regard to an important category of calculation called combinatorics, in which the order of data is important to finding the optimal solution. These complex, iterative calculations can take even the fastest computers a long time to process. Computers and software that are predicated on the assumptions of quantum mechanics have the potential to perform combinatorics and other calculations much faster, and as a result many firms are already exploring the technology, whose known and probable applications already include cybersecurity, bio-engineering, AI, finance, and complex manufacturing.

Quantum technology is approaching the mainstream. Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as soon as five years. Honeywell anticipates that quantum will form a $1 trillion industry in the decades ahead. But why are firms like Goldman taking this leap, especially with commercial quantum computers possibly years away?

To understand what's going on, it's useful to take a step back and examine what exactly it is that computers do.

Let's start with today's digital technology. At its core, the digital computer is an arithmetic machine. It made performing mathematical calculations cheap, and its impact on society has been immense. Advances in both hardware and software have made possible the application of all sorts of computing to products and services. Today's cars, dishwashers, and boilers all have some kind of computer embedded in them, and that's before we even get to smartphones and the internet. Without computers we would never have reached the moon or put satellites in orbit.

These computers use binary signals (the famous 1s and 0s of code), which are measured in bits and bytes. The more complicated the code, the more processing power required and the longer the processing takes. What this means is that, for all their advances, from self-driving cars to beating grandmasters at chess and Go, there remain tasks that traditional computing devices struggle with, even when the task is dispersed across millions of machines.

A particular problem they struggle with is a category of calculation called combinatorics. These calculations involve finding an arrangement of items that optimizes some goal. As the number of items grows, the number of possible arrangements grows exponentially. To find the best arrangement, today's digital computers basically have to iterate through each permutation to find an outcome and then identify which does best at achieving the goal. In many cases this can require an enormous number of calculations (think about breaking passwords, for example). The challenge of combinatorics calculations, as we'll see in a minute, applies in many important fields, from finance to pharmaceuticals. It is also a critical bottleneck in the evolution of AI.
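A tiny brute-force sketch in Python shows why this gets out of hand. The cost function below is an invented stand-in for whatever a real application optimizes (route length, portfolio risk, and so on); the point is simply that the search space grows factorially with the number of items.

```python
from itertools import permutations
import math

items = [3, 1, 4, 1, 5, 9, 2, 6]

def cost(order):
    # Placeholder objective: sum of the jumps between adjacent items.
    return sum(abs(a - b) for a, b in zip(order, order[1:]))

# Exhaustive search: examine every permutation, keep the best one.
best = min(permutations(items), key=cost)
print("best ordering:", best, "with cost", cost(best))

# Why classical machines struggle: the number of orderings explodes.
for n in (8, 12, 16, 20):
    print(f"{n} items -> {math.factorial(n):,} possible orderings")
```

Eight items already mean 40,320 orderings to check; twenty items mean roughly 2.4 quintillion, which is why heuristics, or fundamentally different hardware, become attractive.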

And this is where quantum computers come in. Just as classical computers reduced the cost of arithmetic, quantum presents a similar cost reduction for calculating daunting combinatoric problems.

Quantum computers (and quantum software) are based on a completely different model of how the world works. In classical physics, an object exists in a well-defined state. In the world of quantum mechanics, objects only occur in a well-defined state after we observe them. Prior to our observation, two objects' states and how they are related are matters of probability. From a computing perspective, this means that data is recorded and stored in a different way, through non-binary qubits of information rather than binary bits, reflecting the multiplicity of states in the quantum world. This multiplicity can enable faster and lower-cost calculation for combinatoric arithmetic.
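A short numerical sketch (plain NumPy, not any particular quantum SDK) may help make "matters of probability" concrete: a qubit's state is a pair of complex amplitudes, and only measurement collapses it to a definite 0 or 1.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit state is two complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1; measuring yields 0 with probability |a|^2.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ zero

probs = np.abs(state) ** 2              # Born rule: outcome probabilities
print("amplitudes:", state)             # ~[0.707, 0.707]
print("P(0), P(1):", probs)             # ~[0.5, 0.5]

# Simulating many measurements gives roughly half 0s and half 1s.
samples = np.random.choice([0, 1], size=1000, p=probs)
print("observed frequencies:", np.bincount(samples) / 1000)
```

With n qubits the state vector has 2^n amplitudes, which is the multiplicity of states the paragraph above refers to.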

If that sounds mind-bending, it's because it is. Even particle physicists struggle to get their minds around quantum mechanics and the many extraordinary properties of the subatomic world it describes, and this is not the place to attempt a full explanation. But what we can say is that quantum mechanics does a better job of explaining many aspects of the natural world than classical physics does, and it accommodates nearly all of the theories that classical physics has produced.

Quantum translates, in the world of commercial computing, to machines and software that can, in principle, do many of the things that classical digital computers can, and in addition do one big thing classical computers can't: perform combinatorics calculations quickly. As we describe in our paper, Commercial Applications of Quantum Computing, that's going to be a big deal in some important domains. In some cases, the importance of combinatorics is already known to be central to the domain.

As more people turn their attention to the potential of quantum computing, applications beyond quantum simulation and encryption are emerging:

The opportunity for quantum computing to solve large-scale combinatorics problems faster and cheaper has encouraged billions of dollars of investment in recent years. The biggest opportunity may be in finding more new applications that benefit from the solutions offered through quantum. As professor and entrepreneur Alán Aspuru-Guzik said, "there is a role for imagination, intuition, and adventure. Maybe it's not about how many qubits we have; maybe it's about how many hackers we have."

See original here:
Quantum Computing Is Coming. What Can It Do? - Harvard Business Review

Startup hopes the world is ready to buy quantum processors – Ars Technica

Early in its history, computing was dominated by time-sharing systems. These systems were powerful machines (for their time, at least) that multiple users connected to in order to perform computing tasks. To an extent, quantum computing has repeated this history, with companies like Honeywell, IBM, and Rigetti making their machines available to users via cloud services. Companies pay based on the amount of time they spend executing algorithms on the hardware.

For the most part, time-sharing works out well, saving companies the expenses involved in maintaining the machine and its associated hardware, which often includes a system that chills the processor down to nearly absolute zero. But there are several customers (companies developing support hardware, academic researchers, etc.) for whom access to the actual hardware could be essential.

The fact that companies aren't shipping out processors suggests that the market isn't big enough to make production worthwhile. But a startup from the Netherlands is betting that the size of the market is about to change. On Monday, a company called QuantWare announced that it will start selling quantum processors based on transmons, the superconducting qubit circuits that form the basis of similar machines used by Google, IBM, and Rigetti.

Transmon-based qubits are popular because they're compatible with the standard fabrication techniques used for more traditional processors; they can also be controlled using microwave-frequency signals. Their big downside is that they operate only at temperatures that require liquid helium and specialized refrigeration hardware. These requirements complicate the hardware needed to exchange signals between the very cold processor and the room-temperature hardware that controls it.

Startup companies like D-Wave and Rigetti have set up their own fabrication facilities, but Matthijs Rijlaarsdam, one of QuantWare's founders, told Ars that his company is taking advantage of an association with TU Delft, the host of the Kavli Nanolab. This partnership lets QuantWare do the fabrication without investing in its own facility. Rijlaarsdam said the situation shouldn't be a limiting factor, since he expects that the total market likely won't exceed tens of thousands of processors over the entirety of the next decade. Production volumes don't have to scale dramatically.

The initial processor the company will be shipping contains only five transmon qubits. Although this is well below anything on offer via one of the cloud services, Rijlaarsdam told Ars that the fidelities of each qubit will be 99.9 percent, which should keep the error rate manageable. He argued that, for now, a low qubit count should be sufficient based on the types of customers QuantWare expects to attract.
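A back-of-envelope calculation shows why that fidelity figure matters. The circuit depths below are arbitrary illustrative numbers, not anything QuantWare has published: if each operation independently succeeds with probability 0.999, the chance that an entire circuit runs without a single error decays geometrically with the number of operations.

```python
# Illustrative only: assumes each gate independently succeeds with
# probability equal to the quoted 99.9 percent fidelity.
fidelity = 0.999

for n_gates in (10, 100, 1_000, 10_000):
    p_clean_run = fidelity ** n_gates
    print(f"{n_gates:>6} gates -> P(no error) ~ {p_clean_run:.4f}")
```

Under that simple model, a 100-gate circuit still finishes error-free roughly 90 percent of the time, while a 1,000-gate circuit drops below 37 percent, which is why small research circuits are a reasonable fit for a five-qubit device and why error correction dominates discussions of larger machines.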

These customers include universities interested in studying new ways of using the processor and companies that might be interested in developing support hardware needed to turn a chip full of transmons into a functional system. Intel, for example, has been developing transmon hardware control chips that can tolerate the low temperatures required (although the semiconductor giant can also easily make its own transmons as needed).

That last aspectdeveloping a chip around which others could build a platformfeatures heavily in the press release that QuantWare shared with Ars. The announcement makes frequent mention of the Intel 4004, an early general-purpose microprocessor that found a home in a variety of computers.

Rijlaarsdam told Ars that he expects the company to increase its qubit count by two- to four-fold each year for the next few years. That's good progress, but it will still leave the company well behind the roadmap of competitors like IBM for the foreseeable future.

Rijlaarsdam also suggested that quantum computing will reach what he called "an inflection point" before 2025. Once this point is reached, quantum computers will regularly provide answers to problems that can't be practically calculated using classical hardware. Once that point is reached, "the market will be a multibillion-dollar market," Rijlaarsdam told Ars. "It will also grow rapidly, as the availability of large quantum computers will accelerate application development."

But if that point is reached before 2025, it will arrive at a time when QuantWare's qubit count is suited for the current market, which he accurately described as "an R&D market." QuantWare's solution to the awkward timing will be to develop quantum processors specialized for specific algorithms, which can presumably be done using fewer qubits. But those won't be available at the company's launch.

Obviously, it's debatable whether there's a large market of companies anxiously awaiting the opportunity to install liquid helium dilution refrigerators in their office/lab/garage. But the reality is that there is almost certainly some market for an off-the-shelf quantum processor, at least partly composed of other quantum computing startups.

That's not quite equivalent to the situation that greeted the Intel 4004. But it may be significant in that we seem to be getting close to the point where some of Ars' quantum-computing coverage will need to move out of the science section and over to IT, marking a clear shift in how the field is developing.


See the article here:
Startup hopes the world is ready to buy quantum processors - Ars Technica