Archive for the ‘Quantum Computer’ Category

Supply Chain: The Quantum Computing Conundrum | Logistics – Supply Chain Digital – The Procurement & Supply Chain Platform

From artificial intelligence to IoT, each technology trend is driven by finding solutions to a problem, some more successfully than others. Right now, the world's technology community is focused on harnessing the exponential opportunities promised by quantum computing. While it may be some time before we see the true benefits of this emerging technology, and while nothing is certain, the possibilities are great.

What is Quantum Computing?

Reportedly capable of solving certain problems up to 100 million times faster than traditional computers, quantum computing has the potential to speed up processes on a monumental scale.

Quantum computers cost millions of dollars to produce, so it perhaps goes without saying that they are not yet ready for mass production and rollout. However, their potential to transform real-world supply chain problems should not (and cannot) be ignored. Quantum bits (qubits) can occupy more than one state at the same time, unlike their binary counterparts, allowing them to capture nuance and complexity. Qubits can also be linked to other qubits, a process known as entanglement, which is a key hallmark separating quantum from classical computing. These interdependent particles are analogous to the variables of a complex supply chain.

"It is possible to adjust an interaction between these qubits so that they can sense each other. The system then naturally tries to arrange itself in such a way that it consumes as little energy as possible," says Christoph Becher, a Professor in Experimental Physics at Saarland University.

Right now, tech giants such as Microsoft, IBM and Intel continue to lead the charge when it comes to the development of quantum computers. While continuous improvement will still be required in the years to come, many tech companies are already offering access to quantum computing features.

According to Forbes contributor Paul Smith-Goodson, IBM is committed to providing clients with quantum computing breakthroughs capable of solving today's impossible problems. Jay Gambetta, Vice President, IBM Quantum, said: "With advancements across software and hardware, IBM's full-stack approach delivers the most powerful quantum systems in the industry to our users."

This is good news for multiple industries, but in particular for those areas of the supply chain where efficiency problems occur.

Preventing Failure of Supply Chain Optimisation Engines

Current optimisation systems used in inventory allocation and order promising fail to meet the expectations of supply chain planners for a few reasons. Sanjeev Trehan, a member of the Enterprise Transformation Group at TATA Consultancy Services, highlighted two of the key reasons for this in a discussion around digital supply chain disruption:

Inadequate system performance lies at the heart of both planning problems. By speeding these processes up exponentially, quantum computing could largely eliminate those bottlenecks and make planning more efficient.

Practical Data and Inventory Applications

As manufacturers incorporate more IoT sensors into their daily operations, they harvest vast amounts of enterprise data. Quantum computing is well suited to handling these complex variables within a decision-making model. Its ability to harmonise different types of data from different sources makes it especially useful for optimising resource management and logistics within the supply chain.

Quantum computing could be applied to improve dynamic inventory allocation, as well as helping manufacturers govern their energy distribution, water usage, and network design. The precision of this technology allows for a very detailed account of the energy used on the production floor in real-time, for example. Microsoft has partnered with Dubai's Electricity and Water Authority in a real-life example of using quantum for grid and utility management.

Logistics

"Quantum computing holds huge potential for the logistics area of the supply chain," says Shiraz Sidat, Operations Manager of Speedel, a Leicestershire-based B2B courier firm that works in the supply chain of a number of aerospace and manufacturing companies.

"Quantum offers real-world solutions in areas such as scheduling, planning, routing and traffic simulations. There are huge opportunities to optimise energy usage, create more sustainable travel routes and make more informed, financially savvy decisions. The sheer scale of speed-up on offer here could potentially increase sustainability while saving time and money," he adds.

TATA Consultancy Services provides a good example to support Sidat's point.

Let's say a company plans to ship orders using ten trucks over three possible routes. This means the company has 3^10 possibilities, or 59,049 solutions, to choose from. Any classical computer can solve this problem with little effort. Now let's assume a situation where a transport planner wants to simulate shipments using 40 trucks over the same three routes. The possibilities in this case number approximately 12 quintillion (3^40), a tough ask for a classical computer. That's where quantum computers could potentially come in.
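As a rough sanity check of those figures, the short Python sketch below simply recomputes the number of possible truck-route assignments. The truck and route counts are TCS's example; the code is illustrative only.

```python
# Each truck independently picks one of `routes` options, so the number of
# possible assignments is routes ** trucks.

def route_assignments(trucks: int, routes: int = 3) -> int:
    return routes ** trucks

print(route_assignments(10))  # 3**10  = 59,049 -- trivial for a classical computer
print(route_assignments(40))  # 3**40 ~= 1.2e19 -- roughly 12 quintillion combinations
```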

Looking Ahead

Quantum computing has the potential to disrupt the planning landscape. Planners could run plans at the click of a button, performing scenario simulations on the fly.

At present, the full use of quantum computers in the supply chain would be expensive and largely impractical. Another current issue is their higher rate of errors compared with traditional computers, caused by the fragility of qubits, which are easily knocked out of their quantum state by noise from their environment. Experts and companies around the world are working to address and limit these errors.

As mentioned earlier in the article, many tech companies are providing aspects of quantum computing through an as-a-service model, which could well prove the most successful path to future widespread use. As-a-service quantum computing power would help enterprises access these capabilities at a fraction of the cost, much as such models have helped businesses utilise simulation technology, high-performance computing and computer-aided engineering.

Alongside AI, the IoT, blockchain and automation, quantum computing is one of many digital tools likely to shape, streamline and optimise the future of the supply chain. As with all emerging technology, it requires an open mind and cautious optimism.


Quantum computers: This group wants to get them out of the lab and into your business – ZDNet

Five quantum computing companies, three universities and one national physical laboratory in the UK have come together in a new £10 million ($13 million) project with an ambitious goal: to spend the next three years trying to make quantum technologies work for businesses.

Called Discovery, the program is partly funded by the UK government and has been pitched as the largest industry-led quantum computing project in the country to date. The participating organizations will dedicate themselves to making quantum technologies that are commercially viable, marking a shift from academic research to implementations that are relevant to, and scalable for, businesses.

The Discovery program will focus on photonic quantum computing, which is based on the manipulation of particles of light, a branch of the field that has shown great promise but still faces significant technological barriers.


On the other hand, major players like IBM and Google are both developing quantum computers based on superconducting qubits made of electrons, which are particles of matter. The superconducting qubits found in those quantum devices are notoriously unstable, and require very cold temperatures to function, meaning that it is hard to increase the size of the computer without losing control of the qubits.

Photonic quantum computers, on the contrary, are less subject to interference from their environment, and would be much more practical to use and scale up. The field, however, is still in its infancy. For example, engineers are still working on ways to reliably create the single photons that photonic quantum computers need in order to function.

The companies that are a part of the Discovery program will be addressing this type of technical barrier over the next few years. They include photonics company M Squared, Oxford Ionics, ORCA Computing, Kelvin Nanotechnology and TMD Technologies.

"The Discovery project will help the UK establish itself at the forefront of commercially viable photonics-enabled quantum-computing approaches. It will enable industry to capitalize on the government's early investment into quantum technology and build on our strong academic heritage in photonics and quantum information," said Graeme Malcolm, CEO of M Squared.

Another key objective of the Discovery program will consist of developing the wider UK quantum ecosystem, by establishing commercial hardware supply and common roadmaps for the industry. This will be crucial to ensure that businesses are coordinating across the board when it comes to adopting quantum technologies.

Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, told ZDNet: "We will need sources of hardware that all have the same required standards that everyone can comply with. This will enable everyone to speak the same language when building prototypes. Getting all the players to agree on a common methodology will make commercialization much easier."

Although quantum computers are yet to be used at a large commercial scale, the technology is expected to bring disruption in many if not all industries. Quantum devices will shake up artificial intelligence thanks to improved machine-learning models, solve optimization problems that are too large for classical computers to fathom, and boost new material discovery thanks to unprecedented simulation capabilities.

Finance, agriculture, drug discovery, oil and gas, and transportation are only a few of the many industries awaiting the revolution that quantum technology could bring about.

The UK is now halfway through a ten-year national program designed to boost quantum technologies, which is set to represent a £1 billion ($1.30 billion) investment over its lifetime.


The Discovery project comes under the umbrella of the wider national program, and according to Fearnside, it reflects a gradual shift in the balance of power between industry and academia.

"The national program has done a good job of enabling discussion between blue-sky researchers in university labs and industry," said Fearnside. "Blue-sky projects have now come to a point where you can think about pressing ahead and start commercializing. There is a much stronger focus on commercial partners playing a leading role, and the balance is shifting a little bit."

Last month, the UK government announced that US-based quantum computing company Rigetti would be building the country's first commercial quantum computer in Abingdon, Oxfordshire, and that partners and customers will be able to access and operate the system over the cloud. The move was similarly hailed as a step towards the commercialization of quantum technologies in the UK.

Although Fearnside acknowledged that there are still challenges ahead for quantum computing, not the least of which are technical, he expressed confidence that the technology will be finding commercial applications within the next decade.

Bridging between academia and industry, however, will require commitment from all players. Experts have previously warned that without renewed efforts from both sides, quantum ideas might well end up stuck in the lab.


Threat of Quantum Computing to Bitcoin Should be Taken Seriously, But there's Enough Time to Upgrade Current Security Systems, Experts Claim -…

LocalBitcoins, a leading peer-to-peer (P2P) Bitcoin exchange, notes that with the advent of quantum computing, there have been concerns that this new technology could be a threat to existing online protocols. Some experts claim that powerful quantum computers might become a legitimate threat to the security of Bitcoin (BTC) and the encryption algorithms it currently uses.

According to LocalBitcoins:

While the threat of quantum computing to Bitcoin is to be taken seriously, experts believe that Bitcoin [and other cryptocurrencies] have time to adapt to the quantum age without compromising [their] security in the process.

As explained in a blog post by LocalBitcoins, Bitcoin (BTC) and its blockchain-based network are secured by cryptographic algorithms, which is why it's called a cryptocurrency. Cryptography allows developers to protect certain sensitive data and communication on a platform so that only the parties authorized to view the information can access it. The LocalBitcoins team notes that cryptography uses several different algorithms, and Bitcoin depends on them to function properly.

At present, these algorithms are almost impossible to break, but quantum computers may spell trouble for them in various ways, according to LocalBitcoins.

They explain that the concept behind quantum computing is to go beyond the power of traditional computers by leveraging quantum mechanics, a field of physics that describes behavior on a subatomic scale. They also noted that when unobserved, subatomic particles can exist in multiple places at once; however, when [they have been] detected, they collapse into a single point in space-time.

They further explain:

Traditional computers operate with bits, which encode either a 0 or a 1, while quantum computers use quantum bits, or qubits, which can be both a 0 and a 1 at the same time. This phenomenon is known as superposition, and it allows a huge number of calculations to be carried out simultaneously.

They continued:

Bitcoin's algorithm most at risk from quantum computing is its signature algorithm, ECDSA (Elliptic Curve Digital Signature Algorithm), [which] is used to generate the public/private key pair that signs Bitcoin transactions securely (sending and receiving coins). ECDSA is a form of asymmetric cryptography, and its security rests on the difficulty of the underlying elliptic-curve mathematics: deriving a private key from a public key using current computers would take such an astronomical amount of time that it wouldn't even be realistic to try.
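For readers unfamiliar with what that signature scheme does in practice, the minimal Python sketch below generates a secp256k1 key pair, signs a message and verifies it. It uses the third-party `ecdsa` package purely for illustration; it is not LocalBitcoins' code, and real Bitcoin wallets use their own hardened secp256k1 implementations.

```python
# Minimal illustration of ECDSA signing/verification on the secp256k1 curve
# (the curve Bitcoin uses). Requires the third-party "ecdsa" package:
#   pip install ecdsa
from ecdsa import SigningKey, SECP256k1

private_key = SigningKey.generate(curve=SECP256k1)  # kept secret by the owner
public_key = private_key.get_verifying_key()        # revealed when coins are spent

message = b"send 0.1 BTC to address X"
signature = private_key.sign(message)

# Anyone holding the public key can verify the signature...
assert public_key.verify(signature, message)
# ...but deriving the private key from the public key means solving the
# elliptic curve discrete logarithm problem, which is infeasible classically.
```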

But with quantum computers that support parallel calculation, this same process can be carried out a lot more efficiently, and multiple types of attacks then become possible, the LocalBitcoins team noted.

They explained that the first of these potential attacks targets re-used addresses. When a transaction is performed, your public key becomes visible on the blockchain, or distributed ledger technology (DLT) network. The LocalBitcoins team adds that an attacker using a quantum computer may then use your public key to derive your private key. Once they have determined your private key, they can begin signing transactions on your behalf, which means they can also spend your Bitcoins or any other cryptocurrency.

LocalBitcoins clarifies that addresses which have not been used to send transactions are considered quantum-safe, because their public key has not yet been revealed on the blockchain, leaving a quantum computer nothing to work from.

LocalBitcoins further noted that another possible attack is the double-spend attack. Its feasibility comes down to how quickly a quantum computer can derive your private key from the already visible public key. They pointed out that if an attacker can do this before your transaction is confirmed multiple times in a block, you are essentially both trying to spend the same bitcoin, and the attacker wins.

They also mentioned:

Bitcoin's hashing function, used in block creation, is even more robust in the face of a quantum threat than its signature algorithm. The algorithm Bitcoin uses in its mining process is called SHA-256. When a miner solves a block and earns the right to add it to the blockchain, that block's transactions become confirmed and part of the ledger.

They further explained:

To solve a block, a miner needs to guess a nonce, a value that, after the hash is applied, results in a number with a certain number of leading zeroes. As a miner, you can't start from a valid result and then generate the correct nonce from it; you have to guess it randomly. This takes a lot of computing power and is behind the proof-of-work securing Bitcoin's network. If SHA-256 were somehow broken, an attacker could mine new blocks at will and earn all Bitcoin block rewards.
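A toy version of that nonce search can be written in a few lines of standard Python. This is a simplified stand-in for Bitcoin's actual double-SHA-256 header hashing against a numeric target, shown only to illustrate why mining is brute-force guessing.

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash starts with N leading
# zero hex digits. Bitcoin's real scheme hashes an 80-byte block header twice
# and compares it with a numeric target, but the brute-force character is the same.
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest

nonce, digest = mine("example block payload")
print(nonce, digest)  # the only way to find the nonce is to keep guessing
```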

LocalBitcoins notes that existing quantum computers are only operated in labs and still appear to be a long way from becoming a legitimate threat to Bitcoin and other cryptocurrencies. According to estimates, a quantum computer of around 4,000 qubits would be required to break Bitcoin's code, while the most powerful quantum computers available right now operate with only about 50 qubits.

Industry experts predict that quantum computing machines may begin to break binary-based encryption algorithms within the next decade, unless there's an unexpected mathematical or physical breakthrough before then.

The LocalBitcoins team added:

When the quantum threat becomes more imminent, cryptography will have moved to more quantum-proof algorithms. In the process, Bitcoin's algorithms would have become quantum-resistant as well. This can be achieved by hard-forking (a backwards-incompatible upgrade of) the Bitcoin blockchain by consensus among the Bitcoin nodes, so it will be secure from quantum attacks.

They continued:

As long as multiple users have access to a quantum computer, no single entity will be able to gain dominance over Bitcoin mining. Perhaps in the future Bitcoin's blockchain will be operated completely by nodes running on quantum computers.


Four Teams Using ORNL’s Summit Supercomputer Named Finalists in 2020 Gordon Bell Prize – HPCwire

Nov. 11, 2020 Since 1987, the Association for Computing Machinery has awarded the annual Gordon Bell Prize to recognize outstanding achievements in high-performance computing (HPC). Presented each year at the International Conference for High-Performance Computing, Networking, Storage and Analysis (SC), the prizes not only reward innovative projects that employ HPC for applications in science, engineering, and large-scale data analytics but also provide a timeline of milestones in parallel computing.

As a frequent home to the world's most powerful and smartest scientific supercomputers, the US Department of Energy's (DOE's) Oak Ridge National Laboratory (ORNL) has hosted many previous Gordon Bell honorees on its HPC systems. The Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL, manages these systems and makes them available to scientists around the world to accelerate scientific discovery and engineering progress. Consequently, the OLCF has provided the HPC systems for 25 previous Gordon Bell Prize finalists and eight winners, including last year's team from ETH Zürich.

This year, four projects that used ORNL's IBM AC922 Summit supercomputer are finalists. The 2020 Gordon Bell Prize will be awarded on November 19 at SC20. Here are the finalists that used Summit.

DeePMD-kit: A New Paradigm for Molecular Dynamics Modeling

"The code produced by Team DeePMD, with its ability to scale to huge numbers of atoms while retaining chemical accuracy, is poised to transform the field of materials research. Applications to other fields will surely follow." Michael Klein, Laura H. Carnell Professor of Science, Temple University

Molecular dynamics modeling has become a primary tool in scientific inquiry, allowing scientists to analyze the movements of interacting atoms over a set period of time, which helps them determine the properties of different materials or organisms. These computer simulations often lead the way in designing everything from new drugs to improved alloys. However, the two most popular methodologies come with caveats.

Classical molecular dynamics (MD), using Newtonian physics, can simulate trillions of particles on a modern supercomputer; however, its accuracy for more intricate simulations has limitations. Ab initio ("from the beginning") molecular dynamics (AIMD), using quantum physics at each time step, can produce much more accurate results, but its inherent computational complexity limits the size and time span of its simulations. But what if there was a way to bridge the gap between MD and AIMD, to produce complex simulations that are both large and accurate?

With the power of ORNL's Summit supercomputer, researchers from Lawrence Berkeley National Laboratory's Computational Research Division; the University of California, Berkeley; the Institute of Applied Physics and Computational Mathematics, Peking University; and Princeton University successfully tested a software package that offers a potential solution: DeePMD-kit, named for deep potential molecular dynamics.

The team refers to DeePMD-kit as an "HPC+AI+Physical model" in that it combines high-performance computing (HPC), artificial intelligence (AI), and physical principles to achieve both speed and accuracy. It uses a neural network to assist its calculations by approximating the ab initio data, thereby reducing the computational complexity from cubic to linear scaling.
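The sketch below illustrates only the general surrogate-model idea behind that claim: fit a small neural network to the outputs of an expensive reference calculation, then reuse the cheap network in its place. It uses PyTorch and a made-up toy "energy" function, and does not reflect DeePMD-kit's actual descriptors, network architecture or API.

```python
# Schematic sketch of a neural-network surrogate for an expensive energy model.
# The "reference" function here is a toy stand-in; DeePMD-kit's real inputs are
# symmetry-preserving descriptors of local atomic environments.
import torch
from torch import nn

def expensive_reference_energy(x: torch.Tensor) -> torch.Tensor:
    # Placeholder for an ab initio calculation (e.g., DFT) -- toy function only.
    return (x ** 2).sum(dim=1, keepdim=True) + torch.sin(3 * x).sum(dim=1, keepdim=True)

# Generate training data from the expensive model once.
inputs = torch.rand(2048, 3) * 2 - 1
energies = expensive_reference_energy(inputs)

# A small MLP learns to approximate the mapping; evaluating it is cheap.
model = nn.Sequential(nn.Linear(3, 64), nn.Tanh(),
                      nn.Linear(64, 64), nn.Tanh(),
                      nn.Linear(64, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), energies)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```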

Simulating a block of copper atoms, the team put DeePMD-kit to the test on Summit with the goal of seeing how far they could push the simulation's size and timescales beyond AIMD's accepted limitations. They were able to simulate a system of 127.4 million atoms, more than 100 times larger than the current state of the art. Furthermore, the simulation achieved a time to solution at least 1,000 times faster than previous work, at 2.5 nanoseconds of simulated time per day in mixed-half precision, with a peak performance of 275 petaflops (a petaflop being one thousand million million floating-point operations per second) in mixed-half precision.

"By combining physical principles and the representation power of deep neural networks, the Deep Potential method can achieve very good accuracy, especially for complex problems," said Weile Jia, a postdoc in applied mathematics in Professor Lin Lin's group at the Math Department of UC Berkeley, who co-led the project with Linfeng Zhang of Princeton. "Then we reorganize the data layout for bigger granularity on GPU and use data compression to significantly speed up the bottleneck. The neural network operators are optimized to the extreme, and most importantly, we successfully use half-precision in our code without losing accuracy."

Square Kilometre Array: Massive Data Processing to Explore the Universe

"The innovative results already achieved and goals being pursued by this international team will greatly benefit the Next Generation Very Large Array, the Square Kilometre Array, and the next generation of radio interferometer facilities around the world." Tony Beasley, Director, National Radio Astronomy Observatory

Scheduled to begin construction in 2021, the Square Kilometre Array (SKA) promises to become one of the biggest Big Science projects of all time (in physical size): a radio telescope array with a combined collecting area of over 1 square kilometer, or 1 million square meters. Once completed in the deserts of South Africa and Australia in the late 2020s, SKA's thousands of dishes and low-frequency antennas will plumb the universe to figure out its mysteries.

SKA's mission ultimately means it will produce massive amounts of information, an estimated 600 petabytes of data per year. Collecting, storing, and analyzing that data will be critical in producing SKA's scientific discoveries. How will it be managed?

Building an end-to-end data-processing system on such an unprecedented scale is the task of an international team of radio astronomers, computer scientists, and software engineers. Workflow experts from the International Centre for Radio Astronomy Research (ICRAR) in Australia and the Shanghai Astronomical Observatory (SHAO) in China are developing the Daliuge workflow management system; GPU experts from Oxford University are optimizing the performance of the data generator; and input/output (I/O) experts at ORNL are producing I/O middleware based on the ORNL-developed Adaptable IO System (ADIOS). All three core software packages were developed entirely by the team, with the original scope of running on leading supercomputers.

Because SKA does not yet exist, its huge data output was simulated on Summit in order to test the team's work, running a complete end-to-end workflow for a typical 6-hour SKA Phase 1 Low Frequency Array observation. The team used 99 percent of Summit, achieving 130 petaflops of peak single-precision performance, a data generation rate of 247 gigabytes per second, and a pure I/O rate of 925 gigabytes per second.

"For the first time, an end-to-end SKA data-processing workflow was executed in a production environment. It helps the SKA community, as well as the entire radio astronomy community, determine critical design factors for multi-billion-dollar next generation radio telescopes," said Ruonan Wang, a software engineer in ORNL's Scientific Data Group who works on the project. "It validated our ability, from both software and hardware perspectives, to process a key science case of SKA, which will answer some of the fundamental questions of our universe."

DSNAPSHOT: An Accelerated Approach to Literature-Based Discovery

"The DSNAPSHOT algorithm approach enables the identification of meaningful paths and novel relations on a previously unseen scale. Consequently, it moves the biomedical research community closer to a framework for analyzing how novel relations can be identified across the entire body of scientific literature." Michael Weiner, PhD, VP AxioMx, Molecular Sciences and Head, Global Research of Abcam

In 1986, the late information scientist Don Swanson introduced the concept of undiscovered public knowledge in the field of biomedical research. His idea was both intriguing and straightforward: Out of the millions of published pieces of medical literature, what if there are yet unseen connections between their findings that could lead to new treatments? If, for example, A affects B in one study and B affects C in another, perhaps A and C have undiscovered commonalities worth investigating. Swanson proved his point by analyzing unrelated papers for such links, leading to hypothetical treatments that were later supported by clinical studies, such as taking magnesium supplements to help prevent migraine headaches. This process became known as Swanson Linking.

But in light of the enormous size of scientific literature in existence, mining it for undiscovered connections cannot be effectively conducted on a mass scale by mere humans. For example, the US National Library of Medicine's PubMed database contains over 30 million citations and abstracts for biomedical literature. How can researchers possibly track that much information in its totality and find the patterns that may help identify new treatments?

One answer may be data-mining algorithms optimized for GPU-accelerated supercomputers such as ORNL's Summit. When the federal government mobilized its national labs in the fight against COVID-19 in March, a team of ORNL and Georgia Tech researchers was assembled by ORNL computer scientist Ramakrishnan Kannan and Thomas E. Potok, head of ORNL's Data and AI Systems Section of the Computer Science and Mathematics Division. The team's mission was to investigate new ways of searching large-scale bodies of scholarly literature, and they ultimately found a way to conduct Swanson Linking on huge datasets at unprecedented speed.

Dasha Herrmannova from Kannan's team began by creating a graph dataset based on Semantic MEDLINE, a dataset of biomedical concepts and the relations between them, extracted from PubMed. Then they expanded the graph with information extracted from the COVID-19 Open Research Dataset (CORD-19), resulting in a dataset of 18.5 million nodes representing concepts and papers, with 213 million relationships between them.

To search this massive dataset (via knowledge graph representations) for potential COVID-19 treatments, the team developed a new high-performance implementation of the Floyd-Warshall algorithm. The classic algorithm, originally published in 1962, determines the shortest distances between every pair of vertices in a given graph. (In terms of literature-based discovery, the shortest paths are usually more likely to reveal new connections between scholarly works.) Wanting to overcome the computational bottleneck of tackling massive graphs, Kannan, Piyush Sao, Hao Lu, and Robert Patton from ORNL, in collaboration with Vijay Thakkar and Rich Vuduc from Georgia Tech, optimized their version of the algorithm for distributed-memory parallel computers accelerated by GPUs. They named it Distributed Accelerated Semiring All-Pairs Shortest Path (DSNAPSHOT).
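For reference, the classic single-machine Floyd-Warshall recurrence can be written in a few lines of Python, as below. DSNAPSHOT distributes this same all-pairs computation across GPU-accelerated nodes and generalizes it over semirings, none of which the plain cubic loop here attempts.

```python
# Classic Floyd-Warshall all-pairs shortest paths (O(V^3), single machine).
# dist[i][j] starts as the direct edge weight (inf if no edge, 0 on the diagonal)
# and is relaxed through every intermediate vertex k.
from math import inf

def floyd_warshall(dist: list[list[float]]) -> list[list[float]]:
    n = len(dist)
    for k in range(n):
        for i in range(n):
            dik = dist[i][k]
            for j in range(n):
                if dik + dist[k][j] < dist[i][j]:
                    dist[i][j] = dik + dist[k][j]
    return dist

graph = [
    [0,   3,   inf, 7],
    [8,   0,   2,   inf],
    [5,   inf, 0,   1],
    [2,   inf, inf, 0],
]
print(floyd_warshall(graph))  # shortest distance between every pair of vertices
```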

In effect, the team's DSNAPSHOT is a supercharged version of Floyd-Warshall, able to identify the shortest paths in huge graphs in a matter of minutes. Using 90 percent of the Summit supercomputer, or 4,096 nodes adding up to 24,576 GPUs, the team was able to compute an All-Pairs Shortest Path computation on a graph with 4.43 million vertices in 21.3 minutes. Peak performance reached 136 petaflops for single-precision. If every person on Earth completed one calculation per second, it would take the world's population (~7 billion) seven and a half months to complete what DSNAPSHOT can do in 1 second on Summit.

"To the best of our knowledge, DSNAPSHOT is the first method capable of calculating the shortest path between all pairs of entities in a biomedical knowledge graph, thereby enabling the discovery of meaningful relations across the whole of biomedical knowledge," Kannan said. "Looking forward, we believe this novel capability will enable the mining of scholarly knowledge corpora when embedded and integrated into artificial intelligence-driven natural language processing workflows at scale."

BerkeleyGW: A New View into Excited-State Electrons

"The BerkeleyGW team's demonstration of excited-state calculations with the GW method for 1,000-atom systems on accessible HPC facilities will be a game-changer. Researchers with diverse interests will be able to pursue fundamental understanding of excited states and physical processes in materials systems including novel two-dimensional semiconductors, electrochemical interfaces, organic molecular energy harvesting systems, and materials proposed for quantum information systems." Mark S. Hybertsen, Group Leader, Theory & Computation Group, Center for Functional Nanomaterials, Brookhaven National Laboratory

Historical epochs are often delineated by the materials that helped shape civilization, from the Stone Age to the Steel Age. Our current period is often referred to as the Silicon Age, but while those earlier eras were characterized by the structural properties of their predominant materials, silicon is different. Rather than ushering in new ways of building big things, its technological leap takes place on an atomic level, facilitating an information revolution.

Used as the main material in integrated circuits (a.k.a. the microchip), silicon has enabled the world of data processing we currently live in, from ever-more-powerful computers to unavoidable handheld devices. Central to its success has been the ability of chip designers to engineer these circuits to be increasingly faster and smaller, yet with more capacity as they add more and more transistors. But can microprocessor architects keep up with Moore's law and continue to double the number of transistors in an integrated circuit every 2 years?

One route forward may be found in the work of a team of six physicists, materials scientists, and HPC specialists from the Berkeley Lab, UC Berkeley, and Stanford University that performed the largest-ever study of excited-state electrons using ORNL's Summit supercomputer. Understanding and controlling such electronic excitation in silicon and other materials is key to designing the electronic and optoelectronic devices that have sparked the current information era. What's more, the accurate modeling of excited-state properties of electrons in materials plays a crucial role in the rational design of other transformative technologies, including photovoltaics, batteries, and qubits for quantum information and quantum computing. In essence, the team's high-performance calculations could help design new materials for these next generation technologies.

A state-of-the-art tool for determining excitations in materials is the GW method, an approach for calculating the self-energy (the quantum energy that a particle acquires from interactions with its surrounding environment) of a system of interacting electrons. The team adapted its own software package, BerkeleyGW, a quantum many-body perturbation theory code for excited states, to run on Summit's GPU accelerators.

The team's study of a system of defects in silicon and silicon carbide resulted in groundbreaking performance: the largest high-fidelity GW calculations ever made, with 10,986 valence electrons. By running on the entire Summit supercomputer, they also achieved 105.9 petaflops of double-precision performance with a time to solution of roughly 10 minutes.

"What's really exciting about these numbers is that together they usher in the practical use of the high-fidelity GW method to the study of realistic complex materials," said Jack Deslippe, team leader and head of the Applications Performance Group at the National Energy Research Scientific Computing Center, or NERSC. "These will be materials with defects, with interfaces, and with large geometries that drive real device design in quantum information, energy generation and storage, and next-gen electronics."

UT-Battelle LLC manages Oak Ridge National Laboratory for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

More info: https://www.olcf.ornl.gov/2020/11/10/four-teams-using-ornls-summit-supercomputer-named-finalists-in-2020-gordon-bell-prize/

Source: COURY TURCZYN, ORNL


Quantum computers are coming. Get ready for them to change everything – ZDNet

Supermarket aisles filled with fresh produce are probably not where you would expect to discover some of the first benefits of quantum computing.

But Canadian grocery chain Save-On-Foods has become an unlikely pioneer, using quantum technology to improve the management of in-store logistics. In collaboration with quantum computing company D-Wave, Save-On-Foods is using a new type of computing, which is based on the downright weird behaviour of matter at the quantum level. And it's already seeing promising results.

The company's engineers approached D-Wave with a logistics problem that classical computers were incapable of solving. Within two months, the concept had translated into a hybrid quantum algorithm that was running in one of the supermarket stores, reducing the computing time for some tasks from 25 hours per week down to mere seconds.


Save-On-Foods is now looking at expanding the technology to other stores, and exploring new ways that quantum could help with other issues. "We now have the capability to run tests and simulations by adjusting variables and see the results, so we can optimize performance, which simply isn't feasible using traditional methods," a Save-On-Foods spokesperson tells ZDNet.

"While the results are outstanding, the two most important things from this are that we were able to use quantum computing to attack our most complex problems across the organization, and can do it on an ongoing basis."

The remarkable properties of quantum computing boil down to the behaviour of qubits -- the quantum equivalent of the classical bits that encode information for today's computers in strings of 0s and 1s. But unlike bits, which can be represented by either 0 or 1, qubits can take on a quantum-specific state in which they exist as 0 and 1 in parallel, known as superposition.

Qubits, therefore, enable quantum algorithms to run various calculations at the same time, and at exponential scale: the more qubits, the more variables can be explored in parallel. Some of the largest problems, which would take classical computers tens of thousands of years to explore with single-state bits, could be tackled by qubits in minutes.
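A small sketch of that idea, using Qiskit's statevector utilities (an assumed, generic toolchain, not any particular vendor's hardware): a Hadamard gate puts one qubit into superposition, and a CNOT entangles a second qubit with it.

```python
# Two qubits in superposition and entangled: a Hadamard on qubit 0 followed by
# a CNOT produces the Bell state (|00> + |11>)/sqrt(2). Requires: pip install qiskit
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

circuit = QuantumCircuit(2)
circuit.h(0)      # put qubit 0 into an equal superposition of 0 and 1
circuit.cx(0, 1)  # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(circuit)
print(state.probabilities_dict())  # {'00': 0.5, '11': 0.5} -- outcomes are correlated
```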

The challenge lies in building quantum computers that contain enough qubits for useful calculations to be carried out. Qubits are temperamental: they are error-prone, hard to control, and always on the verge of falling out of their quantum state. Typically, scientists have to encase quantum computers in extremely cold, large-scale refrigerators, just to make sure that qubits remain stable. That's impractical, to say the least.

This is, in essence, why quantum computing is still in its infancy. Most quantum computers currently work with fewer than 100 qubits, and tech giants such as IBM and Google are racing to increase that number in order to build a meaningful quantum computer as early as possible. Recently, IBM ambitiously unveiled a roadmap to a million-qubit system, and said that it expects a fault-tolerant quantum computer to be an achievable goal during the next ten years.

[Image: IBM's CEO Arvind Krishna and director of research Dario Gil in front of a ten-foot-tall "super-fridge" for the company's next-generation quantum computers.]

Although it's early days for quantum computing, there is still plenty of interest from businesses willing to experiment with what could prove to be a significant development. "Multiple companies are conducting learning experiments to help quantum computing move from the experimentation phase to commercial use at scale," Ivan Ostojic, partner at consultant McKinsey, tells ZDNet.

Certainly tech companies are racing to be seen as early leaders. IBM's Q Network started running in 2016 to provide developers and industry professionals with access to the company's quantum processors, the latest of which, a 65-qubit device called Hummingbird, was released on the platform last month. Recently, US multinational Honeywell took its first steps on the quantum stage, making the company's trapped-ion quantum computer available to customers over the cloud. Rigetti Computing, which has been operating since 2017, is also providing cloud-based access to a 31-qubit quantum computer.

Another approach, called quantum annealing, is especially suitable for optimisation tasks such as the logistics problems faced by Save-On-Foods. D-Wave has proven a popular choice in this field, and has offered a quantum annealer over the cloud since 2010, which it has now upgraded to a 5,000-qubit-strong processor.

A quantum annealing processor is much easier to control and operate than the devices that IBM, Honeywell and Rigetti are working on, which are called gate-model quantum computers. This is why D-Wave's team has already hit much higher numbers of qubits. However, quantum annealing is only suited to specific optimisation problems, and experts argue that the technology will be comparatively limited when gate-model quantum computers reach maturity.
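As an illustration of the kind of problem an annealer accepts, the sketch below formulates a tiny QUBO (quadratic unconstrained binary optimisation) with D-Wave's open-source `dimod` modelling library and solves it by brute force locally. Submitting it to an actual annealer would swap the local solver for a cloud sampler; this is an assumption about tooling, not a description of Save-On-Foods' workload.

```python
# Tiny QUBO: choose between two mutually exclusive tasks, each with a reward.
# Requires D-Wave's open-source modelling library: pip install dimod
import dimod

# Negative linear terms reward selecting a task (variable = 1); the positive
# quadratic term penalises selecting both at once.
bqm = dimod.BinaryQuadraticModel(
    {"task_a": -1.0, "task_b": -1.5},
    {("task_a", "task_b"): 3.0},
    0.0,
    dimod.BINARY,
)

# ExactSolver enumerates every assignment locally; a D-Wave sampler would anneal instead.
result = dimod.ExactSolver().sample(bqm)
print(result.first.sample, result.first.energy)  # expect task_b alone, energy -1.5
```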

The suppliers of quantum processing power are increasingly surrounded by third-party companies that act as intermediaries with customers. Zapata, QC Ware or 1QBit, for example, provide tools ranging from software stacks to training, to help business leaders get started with quantum experiments.


In other words, the quantum ecosystem is buzzing with activity, and is growing fast. "Companies in the industries where quantum will have the greatest potential for complete disruption should get involved in quantum right now," says Ostojic.

And the exponential compute power of quantum technologies, according to the analyst, will be a game-changer in many fields. Qubits, with their unprecedented ability to solve optimisation problems, will benefit any organisation with a supply chain and distribution route, while shaking up the finance industry by maximising gains from portfolios. Quantum-infused artificial intelligence also holds huge promise, with models expected to benefit from better training on bigger datasets.

One example: by simulating molecular interactions that are too complex for classical computers to handle, qubits will let biotech companies fast-track the discovery of new drugs and materials. Microsoft, for example, has already demonstrated how quantum computers can help manufacture fertilizers with better yields. This could have huge implications for the agricultural sector, as it faces the colossal task of sustainably feeding the growing global population in years to come.

Chemistry, oil and gas, transportation, logistics, banking and cybersecurity are often cited as sectors that quantum technology could significantly transform. "In principle, quantum will be relevant for all CIOs as it will accelerate solutions to a large range of problems," says Ostojic. "Those companies need to become owners of quantum capability."


There is a caveat. No CIO should expect to achieve too much short-term value from quantum computing in its current form. However fast-growing the quantum industry is, the field remains defined by the stubborn instability of qubits, which still significantly limits the capability of quantum computers.

"Right now, there is no problem that a quantum computer can solve faster than a classical computer, which is of value to a CIO," insists Heike Riel, head of science and technology at IBM Research Quantum Europe. "But you have to be very careful, because the technology is evolving fast. Suddenly, there might be enough qubits to solve a problem that is of high value to a business with a quantum computer."

And when that day comes, there will be a divide between the companies that prepared for quantum compute power, and those that did not. This is what's at stake for business leaders who are already playing around with quantum, explains Riel. Although no CIO expects quantum to deliver value for the next five to ten years, the most forward-thinking businesses are already anticipating the wave of innovation that the technology will bring about eventually -- so that when it does, they will be the first to benefit from it.

This means planning staffing, skills and projects, and building an understanding of how quantum computing can help solve actual business problems. "This is where a lot of work is going on in different industries, to figure out what the true problems are, which can be solved with a quantum computer and not a classical computer, and which would make a big difference in terms of value," says Riel.

Riel points to the example of quantum simulation for battery development, which companies like car manufacturer Daimler are investigating in partnership with IBM. To increase the capacity and speed-of-charging of batteries for electric vehicles, Daimler's researchers are working on next-generation lithium-sulfur batteries, which require the alignment of various compounds in the most stable configuration possible. To find the best placement of molecules, all the possible interactions between the particles that make up the compound's molecules must be simulated.

This task can be carried out by current supercomputers for simple molecules, but a large-scale quantum solution could one day break new ground in developing the more complex compounds that are required for better batteries.

"Of course, right now the molecules we are simulating with quantum are small in size because of the limited size of the quantum computer," says Riel. "But when we scale the next generation of quantum computers, then we can solve the problem despite the complexity of the molecules."


Similar thinking led oil and gas giant ExxonMobil to join the network of companies that are currently using IBM's cloud-based quantum processors. ExxonMobil started collaborating with IBM in 2019, with the objective of one day using quantum to design new chemicals for low energy processing and carbon capture.

The company's director of corporate strategic research Amy Herhold explains that for the past year, ExxonMobil's scientists have been tapping IBM's quantum capabilities to simulate macroscopic material properties such as heat capacity. The team has focused so far on the smallest of molecules, hydrogen gas, and is now working on ways to scale the method up to larger molecules as the hardware evolves.

A number of milestones still need to be achieved before quantum computing translates into an observable business impact, according to Herhold. Companies will need to have access to much larger quantum computers with low error rates, as well as to appropriate quantum algorithms that address key problems.

"While today's quantum computers cannot solve business-relevant problems -- they are too small and the qubits are too noisy -- the field is rapidly advancing," Herhold tells ZDNet. "We know that research and development is critical on both the hardware and the algorithm front, and given how different this is from classical computing, we knew it would take time to build up our internal capabilities. This is why we decided to get going."

Herhold anticipates that quantum hardware will grow at a fast pace in the next five years. The message is clear: when it does, ExxonMobil's research team will be ready.

One industry that has shown an eager interest in quantum technology is the financial sector. From JP Morgan Chase's partnerships with IBM and Honeywell, to BBVA's use of Zapata's services, banks are actively exploring the potential of qubits, and with good reason. Quantum computers, by accounting for exponentially high numbers of factors and variables, could generate much better predictions of financial risk and uncertainty, and boost the efficiency of key operations such as investment portfolio optimisation or options pricing.

Similar to other fields, most of the research is dedicated to exploring proof-of-concepts for the financial industry. In fact, when solving smaller problems, scientists still run quantum algorithms alongside classical computers to validate the results.

"The classical simulator has an exact answer, so you can check if you're getting this exact answer with the quantum computer," explains Tony Uttley, president of Honeywell Quantum Solutions, as he describes the process of quantum options pricing in finance.

"And you better be, because as soon as we cross that boundary, where we won't be able to classically simulate anymore, you better be convinced that your quantum computer is giving you the right answer. Because that's what you'll be taking into your business processes."

Companies that are currently working on quantum solutions are focusing on what Uttley calls the "path to value creation". In other words, they are using quantum capabilities as they stand to run small-scale problems, building trust in the technology as they do so, while they wait for capabilities to grow and enable bigger problems to be solved.


Tempting as it might be for CIOs to hope for short-term value from quantum services, it's much more realistic to look at longer timescales, maintains Uttley. "Imagine you have a hammer, and somebody tells you they want to build a university campus with it," he says. "Well, looking at your hammer, you should ask yourself how long it's going to take to build that."

Quantum computing holds the promise that the hammer might, in the next few years, evolve into a drill and then a tower crane. The challenge, for CIOs, is to plan now for the time that the tools at their disposal get the dramatic boost that's expected by scientists and industry players alike.

It is hard to tell exactly when that boost will come. IBM's roadmap announces that the company will reach 1,000 qubits in 2023, which could mark the start of early value creation in pharmaceuticals and chemicals, thanks to the simulation of small molecules. But although the exact timeline is uncertain, Uttley is adamant that it's never too early to get involved.

"Companies that are forward-leaning already have teams focused on this and preparing their organisations to take advantage of it once we cross the threshold to value creation," he says. "So what I tend to say is: engage now. The capacity is scarce, and if you're not already at the front of the line, it may be quite a while before you get in."

Creating business value is a priority for every CIO. At the same time, the barrier to entry for quantum computing is lowering every time a new startup emerges to simplify the software infrastructure and assist non-experts in kickstarting their use of the technology. So there's no time to lose in embracing the technology. Securing a first-class spot in the quantum revolution, when it comes, is likely to be worth it.
