Archive for the ‘Quantum Computing’ Category

Quantum Computing in Manufacturing Market Still Has Room To Grow: International Business Machines, D-Wave Systems, Microsoft – Digital Journal

The latest report available at Advance Market Analytics on the Quantum Computing in Manufacturing Market provides pinpoint analysis of changing competitive dynamics and a forward-looking perspective on the different factors driving or restraining industry growth.

The global Quantum Computing in Manufacturing market study compiles major statistical evidence for the Quantum Computing in Manufacturing industry, offering readers guidance in navigating the obstacles surrounding the market. The study reports on several factors, including global distribution, manufacturers, market size, and the market forces that affect global contributions. In addition, it examines the competitive landscape in depth, marking defined growth opportunities, market share by product type and application, the key companies responsible for production, and the strategies they employ.

Key players in the global Quantum Computing in Manufacturing market:

International Business Machines (United States), D-Wave Systems (Canada), Microsoft (United States), Amazon (United States), Rigetti Computing (United States), Google (United States), Intel (United States), Honeywell International (United States), Quantum Circuits (United States), QC Ware (United States), Atom Computing, Inc. (United States), Xanadu Quantum Technologies Inc. (Canada), Zapata Computing, Inc. (United States), Strangeworks, Inc (United States)

Free Sample Report + All Related Graphs & Charts @: https://www.advancemarketanalytics.com/sample-report/179263-global-quantum-computing-in-manufacturing-market

Quantum computing is a computing technique that uses the collective resources of quantum states, chief among them superposition and entanglement, to perform computation. Machines able to execute such quantum computations are called quantum computers. Quantum computing harnesses the phenomena of quantum mechanics to deliver a huge leap forward in computation for solving certain problems. It is an area of study focused on the development of computer technologies centered on the principles of quantum theory.
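Those two resources can be made concrete with a little linear algebra: a qubit is just a normalized two-component state vector, and superposition and entanglement fall out directly. The following is a minimal NumPy sketch (an illustration only, not tied to any vendor's hardware):

```python
import numpy as np

# A qubit is a normalized 2-component state vector: |0> = [1, 0], |1> = [0, 1].
ket0 = np.array([1.0, 0.0])

# Superposition: a Hadamard gate turns |0> into an equal mix of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ ket0
print(np.round(plus**2, 3))  # measurement probabilities: 0.5 and 0.5

# Entanglement: a CNOT gate applied to (H|0>) tensor |0> yields the Bell
# state (|00> + |11>) / sqrt(2); the two qubits' outcomes are perfectly
# correlated even though each one alone looks random.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(plus, ket0)
print(np.round(bell**2, 3))  # probabilities: 0.5, 0, 0, 0.5
```

Measuring the first qubit of the Bell state immediately fixes the second, which is the kind of correlation quantum algorithms exploit.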

Various projects are in the works to take computing to the next level and progress further into the quantum age. On 12 February 2021, after forming a consortium in December, EU stakeholders launched an effort to supercharge quantum processor production.

What’s Trending in the Market?

Integration With Advanced Technologies

What are the Market Drivers?

Rising Disposable Income

The Global Quantum Computing in Manufacturing Market segments and market data breakdown are illuminated below:

by Application (Simulation & Testing, Financial Modeling, Artificial Intelligence & Machine Learning, Cybersecurity & Cryptography, Other), Component (Quantum Computing Devices, Quantum Computing Software, Quantum Computing Services)

The study encompasses a variety of analytical resources, such as SWOT analysis and Porter’s Five Forces analysis, coupled with primary and secondary research methodologies. It covers all the bases surrounding the Quantum Computing in Manufacturing industry as it explores the competitive nature of the market, complete with a regional analysis.

Have any questions regarding the Global Quantum Computing in Manufacturing Market report? Ask our expert: https://www.advancemarketanalytics.com/enquiry-before-buy/179263-global-quantum-computing-in-manufacturing-market

The Quantum Computing in Manufacturing industry report further analyzes historical data gathered from reliable sources and projects a growth trajectory for the market. The report also focuses on comprehensive market revenue streams along with growth patterns, local reforms, and COVID impact analysis, with a focused approach to market trends and the overall growth of the market.

Moreover, the Quantum Computing in Manufacturing report describes the market’s division along various parameters and attributes, including geographical distribution, product type, and application. This segmentation further clarifies the regional distribution of the Quantum Computing in Manufacturing market, business trends, potential revenue sources, and upcoming market opportunities.

The Quantum Computing in Manufacturing market study further highlights the segmentation of the industry on a global scale. The report focuses on the regions of LATAM, North America, Europe, Asia, and the Rest of the World in terms of developing market trends, preferred marketing channels, investment feasibility, long-term investments, and business environment analysis. The report also investigates product capacity, product price, profit streams, supply-to-demand ratio, production and market growth rate, and a projected growth forecast.

Read the Detailed Index of the full Research Study at: https://www.advancemarketanalytics.com/reports/179263-global-quantum-computing-in-manufacturing-market

In addition, the Quantum Computing in Manufacturing market study covers several factors such as market status, key market trends, growth forecast, and growth opportunities. Furthermore, it analyzes the challenges the market faces on a global and regional basis. The study also examines a number of opportunities and emerging trends, weighing their impact at the global scale on acquiring a majority of the market share.

Some Points from the Table of Contents:
Chapter One: Report Overview
Chapter Two: Global Market Growth Trends
Chapter Three: Value Chain of Quantum Computing in Manufacturing Market
Chapter Four: Players Profiles
Chapter Five: Global Quantum Computing in Manufacturing Market Analysis by Regions
Chapter Six: North America Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Seven: Europe Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Eight: Asia-Pacific Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Nine: Middle East and Africa Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Ten: South America Quantum Computing in Manufacturing Market Analysis by Countries
Chapter Eleven: Global Quantum Computing in Manufacturing Market Segment by Types
Chapter Twelve: Global Quantum Computing in Manufacturing Market Segment by Applications

Buy This Exclusive Research Here: https://www.advancemarketanalytics.com/buy-now?format=1&report=179263

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, West Europe, or Southeast Asia.

Contact Us:
Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road
Edison, NJ 08837, USA
Phone: +1 (206) 317 1218

Read the original post:
Quantum Computing in Manufacturing Market Still Has Room To Grow: International Business Machines, D-Wave Systems, Microsoft - Digital Journal

Connecting the Dots Between Material Properties and Superconducting Qubit Performance – SciTechDaily

Scientists performed transmission electron microscopy and x-ray photoelectron spectroscopy (XPS) at Brookhaven Lab’s Center for Functional Nanomaterials and National Synchrotron Light Source II to characterize the properties of niobium thin films made into superconducting qubit devices at Princeton University. A transmission electron microscope image of one of these films is shown in the background; overlaid on this image are XPS spectra (colored lines representing the relative concentrations of niobium metal and various niobium oxides as a function of film depth) and an illustration of a qubit device. Through these and other microscopy and spectroscopy studies, the team identified atomic-scale structural and surface chemistry defects that may be causing loss of quantum information, a hurdle to enabling practical quantum computers. Credit: Brookhaven National Laboratory

Brookhaven Lab and Princeton scientists team up to identify sources of loss of quantum information at the atomic scale.

Engineers and materials scientists studying superconducting quantum information bits (qubits), a leading quantum computing material platform based on the frictionless flow of paired electrons, have collected clues hinting at the microscopic sources of qubit information loss. This loss is one of the major obstacles to realizing quantum computers capable of stringing together millions of qubits to run demanding computations. Such large-scale, fault-tolerant systems could simulate complicated molecules for drug development, accelerate the discovery of new materials for clean energy, and perform other tasks that would be impossible, or take an impractical amount of time (millions of years), for today’s most powerful supercomputers.

An understanding of the nature of atomic-scale defects that contribute to qubit information loss is still largely lacking. The team helped bridge this gap between material properties and qubit performance by using state-of-the-art characterization capabilities at the Center for Functional Nanomaterials (CFN) and National Synchrotron Light Source II (NSLS-II), both U.S. Department of Energy (DOE) Office of Science User Facilities at Brookhaven National Laboratory. Their results pinpointed structural and surface chemistry defects in superconducting niobium qubits that may be causing loss.

Anjali Premkumar

“Superconducting qubits are a promising quantum computing platform because we can engineer their properties and make them using the same tools used to make regular computers,” said Anjali Premkumar, a fourth-year graduate student in the Houck Lab at Princeton University and first author on the Communications Materials paper describing the research. “However, they have shorter coherence times than other platforms.”

In other words, they can’t hold onto information very long before they lose it. Though coherence times have recently improved from microseconds to milliseconds for single qubits, these times significantly decrease when multiple qubits are strung together.

“Qubit coherence is limited by the quality of the superconductors and the oxides that will inevitably grow on them as the metal comes into contact with oxygen in the air,” continued Premkumar. “But, as qubit engineers, we haven’t characterized our materials in great depth. Here, for the first time, we collaborated with materials experts who can carefully look at the structure and chemistry of our materials with sophisticated tools.”

This collaboration was a prequel to the Co-design Center for Quantum Advantage (C2QA), one of five National Quantum Information Science Centers established in 2020 in support of the National Quantum Initiative. Led by Brookhaven Lab, C2QA brings together hardware and software engineers, physicists, materials scientists, theorists, and other experts across national labs, universities, and industry to resolve performance issues with quantum hardware and software. Through materials, devices, and software co-design efforts, the C2QA team seeks to understand and ultimately control material properties to extend coherence times, design devices to generate more robust qubits, optimize algorithms to target specific scientific applications, and develop error-correction solutions.

Andrew Houck

In this study, the team fabricated thin films of niobium metal through three different sputtering techniques. In sputtering, energetic particles are fired at a target containing the desired material; atoms are ejected from the target material and land on a nearby substrate. Members of the Houck Lab performed standard (direct current) sputtering, while Angstrom Engineering applied a new form of sputtering they specialize in (high-power impulse magnetron sputtering, or HiPIMS), where the target is struck with short bursts of high-voltage energy. Angstrom carried out two variations of HiPIMS: normal and with an optimized power and target-substrate geometry.

Back at Princeton, Premkumar made transmon qubit devices from the three sputtered films and placed them in a dilution refrigerator. Inside this refrigerator, temperatures can plunge to near absolute zero (minus 459.67 degrees Fahrenheit), turning qubits superconducting. In these devices, superconducting pairs of electrons tunnel across an insulating barrier of aluminum oxide (Josephson junction) sandwiched between superconducting aluminum layers, which are coupled to capacitor pads of niobium on sapphire. The qubit state changes as the electron pairs go from one side of the barrier to the other. Transmon qubits, co-invented by Houck Lab principal investigator and C2QA Director Andrew Houck, are a leading kind of superconducting qubit because they are highly insensitive to fluctuations in electric and magnetic fields in the surrounding environment; such fluctuations can cause qubit information loss.

For each of the three device types, Premkumar measured the energy relaxation time, a quantity related to the robustness of the qubit state.

“The energy relaxation time corresponds to how long the qubit stays in the first excited state and encodes information before it decays to the ground state and loses its information,” explained Ignace Jarrige, formerly a physicist at NSLS-II and now a quantum research scientist at Amazon, who led the Brookhaven team for this study.
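The relaxation described here is typically characterized by an exponential decay of the excited-state population, P(t) = exp(-t/T1), so the relaxation time T1 can be read off from how that population falls over time. A small sketch with synthetic numbers (illustrative only, not data from the study):

```python
import numpy as np

# Excited-state population decays as P(t) = exp(-t / T1).
# Synthetic example data in microseconds -- not measurements from the paper.
T1_true = 50.0
t = np.linspace(0, 200, 30)
p = np.exp(-t / T1_true)

# log P(t) = -t / T1 is linear in t, so a linear fit recovers T1.
slope = np.polyfit(t, np.log(p), 1)[0]
T1_est = -1.0 / slope
print(round(T1_est, 1))  # 50.0 (microseconds)
```

With real, noisy measurements the same fit (or a nonlinear exponential fit) yields the reported relaxation times.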

Ignace Jarrige

Each device had different relaxation times. To understand these differences, the team performed microscopy and spectroscopy at the CFN and NSLS-II.

NSLS-II beamline scientists determined the oxidation states of niobium through x-ray photoelectron spectroscopy, with soft x-rays at the In situ and Operando Soft X-ray Spectroscopy (IOS) beamline and hard x-rays at the Spectroscopy Soft and Tender (SST-2) beamline. Through these spectroscopy studies, they identified various suboxides, located between the metal and the surface oxide layer and containing a smaller amount of oxygen relative to niobium.

“We needed the high energy resolution at NSLS-II to distinguish the five different oxidation states of niobium, and both hard and soft x-rays, which have different energy levels, to profile these states as a function of depth,” explained Jarrige. “Photoelectrons generated by soft x-rays only escape from the first few nanometers of the surface, while those generated by hard x-rays can escape from deeper in the films.”

At the NSLS-II Soft Inelastic X-ray Scattering (SIX) beamline, the team identified spots with missing oxygen atoms through resonant inelastic x-ray scattering (RIXS). Such oxygen vacancies are defects, which can absorb energy from qubits.

At the CFN, the team visualized film morphology using transmission electron microscopy and atomic force microscopy, and characterized the local chemical makeup near the film surface through electron energy-loss spectroscopy.

Sooyeon Hwang

“The microscope images showed grains (pieces of individual crystals with atoms arranged in the same orientation) sized larger or smaller depending on the sputtering technique,” explained coauthor Sooyeon Hwang, a staff scientist in the CFN Electron Microscopy Group. The smaller the grains, the more grain boundaries, or interfaces where different crystal orientations meet. According to the electron energy-loss spectra, one film had oxides not just on the surface but also in the film itself, with oxygen diffused into the grain boundaries.

Their experimental findings at the CFN and NSLS-II revealed correlations between qubit relaxation times and the number and width of grain boundaries and concentration of suboxides near the surface.

“Grain boundaries are defects that can dissipate energy, so having too many of them can affect electron transport and thus the ability of qubits to perform computations,” said Premkumar. “Oxide quality is another potentially important parameter. Suboxides are bad because electrons are not happily paired together.”

Going forward, the team will continue their partnership to understand qubit coherence through C2QA. One research direction is to explore whether relaxation times can be improved by optimizing fabrication processes to generate films with larger grain sizes (i.e., minimal grain boundaries) and a single oxidation state. They will also explore other superconductors, including tantalum, whose surface oxides are known to be more chemically uniform.

“From this study, we now have a blueprint for how scientists who make qubits and scientists who characterize them can collaborate to understand the microscopic mechanisms limiting qubit performance,” said Premkumar. “We hope other groups will leverage our collaborative approach to drive the field of superconducting qubits forward.”

Reference: “Microscopic relaxation channels in materials for superconducting qubits” by Anjali Premkumar, Conan Weiland, Sooyeon Hwang, Berthold Jäck, Alexander P. M. Place, Iradwikanari Waluyo, Adrian Hunt, Valentina Bisogni, Jonathan Pelliciari, Andi Barbour, Mike S. Miller, Paola Russo, Fernando Camino, Kim Kisslinger, Xiao Tong, Mark S. Hybertsen, Andrew A. Houck and Ignace Jarrige, 1 July 2021, Communications Materials. DOI: 10.1038/s43246-021-00174-7

This work was supported by the DOE Office of Science, National Science Foundation Graduate Research Fellowship, Humboldt Foundation, National Defense Science and Engineering Graduate Fellowship, Materials Research Science and Engineering Center, and Army Research Office. This research used resources of the Electron Microscopy, Proximal Probes, and Theory and Computation Facilities at the CFN, a DOE Nanoscale Science Research Center. The SST-2 beamline at NSLS-II is operated by the National Institute of Standards and Technology.

More:
Connecting the Dots Between Material Properties and Superconducting Qubit Performance - SciTechDaily

IBM CEO: Quantum computing will take off ‘like a rocket ship’ this decade – Fast Company

IBM is one of the best-known names in technology, and yet most people would struggle to explain exactly what the company does. That’s understandable. The Armonk, New York-based tech mainstay has often reinvented itself, and is in the process of becoming a hybrid cloud company that serves artificial intelligence services, a core business resembling Amazon’s AWS or Microsoft’s Azure.

“Going from tabulating machines to electronic calculators to the first semiconductor-based computers to today’s era of hybrid cloud and AI . . . that is a remarkable series of revolutions, not just evolutions,” IBM chairman and CEO Arvind Krishna told Fast Company technology editor Harry McCracken during an interview recorded for the Fast Company Innovation Festival.

At a practical level, IBM makes its money selling business software and middleware, hosting the data and content of enterprises, helping enterprises manage data that lives on their own servers, and providing a range of services related to everything from healthcare AI to nanotechnology.

Krishna boiled down IBM’s work as an effort to help businesses and organizations apply technology to their processes and systems. He gave examples, noting that IBM helped the Social Security Administration figure out people’s incomes so that it knew what to pay beneficiaries. It also helped a credit card company authorize payments without fraud, and it helps the Federal Reserve move trillions of dollars through the economy each day, he said.

Fast Company named IBM to its Most Innovative Companies list in 2020 for the company’s on-location incubators, which are helping small startups and organizations transform their business via tech, such as a wireless company that was able to turn older cars into voice-enabled smart cars.

IBM is already preparing for its next reinvention. Quantum computing spent a long time in the realm of the theoretical. Now it’s in the realm of research labs, including IBM’s. And soon (very soon, if you ask Krishna) it will escape the lab and begin making a real difference in the business world. IBM, of course, wants to be front and center when that happens.

IBM finished its first quantum system, the IBM Q System One, in 2019. The System One is a 20-qubit system, meaning that it operates on 20 quantum bits. These qubits are the basic units of quantum computing, the counterparts of the bits used in regular computers. Bits, however, can exist in only two states: one or zero. Qubits are subatomic, and quantum computers must operate at extremely cold temperatures so that the quantum state of the materials inside can be controlled. And most important, qubits can occupy two physical states at once; they can be zero and one at the same time. This is known as superposition in quantum physics, and it opens up new vistas of opportunity for scientists trying to model complex problems.

Superposition is why quantum computers may eventually be useful in solving problems that concern levels of risk, such as in logistics, supply chain management, or resource optimization.

“All of these problems are probabilistic in nature,” Krishna said. “A digital computer works at hard zeros and ones; you’re not trying to impose probability on it. Quantum computing is probabilistic by nature; it lives in that maybe/maybe-not state. And so those problems map naturally onto a quantum computer.”
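That maybe/maybe-not character shows up directly in measurement statistics: an equal superposition collapses to 0 or 1 with probability one half each, and only repeated runs (“shots”) reveal the distribution. A toy illustration in Python, using a coin flip as a stand-in for a real quantum measurement:

```python
import random

random.seed(7)  # fixed seed so the run is repeatable
shots = 10_000
counts = {0: 0, 1: 0}
for _ in range(shots):
    # Each "measurement" of an equal superposition yields 0 or 1 at random.
    outcome = 0 if random.random() < 0.5 else 1
    counts[outcome] += 1
print(counts[0] / shots, counts[1] / shots)  # each close to 0.5
```

A single run tells you almost nothing; the distribution over many shots is the answer, which is why quantum algorithms are framed in terms of probabilities rather than hard zeros and ones.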

Krishna believes that quantum computers, including IBM’s, will begin growing in size toward 1,000-qubit systems. “We put out a road map saying a thousand qubits by the end of 2023,” Krishna said. As the number of qubits grows, these systems will begin being used to solve real business problems, such as managing agricultural risk, in the latter half of this decade.

“And once it starts, it’s going to take off like a rocket ship . . .,” Krishna told McCracken. “Because let’s suppose one capital markets institution uses it to get a better price arbitrage on some financial instrument; don’t you think everybody else will want to do it then, instantly?”

The larger the systems, the broader the applications and addressable problems. Krishna noted that the technical barriers that must be overcome to get from 20- to 1,000-qubit systems exist in the realm of engineering problems. But getting from a thousand to a million qubits is a different matter.

“As we scale towards a million, I think we’ve got some fundamental issues in error correction, control, and maybe quantum physics that can rear their heads,” he said, adding that even those problems are solvable.

IBM remains one of the most prolific research organizations in the world. It recently designed the first semiconductor based on a 2-nanometer manufacturing process, which will soon be produced by Samsung and others. It’s likely that IBM will play a big role in the push toward big quantum computers that have the power to solve some of the world’s biggest problems.

Read more:
IBM CEO: Quantum computing will take off 'like a rocket ship' this decade - Fast Company

Infosys to Develop Quantum Computing Capabilities on AWS – HPCwire

BENGALURU, India, Sept. 22, 2021 – Infosys, a global leader in next-generation digital services and consulting, today announced a strategic collaboration with Amazon Web Services (AWS) to develop quantum computing capabilities and use cases. Infosys will use Amazon Braket to explore and build multiple use cases in quantum computing as part of Infosys Cobalt cloud offerings. Amazon Braket is a fully managed quantum computing service that helps scientists and developers get started with the technology and accelerate research and discovery.

Infosys will look to build, test, and evaluate quantum applications on circuit simulators and quantum hardware technologies using Amazon Braket. This will enable researchers and developers to experiment with and study complex computational problems as quantum technologies continue to evolve. Enterprises will get access to use cases for rapid experimentation, can explore how quantum computing could help them in a variety of areas, and can assess new ideas and plan adoption strategies to drive innovation. Infosys’s use of Amazon Braket aims to get businesses ready for a future in which quantum computers will impact business.

The Infosys Center for Emerging Technology Solutions (iCETS), which focuses on the incubation of next-generation services and offerings, is using Amazon Braket to develop quantum computing use cases in vehicle route optimization, fraud detection, and more. Infosys is also exploring partnership opportunities with startups in the quantum computing space through the Infosys Innovation Network (IIN). This collaboration further bolsters Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. It offers 14,000 cloud assets and over 200 industry cloud solution blueprints.

Ravi Kumar S, President, Infosys, said, “Infosys continues to be at the forefront of exploring and bringing new technologies to clients. Through our use of AWS in this space, we are bringing together the power of Amazon Braket and Infosys Cobalt to help enterprises build quantum computing capabilities and use cases to accelerate their cloud-powered transformation. We are exploring a variety of use cases from the logistics, finance, energy, and telecom sectors that can help clients evaluate future benefits and value that quantum computing could bring to their business. Enterprises can look forward to solving their various complex computational challenges with Infosys Cobalt and Amazon Braket.”

Matt Garman, Senior Vice President of Sales & Marketing at Amazon Web Services, Inc., said, “Quantum computing is an area of intense research, and a number of businesses around the world are asking about its timeline and the opportunities that it could open. At this stage, it’s important to be aware of and evaluate the potential future impact of quantum computing. Infosys, a long-standing AWS Premier Consulting Partner, has experience in incubating emerging technology solutions. We see this collaboration as an important step towards setting the right expectations when discussing business problems with customers where quantum computing could have a role.”

For more information on the collaboration, visit this link.

About Infosys

Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.

Visit www.infosys.com to see how Infosys can help your enterprise navigate your next.

Source: Infosys

Read the original here:
Infosys to Develop Quantum Computing Capabilities on AWS - HPCwire

The coevolution of particle physics and computing – Symmetry magazine

In the mid-twentieth century, particle physicists were peering deeper into the history and makeup of the universe than ever before. Over time, their calculations became too complex to fit on a blackboard, or to farm out to armies of human computers doing calculations by hand.

To deal with this, they developed some of the world’s earliest electronic computers.

Physics has played an important role in the history of computing. The transistor (the switch that controls the flow of electrical signals within a computer) was invented by a group of physicists at Bell Labs. The incredible computational demands of particle physics and astrophysics experiments have consistently pushed the boundaries of what is possible. They have encouraged the development of new technologies to handle tasks from dealing with avalanches of data to simulating interactions on the scales of both the cosmos and the quantum realm.

But this influence doesn’t just go one way. Computing plays an essential role in particle physics and astrophysics as well. As computing has grown increasingly sophisticated, its progress has enabled new scientific discoveries and breakthroughs.

Illustration by Sandbox Studio, Chicago with Ariel Davis

In 1973, scientists at Fermi National Accelerator Laboratory in Illinois got their first big mainframe computer: a 7-year-old hand-me-down from Lawrence Berkeley National Laboratory. Called the CDC 6600, it weighed about 6 tons. Over the next five years, Fermilab added five more large mainframe computers to its collection.

Then came the completion of the Tevatron (at the time, the world’s highest-energy particle accelerator), which would provide the particle beams for numerous experiments at the lab. By the mid-1990s, two four-story particle detectors would begin selecting, storing and analyzing data from millions of particle collisions at the Tevatron per second. Called the Collider Detector at Fermilab and the DZero detector, these new experiments threatened to overpower the lab’s computational abilities.

In December of 1983, a committee of physicists and computer scientists released a 103-page report highlighting the urgent need for an upgrading of the laboratory’s computer facilities. The report said the lab should continue the process of catching up in terms of computing ability, and that this should remain the laboratory’s top computing priority for the next few years.

Instead of simply buying more large computers (which were incredibly expensive), the committee suggested a new approach: They recommended increasing computational power by distributing the burden over clusters or farms of hundreds of smaller computers.

Thanks to Intel’s 1971 development of a new commercially available microprocessor the size of a domino, computers were shrinking. Fermilab was one of the first national labs to try the concept of clustering these smaller computers together, treating each particle collision as a computationally independent event that could be analyzed on its own processor.
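That farming-out strategy is what we would now call an embarrassingly parallel workload: each event is analyzed in isolation, so the work maps cleanly onto a pool of independent processors. A minimal modern sketch in Python, where `analyze` is a hypothetical stand-in for real event-reconstruction code:

```python
from multiprocessing import Pool

def analyze(event):
    # Hypothetical per-event analysis; each event needs no data from the
    # others, so events can be distributed across separate processors.
    return sum(event) / len(event)

if __name__ == "__main__":
    # Each inner list stands in for the measurements from one collision.
    events = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]]
    with Pool(processes=3) as pool:
        results = pool.map(analyze, events)  # one worker per event
    print(results)  # [2.0, 5.0, 8.0]
```

Because no event depends on another, adding more processors scales throughput almost linearly, which is exactly why the cluster model beat buying ever-larger mainframes.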

Like many new ideas in science, it wasnt accepted without some pushback.

Joel Butler, a physicist at Fermilab who was on the computing committee, recalls, “There was a big fight about whether this was a good idea or a bad idea.”

“A lot of people were enchanted with the big computers,” he says. “They were impressive-looking and reliable, and people knew how to use them. And then along came this swarm of little tiny devices, packaged in breadbox-sized enclosures.”

The computers were unfamiliar, and the companies building them weren’t well-established. On top of that, it wasn’t clear how well the clustering strategy would work.

As for Butler? “I raised my hand [at a meeting] and said, ‘Good idea,’ and suddenly my entire career shifted from building detectors and beamlines to doing computing,” he chuckles.

Not long afterward, innovation that sparked for the benefit of particle physics enabled another leap in computing. In 1989, Tim Berners-Lee, a computer scientist at CERN, launched the World Wide Web to help CERN physicists share data with research collaborators all over the world.

To be clear, Berners-Lee didn’t create the internet; that was already underway in the form of the ARPANET, developed by the US Department of Defense. But the ARPANET connected only a few hundred computers, and it was difficult to share information across machines with different operating systems.

The web Berners-Lee created was an application that ran on the internet, like email, and started as a collection of documents connected by hyperlinks. To get around the problem of accessing files between different types of computers, he developed HTML (HyperText Markup Language), a markup language that formatted and displayed files in a web browser independent of the local computer's operating system.

Berners-Lee also developed the first web browser, allowing users to access files stored on the first web server (Berners-Lee's computer at CERN). He implemented the concept of a URL (Uniform Resource Locator), specifying how and where to access desired web pages.

What started out as an internal project to help particle physicists share data within their institution fundamentally changed not just computing, but how most people experience the digital world today.

Back at Fermilab, cluster computing wound up working well for handling the Tevatron data. Eventually, it became an industry standard for tech giants like Google and Amazon.

Over the next decade, other US national laboratories adopted the idea, too. SLAC National Accelerator Laboratory, then called Stanford Linear Accelerator Center, transitioned from big mainframes to clusters of smaller computers to prepare for its own extremely data-hungry experiment, BaBar. Both SLAC and Fermilab were also early adopters of Berners-Lee's web server. The labs set up the first two websites in the United States, paving the way for this innovation to spread across the continent.

In 1989, in recognition of the growing importance of computing in physics, Fermilab Director John Peoples elevated the computing department to a full-fledged division. The head of a division reports directly to the lab director, making it easier to get resources and set priorities. Physicist Tom Nash formed the new Computing Division, along with Butler and two other scientists, Irwin Gaines and Victoria White. Butler led the division from 1994 to 1998.

"These computational systems worked well for particle physicists for a long time," says Berkeley Lab astrophysicist Peter Nugent. That is, until Moore's Law started grinding to a halt.

Moore's Law is the idea that the number of transistors in a circuit will double, making computers faster and cheaper, every two years. The term was first coined in the mid-1970s, and the trend reliably proceeded for decades. But now, computer manufacturers are starting to hit the physical limit of how many tiny transistors they can cram onto a single microchip.

Because of this, says Nugent, particle physicists have been looking to take advantage of high-performance computing instead.

Nugent says high-performance computing is "something more than a cluster, or a cloud-computing environment that you could get from Google or AWS, or at your local university."

What it typically means, he says, is that you have high-speed networking between computational nodes, allowing them to share information with each other very, very quickly. When you are computing on up to hundreds of thousands of nodes simultaneously, it massively speeds up the process.

On a single traditional computer, he says, 100 million CPU hours translates to more than 11,000 years of continuous calculations. But for scientists using a high-performance computing facility at Berkeley Lab, Argonne National Laboratory or Oak Ridge National Laboratory, 100 million hours is a typical, large allocation for one year at these facilities.
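
The arithmetic behind that comparison is easy to check: 100 million CPU hours worked off by a single core, versus the same allocation spread across a large number of cores running at once. The core count below is an illustrative assumption, not a figure from the article.

```python
# Back-of-the-envelope check of the CPU-hours comparison above.
cpu_hours = 100_000_000          # a 100-million-hour allocation
hours_per_year = 24 * 365        # 8,760 hours in a year

years_on_one_core = cpu_hours / hours_per_year
print(round(years_on_one_core))  # 11416 -- "more than 11,000 years"

# Spread across many cores at once (100,000 is an assumed, illustrative
# figure), the same allocation is consumed in about six weeks.
cores = 100_000
days_on_many_cores = cpu_hours / cores / 24
print(round(days_on_many_cores, 1))  # 41.7
```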

Although astrophysicists have always relied on high-performance computing for simulating the birth of stars or modeling the evolution of the cosmos, Nugent says they are now using it for their data analysis as well.

This includes rapid image-processing computations that have enabled the observations of several supernovae, including SN 2011fe, captured just after it began. "We found it just a few hours after it exploded, all because we were able to run these pipelines so efficiently and quickly," Nugent says.

According to Berkeley Lab physicist Paolo Calafiura, particle physicists also use high-performance computing for simulations: for modeling not the evolution of the cosmos, but rather what happens inside a particle detector. "Detector simulation is significantly the most computing-intensive problem that we have," he says.

Scientists need to evaluate multiple possibilities for what can happen when particles collide. To properly correct for detector effects when analyzing particle detector experiments, they need to simulate more data than they collect. "If you collect 1 billion collision events a year," Calafiura says, "you want to simulate 10 billion collision events."

Calafiura says that right now, he's more worried about finding a way to store all of the simulated and actual detector data than he is about producing it, but he knows that won't last.

"When does physics push computing?" he says. "When computing is not good enough. We see that in five years, computers will not be powerful enough for our problems, so we are pushing hard with some radically new ideas, and lots of detailed optimization work."

That's why the Department of Energy's Exascale Computing Project aims to build, in the next few years, computers capable of performing a quintillion (that is, a billion billion) operations per second. The new computers will be 1,000 times faster than the current fastest computers.

The exascale computers will also be used for other applications ranging from precision medicine to climate modeling to national security.

Innovations in computer hardware have enabled astrophysicists to push the kinds of simulations and analyses they can do. For example, Nugent says, the introduction of graphics processing units has sped up astrophysicists' ability to do calculations used in machine learning, leading to an explosive growth of machine learning in astrophysics.

With machine learning, which uses algorithms and statistics to identify patterns in data, astrophysicists can simulate entire universes in microseconds.

Machine learning has been important in particle physics as well, says Fermilab scientist Nhan Tran. "[Physicists] have very high-dimensional data, very complex data," he says. "Machine learning is an optimal way to find interesting structures in that data."

The same way a computer can be trained to tell the difference between cats and dogs in pictures, it can learn how to identify particles from physics datasets, distinguishing between things like pions and photons.
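
As a toy illustration of that idea, the sketch below "trains" a nearest-centroid classifier to separate two synthetic particle classes. The two features, their distributions, and the class means are all invented for illustration; real pion/photon separation uses far richer detector data and more capable models.

```python
import math
import random

random.seed(0)

# Hypothetical two-feature events: (shower_width, energy_fraction).
# The means and spreads below are made up, not real calorimetry.
def sample(label, n):
    mu = (1.0, 0.3) if label == "pion" else (0.4, 0.8)
    return [((random.gauss(mu[0], 0.1), random.gauss(mu[1], 0.1)), label)
            for _ in range(n)]

train = sample("pion", 200) + sample("photon", 200)

# "Training" here is just computing each class's centroid in feature space.
centroids = {}
for lbl in ("pion", "photon"):
    pts = [x for x, y in train if y == lbl]
    centroids[lbl] = tuple(sum(c) / len(pts) for c in zip(*pts))

def classify(x):
    # Assign the event to whichever class centroid is nearest.
    return min(centroids, key=lambda lbl: math.dist(x, centroids[lbl]))

test_set = sample("pion", 50) + sample("photon", 50)
accuracy = sum(classify(x) == y for x, y in test_set) / len(test_set)
print(f"accuracy: {accuracy:.2f}")  # near-perfect on this easy toy data
```

The point is the shape of the workflow, not the model: labeled examples in, a learned decision rule out, exactly as with the cats-and-dogs analogy.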

Tran says using computation this way can accelerate discovery. "As physicists, we've been able to learn a lot about particle physics and nature using non-machine-learning algorithms," he says. "But machine learning can drastically accelerate and augment that process, and potentially provide deeper insight into the data."

And while teams of researchers are busy building exascale computers, others are hard at work trying to build another type of supercomputer: the quantum computer.

Remember Moore's Law? Previously, engineers were able to make computer chips faster by shrinking the size of electrical circuits, reducing the amount of time it takes for electrical signals to travel. "Now our technology is so good that literally the distance between transistors is the size of an atom," Tran says. "So we can't keep scaling down the technology and expect the same gains we've seen in the past."

To get around this, some researchers are redefining how computation works at a fundamental level. Like, really fundamental.

The basic unit of data in a classical computer is called a bit, which can hold one of two values: 1, if it has an electrical signal, or 0, if it has none. But in quantum computing, data is stored in quantum systemsthings like electrons, which have either up or down spins, or photons, which are polarized either vertically or horizontally. These data units are called qubits.

Here's where it gets weird. Through a quantum property called superposition, qubits have more than just two possible states. An electron can be up, down, or in a variety of states in between.

What does this mean for computing? A collection of three classical bits can exist in only one of eight possible configurations: 000, 001, 010, 100, 011, 110, 101 or 111. But through superposition, three qubits can be in all eight of these configurations at once. A quantum computer can use that information to tackle problems that are impossible to solve with a classical computer.
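
That bookkeeping can be made concrete: a three-qubit state is described by eight amplitudes, one per classical configuration, and an equal superposition (the result of putting each qubit individually into superposition) gives every configuration the same amplitude. This is a plain state-vector sketch in ordinary Python, not a real quantum-computing library, and it ignores complex phases.

```python
import math

# State vector for three qubits: 2**3 = 8 amplitudes, one for each
# classical configuration 000 through 111. An equal superposition
# assigns each configuration amplitude 1/sqrt(8).
n = 3
amps = [1 / math.sqrt(2 ** n)] * (2 ** n)

for i, a in enumerate(amps):
    # Probability of measuring a configuration is the amplitude squared.
    print(f"|{i:03b}>  amplitude {a:.4f}  probability {a * a:.4f}")

# The eight probabilities sum to 1: a measurement collapses the state
# to exactly one classical bit string.
print(sum(a * a for a in amps))  # 1.0, up to floating-point rounding
```

Note the contrast with the classical case: three classical bits require only three numbers to describe, while three qubits require all eight amplitudes, and the gap doubles with every added qubit.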

Fermilab scientist Aaron Chou likens quantum problem-solving to throwing a pebble into a pond. "The ripples move through the water in every possible direction, simultaneously exploring all of the possible things that it might encounter."

In contrast, a classical computer can only move in one direction at a time.

But this makes quantum computers faster than classical computers only when it comes to solving certain types of problems. "It's not like you can take any classical algorithm and put it on a quantum computer and make it better," says University of California, Santa Barbara physicist John Martinis, who helped build Google's quantum computer.

Although quantum computers work in a fundamentally different way than classical computers, designing and building them wouldn't be possible without traditional computing laying the foundation, Martinis says. "We're really piggybacking on a lot of the technology of the last 50 years or more."

The kinds of problems that are well suited to quantum computing are intrinsically quantum mechanical in nature, says Chou.

For instance, Martinis says, consider quantum chemistry. Solving quantum chemistry problems with classical computers is so difficult, he says, that 10 to 15% of the world's supercomputer usage is currently dedicated to the task. Quantum chemistry problems are hard for the very reason a quantum computer is powerful: because to complete them, you have to consider all the different quantum-mechanical states of all the individual atoms involved.

Because making better quantum computers would be so useful in physics research, and because building them requires skills and knowledge that physicists possess, physicists are ramping up their quantum efforts. In the United States, the National Quantum Initiative Act of 2018 called for the National Institute of Standards and Technology, the National Science Foundation and the Department of Energy to support programs, centers and consortia devoted to quantum information science.

In the early days of computational physics, the line between who was a particle physicist and who was a computer scientist could be fuzzy. Physicists used commercially available microprocessors to build custom computers for experiments. They also wrote much of their own software, ranging from printer drivers to the software that coordinated the analysis between the clustered computers.

Nowadays, roles have somewhat shifted. Most physicists use commercially available devices and software, allowing them to focus more on the physics, Butler says. But some people, like Anshu Dubey, work right at the intersection of the two fields. Dubey is a computational scientist at Argonne National Laboratory who works with computational physicists.

When a physicist needs to computationally interpret or model a phenomenon, sometimes they will sign up a student or postdoc in their research group for a programming course or two and then ask them to write the code to do the job. Although these codes are mathematically complex, Dubey says, they aren't logically complex, making them relatively easy to write.

A simulation of a single physical phenomenon can be neatly packaged within fairly straightforward code. "But the real world doesn't want to cooperate with you in terms of its modularity and encapsularity," she says.

Multiple forces are always at play, so to accurately model real-world complexity, you have to use more complex software, ideally software that doesn't become impossible to maintain as it gets updated over time. "All of a sudden," says Dubey, "you start to require people who are creative in their own right, in terms of being able to architect software."

That's where people like Dubey come in. At Argonne, Dubey develops software that researchers use to model complex multi-physics systems, incorporating processes like fluid dynamics, radiation transfer and nuclear burning.

Hiring computer scientists for research projects in physics and other fields of science can be a challenge, Dubey says. Most funding agencies specify that research money can be used for hiring students and postdocs, but not for software development or hiring dedicated engineers. "There is no viable career path in academia for people whose careers are like mine," she says.

In an ideal world, universities would establish endowed positions for a team of research software engineers in physics departments with a nontrivial amount of computational research, Dubey says. These engineers would write reliable, well-architected code, and their institutional knowledge would stay with a team.

Physics and computing have been closely intertwined for decades. However the two develop, toward new analyses using artificial intelligence, for example, or toward the creation of better and better quantum computers, it seems they will remain on this path together.

Original post:
The coevolution of particle physics and computing - Symmetry magazine