Archive for the ‘Quantum Computing’ Category

Quantum computing just might save the planet – McKinsey

The emerging technology of quantum computing could revolutionize the fight against climate change, transforming the economics of decarbonization and becoming a major factor in limiting global warming to the target temperature of 1.5C (see sidebar, "What is quantum computing?").

Even though the technology is in the early stages of development (experts estimate the first generation of fault-tolerant quantum computing will arrive in the second half of this decade), breakthroughs are accelerating, investment dollars are pouring in, and start-ups are proliferating. Major tech companies have already developed small, so-called noisy intermediate-scale quantum (NISQ) machines, though these aren't capable of performing the type of calculations that fully capable quantum computers are expected to perform.

Countries and corporates set ambitious new targets for reducing emissions at the 2021 United Nations Climate Change Conference (COP26). Those goals, if fully met, would represent an extraordinary annual investment of $4 trillion by 2030, the largest reallocation of capital in human history. But the measures would only reduce warming to between 1.7C and 1.8C by 2050, far short of the 1.5C level believed necessary to avoid catastrophic, runaway climate change.

Meeting the goal of net-zero emissions that countries and some industries have committed to won't be possible without huge advances in climate technology that aren't achievable today. Even the most powerful supercomputers available now are not able to solve some of these problems. Quantum computing could be a game changer in those areas. In all, we think quantum computing could help develop climate technologies able to abate carbon on the order of 7 gigatons a year of additional CO2 impact by 2035, with the potential to bring the world in line with the 1.5C target.

Quantum computing could help reduce emissions in some of the most challenging or emissions-intensive areas, such as agriculture or direct-air capture, and could accelerate improvements in technologies required at great scale, such as solar panels or batteries. This article offers a look at some of the breakthroughs the technology could permit and attempts to quantify the impact of leveraging quantum-computing technology that is expected to become available this decade.

Quantum computing could bring about step changes throughout the economy that would have a huge impact on carbon abatement and carbon removal, including by helping to solve persistent sustainability problems such as curbing methane produced by agriculture, making the production of cement emissions-free, improving electric batteries for vehicles, developing significantly better renewable solar technology, finding a faster way to bring down the cost of hydrogen to make it a viable alternative to fossil fuels, and using green ammonia as a fuel and a fertilizer.

Addressing the five areas designated in the Climate Math Report as key for decarbonization, we have identified quantum-computing use cases that can pave the way to a net-zero economy. We project that by 2035 the use cases listed below could make it possible to eliminate more than 7 gigatons of CO2 equivalent (CO2e) from the atmosphere a year, compared with the current trajectory, or in aggregate more than 150 gigatons over the next 30 years (Exhibit 1).

Exhibit 1

Batteries are a critical element of achieving zero-carbon electrification. They are required to reduce CO2 emissions from transportation and to obtain grid-scale energy storage for intermittent energy sources such as solar cells or wind.

Improving the energy density of lithium-ion (Li-ion) batteries enables applications in electric vehicles and energy storage at an affordable cost. Over the past ten years, however, innovation has stalled: battery energy density improved 50 percent between 2011 and 2016, but only 25 percent between 2016 and 2020, and is expected to improve by just 17 percent between 2020 and 2025.

Recent research has shown that quantum computing will be able to simulate the chemistry of batteries in ways that can't be achieved now. Quantum computing could allow breakthroughs by providing a better understanding of electrolyte complex formation, by helping to find replacement cathode or anode materials with the same properties, and/or by eliminating the battery separator.

As a result, we could create batteries with 50 percent higher energy density for use in heavy-goods electric vehicles, which could substantially bring forward their economic use. The carbon benefits to passenger EVs wouldn't be huge, as these vehicles are expected to reach cost parity in many countries before the first generation of quantum computers is online, but consumers might still enjoy cost savings.

In addition, higher-density energy batteries can serve as a grid-scale storage solution. The impact on the world's grids could be transformative. Halving the cost of grid-scale storage could enable a step change in the use of solar power, which is becoming economically competitive but is challenged by its generation profile. Our modeling suggests that halving the cost of solar panels could increase their use by 25 percent in Europe by 2050, but halving the cost of both solar and batteries might increase solar use by 60 percent (Exhibit 2). Geographies without such a high carbon price will see even greater impacts.

Exhibit 2

Through the combination of use cases described above, improved batteries could bring about an additional reduction in carbon dioxide emissions of 1.4 gigatons by 2035.

Many parts of industry produce emissions that are either extremely expensive or logistically challenging to abate.

Cement is a case in point. During calcination in the kiln, part of the process of making clinker (a powder used to make cement), CO2 is released from the raw materials. This process accounts for approximately two-thirds of cement emissions.
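The scale of these process emissions follows directly from the calcination reaction, CaCO3 -> CaO + CO2. The back-of-envelope check below (illustrative only, not part of the McKinsey analysis) uses standard molar masses and assumes a typical clinker CaO content of about 65 percent; actual emission factors vary with raw-meal composition.

```python
# Back-of-envelope check of process emissions from calcination:
# CaCO3 -> CaO + CO2, using standard molar masses (g/mol).
M_CACO3, M_CAO, M_CO2 = 100.09, 56.08, 44.01

co2_per_kg_limestone = M_CO2 / M_CACO3        # ~0.44 kg CO2 per kg CaCO3 calcined
cao_fraction_of_clinker = 0.65                # assumed typical CaO share of clinker
co2_per_kg_clinker = cao_fraction_of_clinker * M_CO2 / M_CAO

print(f"{co2_per_kg_limestone:.2f} kg CO2 per kg limestone calcined")
print(f"{co2_per_kg_clinker:.2f} kg CO2 per kg clinker from calcination alone")
# ~0.51 kg/kg before any fuel-related emissions from heating the kiln,
# consistent with calcination dominating cement's process emissions
```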

Alternative cement-binding materials (or clinkers) can eliminate these emissions, but there's currently no mature alternative clinker that can significantly reduce emissions at an affordable cost.

There are many possible permutations for such a product, but testing by trial and error is time-consuming and costly. Quantum computing can help to simulate theoretical material combinations to find one that overcomes today's challenges: durability, availability of raw materials, and efflorescence (in the case of alkali-activated binders). This would have an estimated additional impact of 1 gigaton a year by 2035.

Solar cells will be one of the key electricity-generation sources in a net-zero economy. But even though they are getting cheaper, they still are far from their theoretical maximum efficiency.

Today's solar cells rely on crystalline silicon and have an efficiency on the order of 20 percent. Solar cells based on perovskite crystal structures, which have a theoretical efficiency of up to 40 percent, could be a better alternative. They present challenges, however, because they lack long-term stability and could, in some varieties, be more toxic. Furthermore, the technology has not been mass produced yet.

Quantum computing could help tackle these challenges by allowing for precise simulation of perovskite structures in all combinations using different base atoms and doping, thereby identifying higher efficiency, higher durability, and nontoxic solutions. If the theoretical efficiency increase can be reached, the levelized cost of electricity (LCOE) would decrease by 50 percent.
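The 50 percent figure is consistent with a simple proportionality: if module cost per unit area stays fixed, LCOE scales roughly inversely with cell efficiency. The sketch below is a rough illustration of that relationship only; real LCOE models also include balance-of-system, financing, and degradation terms that dilute the effect.

```python
# Illustrative only: hold module cost per square metre fixed and treat LCOE as
# cost divided by lifetime energy yield, so LCOE scales inversely with cell
# efficiency. Real LCOE models add balance-of-system and financing terms.
def relative_lcoe(efficiency: float, baseline_efficiency: float = 0.20) -> float:
    return baseline_efficiency / efficiency

for eff in (0.20, 0.30, 0.40):
    print(f"{eff:.0%} efficient cell -> LCOE at {relative_lcoe(eff):.0%} of today's level")
# A 40% efficient cell lands at 50% of today's LCOE, the reduction cited above.
```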

By simulating the impact of cheaper and more efficient quantum-enabled solar panels, we see a significant increase in use in areas with lower carbon prices (China, for example). This is also true of countries in Europe with high irradiance (Spain, Greece) or poor conditions for wind energy (Hungary). The impact is magnified when combined with cheap battery storage, as discussed above.

This technology could abate an additional 0.4 gigatons of CO2 emissions by 2035.

Hydrogen is widely considered to be a viable replacement for fossil fuels in many parts of the economy, especially in industries where high temperature is needed and electrification isn't possible or sufficient, or where hydrogen is needed as a feedstock, such as steelmaking or ethylene production.

Before the 2022 gas price spikes, green hydrogen was about 60 percent more expensive than natural gas. But improving electrolysis could significantly decrease the cost of hydrogen.

Polymer electrolyte membrane (PEM) electrolyzers split water and are one way to make green hydrogen. They have improved in recent times but still face two major challenges.

Quantum computing can help model the energy state of pulse electrolysis to optimize catalyst usage, which would increase efficiency. Quantum computing could also model the chemical composition of catalysts and membranes to ensure the most efficient interactions. And it could push the efficiency of the electrolysis process up to 100 percent and reduce the cost of hydrogen by 35 percent. If combined with cheaper solar cells discovered by quantum computing (discussed above), the cost of hydrogen could be reduced by 60 percent (Exhibit 3).

Exhibit 3

Increased hydrogen use as a result of these improvements could reduce CO2 emissions by an additional 1.1 gigatons by 2035.

Ammonia is best known as a fertilizer, but it could also be used as a fuel, potentially making it one of the best decarbonization solutions for the world's ships. Today, it represents 2 percent of total global final energy consumption.

For the moment, ammonia is made through the energy-intensive Haber-Bosch process using natural gas. There are several options for creating green ammonia, but they rely on similar processes. For example, green hydrogen can be used as a feedstock, or the carbon dioxide emissions that are caused by the process can be captured and stored.

However, there are other potential approaches, such as nitrogenase bioelectrocatalysis, which is how nitrogen fixation works naturally when plants take nitrogen gas directly from the air and nitrogenase enzymes catalyze its conversion into ammonia. This method is attractive because it can be done at room temperature and at 1 bar pressure, compared with 500C at high pressure using Haber-Bosch, which consumes large amounts of energy (in the form of natural gas) (Exhibit 4).

Exhibit 4

Innovation has reached a stage where it might be possible to replicate nitrogen fixation artificially, but only if we can overcome challenges such as enzyme stability, oxygen sensitivity, and low rates of ammonia production by nitrogenase. The concept works in the lab but not at scale.

Quantum computing can help simulate the process of enhancing the stability of the enzyme, protecting it from oxygen, and improving the rate of ammonia production by nitrogenase. That would result in a 67 percent cost reduction over today's green ammonia produced through electrolysis, which would make green ammonia even cheaper than traditionally produced ammonia. Such a cost reduction could not only lessen the CO2 impacts of producing ammonia for agricultural use but could also bring the breakeven point for ammonia in shipping (where it is expected to be a major decarbonization option) forward by ten years.

Using quantum computing to facilitate cheaper green ammonia as a shipping fuel could abate an additional 0.4 gigatons of CO2 by 2035.

Carbon capture is required to achieve net zero. Both types of carbon capture, point source and direct air, could be aided by quantum computing.

Point-source carbon capture allows CO2 to be captured directly from industrial sources such as a cement kiln or steel blast furnace. But the vast majority of CO2 capture is too expensive to be viable for now, mainly because it is energy intensive.

One possible solution: novel solvents, such as water-lean and multiphase solvents, which could offer lower energy requirements. But it is difficult to predict the properties of these potential materials at a molecular level.

Quantum computing promises to enable more accurate modeling of molecular structure to design new, effective solvents for a range of CO2 sources, which could reduce the cost of the process by 30 to 50 percent.

We believe this has significant potential to decarbonize industrial processes, including cement, which could lead to additional decarbonization of up to 1.5 gigatons a year. Even if the cement clinker approach described above is successful, point-source capture would still have an effect of 0.5 gigatons a year, due to fuel emissions. In addition, alternative clinkers may not be available in some regions.

Direct-air capture, which involves sucking CO2 from the air, is a way to address carbon removals. While the Intergovernmental Panel on Climate Change says this approach is required to achieve net zero, it is very expensive (ranging from $250 to $600 per ton today) and even more energy intensive than point-source capture.

Adsorbents are best suited for effective direct-air capture, and novel approaches, such as metal-organic frameworks (MOFs), have the potential to greatly reduce the energy requirements and the capital cost of the infrastructure. MOFs act like a giant sponge (as little as a gram can have a surface area larger than a football field) and can absorb and release CO2 at far lower temperature changes than conventional technology.

Quantum computing can help advance research on novel adsorbents such as MOFs and resolve challenges arising from sensitivity to oxidation, water, and degradation caused by CO2.

Novel adsorbents that have a higher adsorption rate could reduce the cost of the technology to $100 per ton of CO2e captured. This could be a critical threshold for uptake, given that corporate climate leaders such as Microsoft have publicly announced an expectation to pay $100 a ton long term for the highest-quality carbon removals. This would lead to an additional CO2 reduction of 0.7 gigatons a year by 2035.

Twenty percent of annual greenhouse-gas emissions come from agriculture, and methane emitted by cattle and dairy is the primary contributor (7.9 gigatons of CO2e, based on 20-year global-warming potential).

Research has established that low-methane feed additives could effectively stop up to 90 percent of methane emissions. Yet applying those additives for free-range livestock is particularly difficult.

An alternative solution is an antimethane vaccine that produces methanogen-targeting antibodies. This method has had some success in lab conditions, but in a cow's gut, churning with gastric juices and food, the antibodies struggle to latch on to the right microbes. Quantum computing could accelerate the research to find the right antibodies through precise molecular simulation instead of a costly and lengthy trial-and-error process. With estimated uptake determined according to data from the US Environmental Protection Agency, we arrive at carbon reduction of up to an additional 1 gigaton a year by 2035.

Another prominent use case in agriculture is green ammonia, discussed as a fuel above; today's Haber-Bosch process uses large amounts of natural gas. Using such an alternative process could have an additional impact of up to 0.25 gigatons a year by 2035, replacing current conventionally produced fertilizers.

There are many more ways that quantum computing could be applied to the fight against climate change. Future possibilities include identification of new thermal-storage materials, high-temperature superconductors as a future basis for lower losses in grids, or simulations to support nuclear fusion. Use cases aren't limited to climate mitigation but can also apply to adaptation, for example, improvements in weather prediction to give greater warning of major climatic events. But progress on those innovations will have to wait because first-generation machines will not be powerful enough for such breakthroughs (see sidebar, "Methodology").

The leap in CO2 abatement could be a major opportunity for corporates. With $3 to $5 trillion in value at stake in sustainability, according to McKinsey research, climate investment is an imperative for big companies. The use cases presented above represent major shifts and potential disruptions in these areas, and they are associated with huge value for players who take the lead. This opportunity is recognized by industry leaders who are already developing capabilities and talent.

Nevertheless, quantum technology is in the early stage and comes with the risks linked to leading-edge technology development, as well as tremendous cost. We have highlighted the stage of the industry in the Quantum Technology Monitor. The risk to investors can be mitigated somewhat through steps such as onboarding technical experts to run in-depth diligence, forming joint investments with public entities or consortia, and investing in companies that bundle various ventures under one roof and provide the necessary experience to set up and scale these ventures.

In addition, governments have an important role to play by creating programs at universities to develop quantum talent and by providing incentives for quantum innovation for climate, particularly for use cases that today do not have natural corporate partners, such as disaster prediction, or that aren't economical, such as direct-air capture. Governments could start more research programs like the partnership between IBM and the United Kingdom, the collaboration between IBM and Fraunhofer-Gesellschaft, the public-private partnership Quantum Delta in the Netherlands, and the collaboration between the United States and the United Kingdom. By tapping into quantum computing for sustainability, countries will accelerate the green transition, achieve national commitments, and get a head start in export markets. But even with those measures, the risk and expense remain high (Exhibit 5).

Exhibit 5

Here are some questions corporates and investors need to ask before taking a leap into quantum computing.

Is quantum computing relevant for you?

Determine whether there are use cases that can potentially disrupt your industry or your investments and address the decarbonization challenges of your organization. This article has highlighted anecdotal use cases across several categories to showcase the potential impact of quantum computing, but we've identified more than 100 sustainability-relevant use cases where quantum computing could play a major role. Quickly identifying use cases that are applicable to you and deciding how to address them can be highly valuable, as talent and capacity will be scarce in this decade.

How do I approach quantum computing now, if it is relevant?

Once you have decided to engage on quantum computing, building the right kind of approach, mitigating risk, and securing access to talent and capacity are key.

Because of the high cost of this research, corporates can maximize their impact by forming partnerships with other players from their value chains and pooling expense and talent. For example, major consumers of hydrogen might join up with electrolyzer manufacturers to bring down the cost and share the value. These arrangements will require companies to figure out how to share innovation without losing competitive advantage. Collaborations such as joint ventures or precompetitive R&D could be an answer. We also foresee investors willing to support such endeavors to potentially remove some of the risk for corporates. And there are large amounts of dedicated climate finance available, judging by pledges made at COP26 that aim to reach the target of $100 billion a year in spending.

Do I have to start now?

While the first fault-tolerant quantum computer is several years away, it is important to start development work now. There is significant prework to be done to get to a maximal return on the significant investment that application of quantum computing will require.

Determining the exact parameters of a given problem and finding the best possible application will mean collaboration between application experts and quantum-computing technicians well versed in algorithm development. We estimate algorithm development would take up to 18 months, depending on the complexity.

It will also take time to set up the value chain, production, and go-to-market to ensure they are ready when quantum computing can be deployed and to fully benefit from the value created.

Quantum computing is a revolutionary technology that could allow for precise molecular-level simulation and a deeper understanding of nature's basic laws. As this article shows, its development over the next few years could help solve scientific problems that until recently were believed to be insoluble. Clearing away these roadblocks could make the difference between a sustainable future and climate catastrophe.

Making quantum computing a reality will require an exceptional mobilization of resources, expertise, and funds. Only close cooperation between governments, scientists, academics, and investors in developing this technology can make it possible to reach the target for limiting emissions that will keep global warming at 1.5C and save the planet.

More:
Quantum computing just might save the planet - McKinsey

Seun Omonije took the gridiron-to-quantum-computing route – Yale News

Growing up as a scholar-athlete in Texas, Seun Omonije already knew plenty of football moves before he arrived in New Haven nearly four years ago.

But Yale taught the 22-year-old graduating senior from Silliman College a new move: the quantum pivot.

In 2020, after the Ivy League canceled the football season due to public health concerns over COVID-19, Omonije, a wide receiver, was able to shift more of his attention to his computer science major. That decision, he says, led him to new friendships, new research opportunities, and a clear path to a career in quantum computing.

"I made a pivot," Omonije says. "I still trained my hardest, physically, but I kept doing more coding, kept learning new quantum concepts. I found ways to apply myself to other things."

Omonije is a founding board member of the Yale Undergraduate Quantum Computing (YuQC) group, whose 2021 Quantum Coalition Hack with Stanford attracted 2,100 entrants from 80 countries, and he served as a teaching assistant for a software engineering course on campus.

He has completed a pair of software engineering internships at Google, one working remotely with Google Cloud in Los Angeles and the other with Google Quantum AI in the San Francisco Bay Area. Omonije helped build the first tool for 3D visualization of quantum circuits, among other models in quantum theory, using Python and TypeScript. After graduation, he'll be moving to Los Angeles and taking a full-time job at Google Quantum AI.

He credits Yale for his quantum cred.

"The faculty and coursework exposed me to so many areas of fundamental quantum theory and computation, and ways I can apply those concepts," Omonije says. "I was actually doing research on quantum control systems and quantum software. Yale showed me what was possible in quantum computing."

Now that he's nearly finished his senior year, a year in which injuries kept him off the playing field during games, Omonije has begun to reflect on his Yale experience. He said he is grateful for the friendships he's made among his football teammates, his close-knit community at Silliman, and his fellow students at YuQC. Likewise, he's glad to have had the chance to sample so much that Yale had to offer, including the fine arts and performing arts.

"When I came to Yale, I didn't know what I wanted to do or who I wanted to be," he says. "I'm proud of the personal growth I experienced, that through all of the stuff that didn't go the way I planned, I still accomplished so many things."

"All things considered, this is just the beginning of my journey, and I'm looking forward to putting my head down and getting to work on this next chapter of life."

The rest is here:
Seun Omonije took the gridiron-to-quantum-computing route - Yale News

SAP innovation wing aims to shape future of ERP technology – TechTarget

Few people understand SAP's research into emerging technologies like blockchain, AI, quantum computing and the metaverse better than Martin Heinig and Yaad Oren.

Heinig heads New Ventures and Technologies, a group of several hundred people working in labs to define SAP innovation and long-term strategy. "We look at technologies that have the potential to disrupt the market," he said.

Oren heads a subgroup, the SAP Innovation Center Network, which he likened to a high school for research projects. "Once they graduate, they move into the real world," he said.

Heinig's is one group within an SAP R&D function that he said is divided into three parts: academic groups like the Hasso Plattner Institute, namesake of SAP's co-founder, which take the longest view; product engineering, which operates on a roughly two-year timeframe; and his own group in between.

"We sit right in the middle," Heinig said. "We're looking at opportunities for SAP that are five-plus years out and then try to figure out what something can be. We create prototypes to find out if we can really build it and if is feasible for customers."

A common theme of these technological investigations emerged during separate interviews with Heinig and Oren at this month's SAP Sapphire 2022 conference in Orlando. Much of the work serves the ambitious goal of extending business processes beyond the walls of an organization. Doing so requires breaking business processes into smaller pieces that can be securely shared between software systems and corporate entities. Besides being shareable, these new processes are designed to be autonomous, "smart" and composable so they can be endlessly reconfigured to do exactly what people need them to do.

The interviews are combined and edited for brevity and clarity. Heinig and Oren both were emphatic that mentioning a technology does not mean SAP is committed to productizing it.

What important technologies are you working on that could lead to major changes in ERP?

Martin Heinig: Quantum computing, definitely, but it can also be simulated quantum computers like digital annealers, where we have a step change in computing power that can open up new scenarios. For example, in the supply chain, when you have optimization problems that would take not hours but minutes.

Things like homomorphic encryption can also be a game changer. The beauty of it is you can do analytics on encrypted data, so it will not reveal the actual information but you can still do some basic calculations. For example, I can give you sales data, but it would be encrypted so you don't know the company that I'm working with, but you would see the order number or quantity. Whole industries could package the data and do analytics. It could be interesting in the healthcare sector, where you don't want to reveal patient names.

The problem is it still requires a lot of computing time, so we need to go into the hardware space with partners and find out if there are some technologies, like specific chips, that can be a kind of coprocessor to minimize the penalty on the computing side.
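Heinig's "analytics on encrypted data" example can be illustrated with an additively homomorphic scheme. The Python sketch below uses a toy Paillier cryptosystem (an illustrative stand-in, not the scheme SAP is evaluating, with keys far too small for real use): multiplying ciphertexts yields an encryption of the sum, so an aggregator can total order quantities it cannot read.

```python
import math
import random

# Toy Paillier keypair. Paillier is additively homomorphic: multiplying two
# ciphertexts yields an encryption of the sum of the plaintexts.
# The primes here are tiny for illustration; real keys use thousands of bits.
p, q = 293, 433
n, n_sq, g = p * q, (p * q) ** 2, p * q + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)   # inverse of L(g^lam mod n^2) mod n

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

# "Analytics on encrypted data": total the order quantities without decrypting
# any individual order.
orders = [120, 75, 310]
encrypted_total = math.prod(encrypt(m) for m in orders) % n_sq
print(decrypt(encrypted_total))   # 505 == sum(orders)
```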

When do you think quantum computing could be practical for business use?

Heinig: We see a lot of progress, and the number of qubits is increasing tremendously, but we have not found a quantum computer that can solve real-world problems yet. It's hard to estimate, but it's not 10 years out. Maybe the first real use cases are three to five years out.

We're currently testing how it would work. The scenarios we are looking at are more in the optimization space, like supply chain warehouse management or production planning, where you have a lot of very complex problems to solve that need a lot of compute power. We try out how to translate these kinds of problems into quantum computing language.

Why should people care? What's going to be so great about quantum computing?

Yaad Oren: It's a whole new paradigm for computing. It's not only the computation but how people will build software.

With classical programming, you interact with the processor in a certain way. If you have a quantum processor, you interact differently. Even the development languages get affected.

The disruptive potential is across the stack, from infrastructure to platform to software development languages.

There is a lot of hype, but SAP is currently looking at three areas where we see the potential for disruption.

The first is optimization. Quantum computing is not good for arithmetic, like one plus one equals two, but it's very good for combinatorial problems like the traveling salesman problem, when you have many nodes and a factorial level of complexity.

Optimization problems fit quantum computing like a glove -- for example, supply chain optimization, when you have so many parameters to evaluate regarding the route, pandemic regulations and weather.
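The factorial blow-up Oren describes is easy to see in code. The brute-force sketch below (site count and distances are made up for illustration) scores every tour of eight sites with the start fixed, already 5,040 candidates; each additional site multiplies the search space again, which is why such problems attract quantum and quantum-inspired optimizers.

```python
import math
import random
from itertools import permutations

# Brute-force traveling salesman over 8 made-up sites: with the start fixed,
# there are (8-1)! = 5,040 tours to score, and every extra site multiplies
# the count again.
random.seed(0)
n = 8
coords = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n)]
dist = [[math.dist(a, b) for b in coords] for a in coords]

def tour_length(order):
    return sum(dist[order[i]][order[(i + 1) % n]] for i in range(n))

best = min(((0,) + rest for rest in permutations(range(1, n))), key=tour_length)
print(f"{math.factorial(n - 1)} tours checked; best length {tour_length(best):.1f}")
```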

We're using a lot of quantum simulations -- partner solutions, quantum annealing and other technologies -- because the quantum computer is not there yet. SAP is also involved in a government-funded project with another German company on building quantum as a service. We use a lot of simulation technologies that have already helped us understand the power of this.

The second is called post-quantum cryptography, which is the security and encryption aspect of quantum, a big thing given the number of phishing attacks, ransomware and password hacks.

Quantum holds a lot of promise to create encryption at a level the industry has never known before. It's about creating a new type of password that is not breakable. The quantum code is the means to the end.

In browsers today, you have auto-generated passwords that are done by algorithms. Quantum computing can give birth to new types of algorithms that create passwords at a new level of complexity.

Eventually any password is breakable if you spend enough time. With quantum computing, this becomes much harder or impossible. Of course, it's a matter of time until hackers catch up.

The third benefit of quantum computing is AI. Machine learning is based on unique data, and you need computation power to train models. With quantum computing, you can create new types of AI models and applications that you couldn't train before, because now you have a strong computer that can learn more and solve new problems. It will give birth to new types of automation and predictive analytics.

What kind of research are you doing in AI and machine learning?

Heinig: Enterprise knowledge graphs are a concept based on machine learning technology that we're looking into. It's basically the idea of modeling the connections between business objects and bringing in the relationships between them. This is a very important technology for creating context for situational awareness and personalization.

Oren: Regarding the future of AI and analytics, we have a lot of advances in this area. SAP is also focusing on infusing AI into the core application.

We are working a lot on the future of planning and introducing new types of AI like reinforcement learning to create new types of simulations.

Today, if you want to have planning solutions, you cannot always get the full perspective on uncertainties, and you cannot have recommendations and simulations for scenarios that you didn't ask about.

We're working on a self-learning system that provides continuous intelligence. It's not a product yet, but we're working on it with customers. You don't need to train the model and build the machine learning model yourself. It can keep learning even in areas that you didn't explicitly ask to explore, to fight uncertainty. This was requested by customers during the coronavirus pandemic and all the disruption in supply chain, where you need to deal with a lot of uncertainty.

Does the metaverse have implications for ERP, realistically?

Heinig: Yes, but the question is when and to what extent? What does it mean from a process perspective? Companies have already crossed the borders between physical goods and digital goods. The basic research question that we are looking into is how can we make these processes seamless?

You have your physical store where you sell physical goods and you have a digital store. You can sell physical goods, order them and get them delivered. So how can we extend this so you can also sell in your digital store a physical good with a digital good, like a non-fungible token (NFT)? No matter where you want to do business in the metaverse, the ERP system should help you run your processes.

Where do you see the most promising business applications of the metaverse?

Oren: The metaverse is also a lot of hype, and we need to distill the noise. For me, the magic happens more in the practical -- I would even say boring -- side of the metaverse, not the avatars and the UIs.

We're looking more at the Web 3.0 aspects of things. Web 3.0 is kind of what greases the skids of the metaverse -- for example, all the crypto payments.

Of the top three long-term observations we have about the metaverse, the first one is everything regarding digital finance. There is huge demand from the industry. The number of transactions and volume of the new generation of buyers is huge -- how you accept crypto payments and how you sell NFTs.

Let's say an avatar is buying something. You need a profit and loss statement that can take fiat money and crypto money. How you do the balance sheets and audit them may not be sexy, but it's really important.

The metaverse is a combination of real-world technologies and the digital world. How do you do analytics and planning if you have functionalities and workflows and things that are both digital and real? Those are different areas that need a bridge between them.

The third thing is the augmented employee. They're going to have digital representations in the metaverse. We're evaluating how you can connect those representations into the enterprise system from SAP SuccessFactors to all the other data sources you have in the organization.

Some members of the U.S. Congress asked the Environmental Protection Agency to consider regulating bitcoin miners because they use a lot of compute power and water. Are you trying to improve the efficiency of blockchain?

Heinig: We take this into consideration, especially proof-of-work authentication, which is very energy consuming, but it's not what we research. It's more about how would we use blockchain technology, hopefully in a very energy-efficient way, to find customer use cases we can enable with blockchain.

One good example would be a green token for tracing raw materials, using the concept of tokenization and blockchain technologies in cross-company scenarios.

Another idea is cross-company workflows. For example, how you can have different process steps across companies and across systems, store them and make sure they are auditable.

A third example would be self-sovereign identity (SSI). The idea is you store your identity in a personal wallet. Today you have a central register where you store the identities and proof of identity against one central database. A good example is when you use your Google identity to log in at different websites.

The idea is similar to a bitcoin wallet. You would have credentials that are verifiable in your personal wallet, and you can verify yourself against different systems. We would make sure it is auditable so you always know that an identity is real.

The beauty of this concept is that you can work with different systems seamlessly. Maybe in the future it would be a way to have more personalized experiences with systems because it could also store information that a system could use to personalize your experience.
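As a minimal sketch of the wallet-and-verifier pattern Heinig describes (not SAP's implementation), the snippet below signs a credential with a key held only in the user's wallet and lets any relying system verify it with the matching public key. It assumes the third-party Python cryptography package is available; the credential contents are hypothetical.

```python
# Sign a credential with a key that never leaves the user's wallet, then verify
# it anywhere with the matching public key: no central identity database needed.
# Assumes the third-party `cryptography` package is installed.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

wallet_key = Ed25519PrivateKey.generate()            # held in the personal wallet
credential = b'{"subject": "employee-4711", "claim": "procurement-approver"}'
signature = wallet_key.sign(credential)              # presented with the claim

verifier_key = wallet_key.public_key()               # shared with relying systems
try:
    verifier_key.verify(signature, credential)       # raises if claim was altered
    print("credential accepted")
except InvalidSignature:
    print("credential rejected")
```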

What blockchain mechanisms are you looking at for connecting business processes and building trust?

Oren: It started with onboarding. Let's take a supply chain or order-to-cash -- any process with many vendors. Today, when you onboard a new player to a business network, there are a lot of time-consuming manual steps and authorization. The mechanism we use, self-sovereign identity, harnesses the power of a blockchain so that everything is auditable and immutable. You can quickly onboard vendors to the network, supply chain and any process.

With blockchain tokens, you can onboard vendors with ease because everything is documented. In enterprise processes like order-to-cash, any step, like when you deliver something -- let's say you're manufacturing an engine, to use an example from the keynote -- you deliver the piston, someone else provides another component. For anything you send between vendors, you need to have proof-of-delivery documents, which are legal documents. You need to call a lawyer and have a notary service sign the document. It is paper based.

Using the token, you can do self-authorization. You don't need to call those legal services to sign documents. Using the blockchain, everything is immediate, auditable and transparent. It's part of a proof of concept. It's not a product but they talk about it publicly.

We also have this carbon data network project that was also mentioned in the keynote where you have track and trace to see the CO2 emissions of each part by each vendor in each stage of the supply chain.

You are doing some research on composable business processes. What specifically are you looking at?

Heinig: We have 50 years of business knowledge that's basically all packaged in our S/4HANA system. How do we find a way to make it composable, to make it more flexible and make it easier to include third-party solutions?

Integration today is basically happening on a technical level, but we would like to lift it up on the business process level. Today we sell software that's packaged, and you have the business processes inside the software. But I think we should change that so we would sell you business processes and you would not even need to bother with what kind of software you're using, because these would be packaged, orchestrated functions that are already pre-integrated.

Oren: If you have, let's say, a need in order-to-cash for a new type of vendor verification or some compliance, it should be very easy -- like plug and play -- to add services from SAP or not. We want to have this orchestration layer of having two services working together. This is something that requires a lot of technical underpinning to both have the abstraction and orchestration of services to work together.
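A toy illustration of that composability (not SAP's architecture, and the step names are hypothetical): if every process step shares one small signature, a new vendor-verification or compliance check can be plugged into an order-to-cash flow without touching the surrounding code.

```python
# Hypothetical plug-and-play process steps: each step shares one signature,
# so a new vendor-verification or compliance check can be composed into the
# order-to-cash flow without changing the surrounding software.
from typing import Callable

Step = Callable[[dict], dict]

def check_credit(order: dict) -> dict:
    order["credit_ok"] = order["amount"] < 50_000
    return order

def verify_vendor(order: dict) -> dict:      # a third-party step added later
    order["vendor_verified"] = order["vendor"] in {"ACME", "Globex"}
    return order

def create_invoice(order: dict) -> dict:
    order["invoiced"] = order["credit_ok"] and order.get("vendor_verified", True)
    return order

def run(process: list[Step], order: dict) -> dict:
    for step in process:
        order = step(order)
    return order

order_to_cash = [check_credit, verify_vendor, create_invoice]   # swap or reorder freely
print(run(order_to_cash, {"vendor": "ACME", "amount": 12_000}))
```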

Analytics is a major focus of your group. Why is analytics worth looking at?

Heinig: It's basically analytics plus planning, and we see two major differences arising.

One is the role of ERP systems and business networks. Let's take sustainability KPIs. It's not enough to try to analyze and optimize them on a company level. You need to look at the whole supply chain on your business network. This means your analytics capabilities need to go cross-company.

From a planning perspective, if you really want to optimize it, you also need to have these planning capabilities along your whole supply chain. This is where things get really, really complicated.

The second one is around how can we lift up analytics and planning to the next level? Today, it's really manual and static. You look at your dashboards and maybe find some anomalies and try to react.

We're trying to change this so it's possible for the system to automatically detect anomalies in data flows and trigger creation of a dashboard that is personalized to your role in the company. The system says, 'we found something, please have a look at it, and these are your three most appropriate options.'
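A rough illustration of the kind of automatic anomaly detection Heinig describes (not SAP's implementation): a rolling z-score check over a data flow flags points that deviate sharply from recent history, and such a flag could then trigger a personalized dashboard.

```python
import statistics

def detect_anomalies(values, window=30, threshold=3.0):
    """Flag points more than `threshold` standard deviations away from the
    mean of the preceding `window` observations."""
    flagged = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mean = statistics.fmean(history)
        spread = statistics.pstdev(history) or 1e-9   # guard against zero spread
        if abs(values[i] - mean) / spread > threshold:
            flagged.append((i, values[i]))
    return flagged

# A stable order flow with one sudden spike.
flow = [100 + i % 5 for i in range(60)] + [400] + [100 + i % 5 for i in range(20)]
print(detect_anomalies(flow))   # [(60, 400)]
```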

More here:
SAP innovation wing aims to shape future of ERP technology - TechTarget

RIT offers new minor in emerging field of quantum information science and technology | RIT – Rochester Institute of Technology

Rochester Institute of Technology students can soon begin earning a minor in an emerging field that could disrupt the science, technology, engineering, and math (STEM) disciplines. RIT students can now take classes toward a minor in quantum information science and technology.

"This is a hot field garnering a lot of attention, and we are excited to offer students a chance to gain some technical depth in quantum so they can take this knowledge and go the next step with their careers," said Ben Zwickl, associate professor in RIT's School of Physics and Astronomy and advisor for the minor. "It will provide a pathway for students from any STEM major to take two core courses that introduce them to quantum and some of its applications, as well as strategically pick some upper-level courses within or outside their program."

Quantum physics seeks to understand the rules and effects of manipulating the smallest amount of energy at the subatomic level. Scientists and engineers are attempting to harness the strange, unintuitive properties of quantum particles to make advances in computing, cryptography, communications, and many other applications. Developers of the minor said there is a growing industry that will need employees knowledgeable about quantum physics and its applications.

"We're seeing a lot of giant tech companies like IBM, Intel, Microsoft, and Google get involved with quantum, but there's also a lot of venture capital going to startup companies in quantum," said Gregory Howland, assistant professor in the School of Physics and Astronomy. Howland will teach one of the minor's two required courses this fall, Principles and Applications of Quantum Technology. "You have both sides of it really blossoming now."

The minor, much like the field itself, is highly interdisciplinary in nature, with faculty from the College of Science, Kate Gleason College of Engineering, College of Engineering Technology, and Golisano College of Computing and Information Sciences offering classes that count toward the minor. The minor grew out of RIT's Future Photon Initiative and funding from the NSF's Quantum Leap Challenge Institutes program.

Associate Professor Sonia Lopez Alarcon from RIT's Department of Computer Engineering will teach the other required course, Introduction to Quantum Computing and Information Science, starting this spring. She said taking these courses will provide valuable life skills in addition to lessons about cutting-edge science and technology.

"They'll learn more than just the skills from the courses; they'll learn how to get familiar with a topic that's not in the textbooks officially yet," said Lopez Alarcon. "That's a very important skill for industry. Companies want to know they're hiring people with the ability to learn about something that is emerging, especially in science and technology, because it's such a rapidly changing field."

The faculty involved noted that they hope to attract a diverse group of students to enroll in the minor. They said that although the disciplines feeding into quantum have struggled with inclusion related to gender, race, and ethnicity, they will work with affinity groups on campus to try to recruit students to the program and ultimately advance the field's inclusivity.

To learn more about the minor, contact Ben Zwickl.

Continue reading here:
RIT offers new minor in emerging field of quantum information science and technology | RIT - Rochester Institute of Technology

ANL Special Colloquium on The Future of Computing – HPCwire

There are, of course, a myriad of ideas regarding computing's future. At yesterday's Argonne National Laboratory Director's Special Colloquium, "The Future of Computing," guest speaker Sadasivan Shankar did his best to convince the audience that the high energy cost of the current computing paradigm (not just economic cost; we're talking entropy here) is fundamentally undermining computing's progress, such that it will never be able to solve today's biggest challenges.

The broad idea is that the steady abstracting away of informational content from each piece of modern computing's complicated assemblage (chips, architecture, programming) inexorably increases the cumulative energy cost, leading toward a hard ceiling. Leaving aside, for a moment, the decline of Moore's law (just a symptom, really), it is the separation (abstraction) of information from direct computation that's the culprit, argues Shankar. Every added step adds energy cost.

Nature, on the other hand, bakes information into things. Consider, said Shankar, how a string of amino acids folds into its intended 3-D conformation on a tiny energy budget and in a very short time, just by interacting with its environment, and contrast that with the amount of compute required (i.e., energy expended) to accurately predict protein folding from a sequence of amino acids. Shankar, research technology manager at the SLAC National Laboratory and adjunct Stanford professor, argues computing must take a lesson from nature and strive to pack information more tightly into applications and compute infrastructure.

Information theory is a rich field with a history of rich debate. Turning theory into practice has often proven more difficult and messy. Shankar (and his colleagues) have been developing a formal framework for classifying the levels of information content in human-made computation schemes and natural systems in a way that permits direct comparison between the two. The resulting scale has eight classification levels (0-7).

There's a lot to digest in Shankar's talk. Rather than going off the rails here with a garbled explanation, it's worth noting that Argonne has archived the video and Shankar has a far-along paper that's expected in a couple of months. No doubt some of his ideas will stir conversation. Given that Argonne will be home to Aurora, the exascale supercomputer now being built at the lab, it was an appropriate site for a talk on the future of computing.

Before jumping into what the future may hold, here's a quick summary of Shankar's two driving points: 1) Moore's law, or more properly the architecture and semiconductor technology on which it rests, is limited; and 2) the growing absolute energy cost of information processing using traditional (von Neumann) methods is limiting.

A big part of the answer to the question of how computing must progress, suggested Shankar, is to take a page from Feynman's reverberating idea (not just for quantum computing) and emulate the way nature computes, "pack[ing] all of the information needed for the computing into the things themselves," or at least reducing abstraction as much as possible.

Argonne assembled an expert panel to bat Shankars ideas around. The panel included moderator Rick Stevens (associate laboratory director and Argonne distinguished fellow), Salman Habib (director, Argonne computational science division and Argonne distinguished fellow), Yanjing Li (assistant professor, department of computer science, University of Chicago), and Fangfang Xia (computer scientist, data science and learning division, ANL).

Few quibbled with the high energy cost of computing as described by Shankar, but they had a variety of perspectives on moving forward. One of the more intriguing comments came from Xia, an expert in neuromorphic computing. He suggested that using neuromorphic systems to discover new algorithms is a potentially productive approach.

"My answer goes back to the earlier point Sadas and Rick made, which is, if we're throwing away efficiency in the information-power conversion process, why don't we stay with biological systems for a bit longer. There's this interesting field called synthetic biological intelligence. They are trying to do these brain-computer interfaces, not in a Neuralink way, because that's still shrouded in uncertainty. But there is a company and they grow these brain cells in a petri dish. Then they connect this to an Atari Pong game. And you can see that after just 10 minutes, these brain cells self-organize into neural networks, and they can learn to play the game," said Xia.

"Keep in mind, this is 10 minutes in real life; it's not simulation time. It's only dozens of games, just like how we pick up games. So this data efficiency is enormous. What I find particularly fascinating about this is that in this experiment there was no optimization goal. There is no loss function you have to tweak. The system, when connected in this closed-loop fashion, will just learn in an embodied way. That opens so many possibilities: you think about all these dishes, just consuming glucose, you can have them learn latent representations, maybe to be used in digital models."

Li, a computer architecture expert, noted that general purpose computing infrastructure has existed for a long time.

"I remember this is the same architecture of processor design I learned at school, and I still teach the same materials today. For the most part, when we're trying to understand how CPUs work, and even some of the GPUs, those have been around for a long time. I don't think there have been a lot of very revolutionary changes to those architectures. There's a reason for that: we have developed good tool chains, the compiler tool chains; people are educated to understand and program and build those systems. So anytime we want to make a big change, [it has] to be as competitive and as usable as what we know of today," Li said.

On balance, she expects more incremental changes. "I think it's not going to be just a big jump and we'll get there tomorrow. We have to build on small steps, looking at building on existing understanding and also evolving along with the application requirements. I do think that there will be places where we can increase energy efficiency. If we're looking at the memory hierarchy, for example, we know caches help us with performance. But they're also super inefficient from an energy standpoint. This has worked for a long time because traditional applications have good locality, but we are increasingly seeing new applications where [there] may not be as much locality, so there's room for innovation in the memory hierarchy path. For example, we can design different memory reference patterns and infrastructures for applications that do not exhibit locality. That will be one way of making the whole computing system much more efficient."
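Li's point about locality can be demonstrated with a small experiment. In the hedged sketch below (a rough illustration, assuming NumPy is installed; exact timings depend on the machine), summing a large matrix row by row walks memory contiguously, while summing it column by column strides across rows and runs noticeably slower because of cache misses.

```python
import time
import numpy as np

a = np.arange(16_000_000, dtype=np.float64).reshape(4000, 4000)

def timed(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# Row sums walk memory contiguously (good locality, cache-friendly).
contiguous = timed(lambda: [a[i, :].sum() for i in range(a.shape[0])])
# Column sums stride across rows (poor locality, far more cache misses).
strided = timed(lambda: [a[:, j].sum() for j in range(a.shape[1])])
print(f"contiguous: {contiguous:.3f}s  strided: {strided:.3f}s")
```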

Li noted the trend toward specialized computing as another promising approach: "If we use a general-purpose computing system like a CPU, there's overhead that goes into fetching the instructions and decoding them. All of those overheads are not directly solving the problem; they're just what you need to get the generality you need to solve all problems. Increasing specialization toward offloading different specialized tasks would be another interesting perspective for approaching this problem."

There was an interesting exchange between Shankar and Stevens over the large amount of energy consumed in training today's large natural language processing models.

Shankar said, "I'm quoting from the literature on deep neural networks or any of these image recognition networks. They scale quadratically with the number of data points. One of the latest things being hyped in the last few weeks is a trillion-parameter natural language processing [model]. So here are the numbers. To train one of those models, it takes the energy equivalent of four cars being driven a whole year, just to train the model, including the manufacturing cost of the car. That is how much energy is spent in the training on this, so there is a real problem, right?"

Not so fast, countered Stevens. "Consider using the same numbers for how much energy is going into Bitcoin, right? So the estimate is maybe something like 5 percent of global energy production. At least these neural network models are useful. They're not just used for natural language processing. You can use them for distilling knowledge. You can use them for imaging and so forth. I want to shift gears a little bit. Governments around the world and VCs are putting a lot of money into quantum computing, and based on what you were talking about, it's not clear to me that that's actually the right thing we should be doing. We have lots of opportunities for alternative computing models, alternative architectures that could open up spaces that we know in principle can work. We have classical systems that can do this," he said.

Today, there's an army of computational scientists around the world seeking ways to advance computing, some of them focused on the energy aspect of the problem, others focused on other areas such as performance or capacity. It will be interesting to see if the framework and methodology embodied in Shankar's forthcoming paper not only provoke discussion but also provide a concrete methodology for comparing computing system efficiency.

Link to ANL video: https://vimeo.com/event/2081535/17d0367863

Brief Shankar Bio

Sadasivan (Sadas) Shankar is Research Technology Manager at SLAC National Laboratory and Adjunct Professor in Stanford Materials Science and Engineering. He is also an Associate in the Department of Physics in the Harvard Faculty of Arts and Sciences, and was the first Margaret and Will Hearst Visiting Lecturer at Harvard University and the first Distinguished Scientist in Residence at the Harvard Institute of Applied Computational Sciences. He has co-instructed classes related to materials, computing, and sustainability, and was awarded the Harvard University Teaching Excellence Award. He is involved in research in materials, chemistry, and specialized AI methods for complex problems in physical and natural sciences, and new frameworks for studying computing. He is a co-founder and the Chief Scientist at Material Alchemy, a last-mile translational and independent venture for sustainable design of materials.

Dr. Shankar was a Senior Fellow at UCLA-IPAM during a program on Machine Learning and Many-body Physics, an invited speaker at The Camille and Henry Dreyfus Foundation on the application of machine learning to chemistry and materials, a Carnegie Science Foundation panelist for Brain and Computing, a National Academies speaker on Revolutions in Manufacturing through Mathematics, an invitee to the White House event for the Materials Genome, a Visiting Lecturer at the Kavli Institute of Theoretical Physics at UC-SB, and the first Intel Distinguished Lecturer at Caltech and MIT. He has given several colloquia and lectures in universities all over the world. Dr. Shankar also worked in the semiconductor industry in the areas of materials, reliability, processing, and manufacturing, and is a co-inventor on over twenty patent filings. His work was also featured in the journal Science and as a TED talk.

Go here to read the rest:
ANL Special Colloquium on The Future of Computing - HPCwire