Artificial Intelligence: Advancing Applications in the CPI

A convergence of digital technologies and data science means that industrial AI is gaining ground and producing results for CPI users

As data accessibility and analysis capabilities have rapidly advanced in recent years, new digital platforms driven by artificial intelligence (AI) and machine learning (ML) are increasingly finding practical applications in industry.

"Data are so readily available now. Several years ago, we didn't have the manipulation capability, the broad platform or cloud capacity to really work with large volumes of data. We've got that now, so that has been huge in making AI more practical," says Paige Morse, industry marketing director for chemicals at Aspen Technology, Inc. (Bedford, Mass.; http://www.aspentech.com). While AI and ML have been part of the digitalization discussion for many years, these technologies have not seen a great deal of practical application in the chemical process industries (CPI) until relatively recently, says Don Mack, global alliance manager at Siemens Industry, Inc. (Alpharetta, Ga.; http://www.industry.usa.siemens.com). "In order for AI to work correctly, it needs data. Control systems and historians in chemical plants have a lot of data available, but in many cases, those data have just been sitting dormant, not really being put to good use. However, new digitalization tools enable us to address some use cases for AI that until recently just weren't possible."

This convergence of technologies, from smart sensors to high-performance computing and cloud storage, along with advances in data science, deep learning and access to free and open-source software, has enabled the field of industrial AI to move beyond pure research to practical applications with business benefits, says Samvith Rao, chemical and petroleum industry manager at MathWorks (Natick, Mass.; http://www.mathworks.com). Such business benefits are wide-ranging, spanning realms from maintenance to materials science to emerging applications like supply-chain logistics and augmented reality (AR). MathWorks recently collaborated with a Shell petroleum refinery to use AI to automatically incorporate tagged equipment information into operators' AR headsets. All equipment in the refinery is tagged with a unique code, and Shell wished to extract these data from images acquired by cameras in the field. "First, image recognition and computer-vision algorithms were applied, followed by deep-learning models for object detection to perform optical character recognition. Ultimately, equipment metadata were projected onto AR headsets of operators in the field," explains Rao.
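
To make the tag-reading pipeline concrete, the following minimal Python sketch (not the MathWorks/Shell implementation) chains region detection and OCR: OpenCV thresholding stands in for the deep-learning object detector described above, and Tesseract performs the character recognition. File names, size cutoffs and OCR settings are illustrative assumptions.

```python
# Minimal sketch: locate candidate tag regions in a field image, then run OCR
# on each crop. Assumes OpenCV (cv2) and pytesseract are installed and the
# Tesseract binary is on the PATH.
import cv2
import pytesseract

def read_equipment_tags(image_path: str) -> list[str]:
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Simple thresholding stands in for the deep-learning object detector
    # described in the article, which would propose tag bounding boxes.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    tags = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 500:          # skip tiny regions unlikely to be tag plates
            continue
        crop = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(crop, config="--psm 7").strip()
        if text:
            tags.append(text)
    return tags

# Recognized tag strings could then be matched against an equipment database
# and the associated metadata streamed to an operator's AR headset.
```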

Another major emerging area for industrial AI is supply-chain management. "The application of AI in supply chains lets us look at different scenarios and consider alternatives. Feedback from data about what's actually occurred, including any surprising events, can be put into the model to develop better scenario options that appropriately reflect reality," says Morse.

With the wide variety of end-use applications and ever-expanding platform capabilities, determining the most streamlined way to adopt an AI-based platform into an existing process can seem daunting, but Colin Parris, senior vice president and chief technology officer at GE Digital (San Ramon, Calif.; http://www.ge.com/digital), classifies industrial AI into three discrete pillars that build upon each other to deliver value: early warning of problems, continuous prediction of problems and dynamic optimization. Data are, of course, paramount in realizing all three pillars. "For early warning, I have sensors to give the state of the plant, showing the anomalies when that state changes. Continuous prediction looks at condition-based data to avoid unplanned shutdowns. Here, I want to know the condition of the ball bearings, the corrosion in the pipes, understand the creep in the machines in order to then determine the best plan and not default to time-based maintenance, so I need a lot of data. And if I want to do optimization, I need even more data," says Parris. All of these data can be brought together in a digital twin, which Parris defines as a living, learning model that is continuously updated to give an exact view of an asset (Figure 1). He emphasizes that model complexity is not a given. "I may be able to use a surrogate model, which is a slimmed-down version that doesn't need to know all the process nuances. I may only need to know about certain critical parts. The model will constantly use data and update itself to live."
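
A surrogate model of the kind Parris describes can be as simple as a regression fitted to a handful of high-fidelity simulation runs. The sketch below (illustrative only, not GE's digital-twin code) uses a Gaussian-process regressor; the feature names, values and units are hypothetical.

```python
# Minimal surrogate-model sketch: fit a cheap data-driven stand-in for an
# expensive high-fidelity simulation, so "what if" questions can be answered
# quickly along with an uncertainty estimate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# A handful of expensive simulation runs: inputs = (feed rate, reactor temp [K]),
# output = product yield fraction (all values made up).
X_sim = np.array([[1.0, 620.0], [1.2, 630.0], [0.8, 610.0], [1.1, 625.0], [0.9, 615.0]])
y_sim = np.array([0.82, 0.86, 0.78, 0.85, 0.80])

# Standardize inputs so the kernel length scale is meaningful across features.
X_mean, X_scale = X_sim.mean(axis=0), X_sim.std(axis=0)
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit((X_sim - X_mean) / X_scale, y_sim)

# Query the surrogate at a new operating point; the standard deviation flags
# regions where more simulation runs would be needed.
x_query = (np.array([[1.05, 622.0]]) - X_mean) / X_scale
y_pred, y_std = gp.predict(x_query, return_std=True)
print(f"predicted yield: {y_pred[0]:.3f} +/- {y_std[0]:.3f}")
```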

FIGURE 1. A robust digital twin may look at an entire plant, or might be a slimmed-down model that considers only certain critical parts. The model should use AI to continuously update itself

GE Digital worked with Wacker Chemie AG (Munich, Germany; http://www.wacker.com) to apply a holistic AI hierarchy for asset-performance management (APM) at a polysilicon production plant in Tennessee. There are roughly 1,500 pressure vessels at the site, and maintenance on them takes six weeks, resulting in a significant financial burden due to lost production time. "Regulatory compliance meant that these vessels were supposed to be maintained every two years. But, because we were able to actually capture the digital twin and show the current state of the asset, we helped the plant achieve API 580/581 certification, which says if a plant can show a certain level of condition-based capability, they can extend the maintenance interval anywhere from 5 to 10 years based upon the condition," explains Parris. With the early-warning and continuous-prediction pieces in place, the plant was experiencing improved availability and less downtime, and was able to begin looking at dynamic optimization. For Wacker, this included investigating specific product characteristics and intelligently adjusting the processes for higher-margin products. "That's the way it tends to work: you go in a stepwise fashion to ultimately get to optimization, but it's really hard to get to the optimization piece unless you first really understand the asset and have a digital twin that you know is learning as you make changes," adds Parris.

Furthermore, when implementing an advanced AI solution in a new or existing process, users must consider how the platform will be used and who will actually be using it. In the past, black-box AI solutions required users with some expertise in data science or advanced statistics, which often resulted in organizational data silos, says Allison Buenemann, industry principal for chemicals at Seeq Corp. (Seattle, Wash.; http://www.seeq.com). Now, the industry has more self-service offerings in the advanced analytics and ML space, meaning that users in many different roles can access the most relevant data and insights for their own unique job needs. "For instance, front-line process experts can hit the ground running, solving problems using complex algorithms from an unintimidating low- or no-code application experience. Executives and management teams can expect an empowered workforce solving high-value business problems with rapid time to insight," adds Buenemann. This democratization of data analytics and ML across organizations means that all stakeholders can work together to drive business value. "Users must be able to document the thought process behind an analysis, and they also must be able to structure analysis results for easy consumption," she explains.

The massive growth in sensor volume and associated data availability have certainly helped to promote the applicability of AI in industrial environments, but computing power and network connectivity are also critical pieces of the puzzle. Yokogawa Electric Corp. (Tokyo, Japan; http://www.yokogawa.com) recently announced a proof-of-concept project to utilize fifth-generation (5G) mobile communications for AI-enabled process controllers. The project will focus on using 5G to remotely control the level in a network of water tanks. One of the major benefits of 5G connectivity in autonomous, realtime plant control, according to Hirotsugu Gotou, manager of Yokogawa's products control center, is its low latency, which means that the network can process a large volume of data with minimal delay. Yokogawa's cloud-based AI controller system employs reinforcement-learning technology to determine the optimal operation parameters for a particular control loop.

Understanding reinforcement-learning schemes, which build upon model-predictive control, is crucial for autonomous process control. "Reinforcement learning is a type of machine learning in which a computer learns to perform a task through repeated trial-and-error interactions with a dynamic environment," explains MathWorks' Samvith Rao. Such a platform develops a control policy in real time by interacting with the process, enabling the computer to make a series of decisions that maximize a reward metric for the task without human intervention and without being explicitly programmed to achieve the task. Robust mechanisms for safe operation of a fully trained model, and indeed, for safe operation of the plant, are high priorities for further investigation, he emphasizes.
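
The trial-and-error loop Rao describes can be illustrated with a toy example. The Python sketch below is not Yokogawa's controller; it is a tabular Q-learning agent on a heavily simplified single-tank level-control task, with made-up tank dynamics, setpoint and hyperparameters.

```python
# Tabular Q-learning on a simplified tank-level task: the agent picks an inlet
# valve opening, observes the resulting level, and is rewarded for staying
# close to the setpoint.
import numpy as np

N_LEVELS, ACTIONS = 21, np.array([0.0, 0.5, 1.0])   # discretized level, valve openings
SETPOINT, OUTFLOW, DT = 0.6, 0.4, 0.1                # hypothetical process constants
rng = np.random.default_rng(0)
Q = np.zeros((N_LEVELS, len(ACTIONS)))

def discretize(level):
    return int(np.clip(level, 0.0, 1.0) * (N_LEVELS - 1))

alpha, gamma, epsilon = 0.1, 0.95, 0.1               # learning rate, discount, exploration
for episode in range(500):
    level = rng.uniform(0.2, 0.9)                    # random initial tank level
    for _ in range(200):
        s = discretize(level)
        a = rng.integers(len(ACTIONS)) if rng.random() < epsilon else int(Q[s].argmax())
        inflow = ACTIONS[a]                          # action sets the inlet valve
        level = float(np.clip(level + (inflow - OUTFLOW) * DT, 0.0, 1.0))
        reward = -abs(level - SETPOINT)              # penalize deviation from setpoint
        s_next = discretize(level)
        Q[s, a] += alpha * (reward + gamma * Q[s_next].max() - Q[s, a])

# After training, the greedy policy maps each measured level to a valve action.
policy = Q.argmax(axis=1)
```

A production controller would of course act on a real or simulated plant rather than this toy model, and would wrap the learned policy in the safety mechanisms Rao mentions.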

In Yokogawa's reinforcement-learning proof-of-concept, the AI controls tank level and continuously receives sensor data on flowrate and level. "Based on these data, the AI will learn about the operation and will repeat the process to derive the optimal operation parameters," explains Gotou. Yokogawa previously completed a demonstration project using its proprietary AI to control water level in a three-tank system (Figure 2), which showed that after 30 iterations of learning (taking less than 4 h), the AI agent was able to learn from its past decisions to determine the optimal control methods. Now, the company will work with mobile network provider NTT Docomo to construct a demonstration facility for cloud-based remote control of water-tank level and evaluate the communication performance of the 5G network for realtime, autonomous process control. 5G networks are not yet widely adopted in industrial settings, but other projects are also exploring these technologies for IIoT applications. In April, GE Research announced an initiative to test Verizon's 5G platform in a variety of industrial applications, including realtime control of wind farms. And last year, Accenture and AT&T began a 5G proof-of-concept project to develop 5G use cases for IIoT applications at a petroleum refinery in Louisiana operated by Phillips 66.

FIGURE 2. This demonstration unit includes a three-tank network in which an autonomous, reinforcement-learning-based scheme monitors and controls water level

Another important factor is the collaborative environment that has been fostered through open-source AI platforms, explains Gino Hernandez, head of global digital business for ABB Energy Industries (Zurich, Switzerland; http://www.abb.com). "As things become more open and more distributed, I think it's going to enable users to apply the technologies in a more meaningful way. The more people talk about the different models and their successes using open-source-type AI models, and being able to have platforms where they can import and run those models, is going to be key," he notes. In the past, vendors kept their platforms closed, which limited users to developing models only for a specific digital architecture. Now, says Hernandez, more AI platforms enable users to import models, including their own proprietary algorithms, from various sources to develop a more robust analytics program. "Some users have rich domain expertise and want to build their own platforms. They are looking for environments where they not only have the ability to potentially use vendor-developed algorithms, but also use their own algorithms and have a sandbox in which they can import their own models and begin to integrate them," he explains.

As with any digital technology, cybersecurity and protecting proprietary intellectual property (IP) are paramount, but Hernandez also brings up the idea of sharable IP as a major area of opportunity for industrial AI. "We see a lot of open sharing with users looking at different models related to machinery health in the open-source space. There are definite advantages for companies being open to sharing machinery-health data in multi-tenant cloud environments, because it helps us as an industry to better capture, understand and very quickly identify when there are systemic problems within pumps, sensors, PLCs or other elements," continues Hernandez. He also believes that the industry is becoming more comfortable with the ability to securely lock certain components of proprietary data within a platform, while still being able to share other selections of more generic data within a cloud environment. Facilitating and expediting this collaborative conversation will be key in accelerating the adoption and evolution of predictive machinery-health monitoring, which is among the more mature use cases for industrial AI, notes Hernandez.

One of the most prominent uses for industrial AI continues to be predictive maintenance. "Everybody's looking at how to get more throughput, and the easiest way to do that is to reduce your downtime with predictive maintenance," explains Clayton French, product owner, Digital Enterprise Labs, at Siemens Industry. Siemens has worked with Sinopec Group's (Beijing, China; http://www.sinopecgroup.com) Qingdao Refinery using AI to investigate critical rotating-equipment components and predict potential causes of downtime. "We took six months of data and did a feasibility study, which found that eighteen hours before compressor failure, they would have been notified that the asset was having a problem, potentially saving around $300,000," says French. In another project, French notes that Siemens conducted a feasibility study in which AI was able to detect an equipment failure almost a month in advance. Such models integrate correlation analysis, pattern recognition, health monitoring and risk assessment, among other techniques.

Furthermore, when an anomaly is detected and a countermeasure is initiated in the plant to fix the problem, the AI can record the instance in its database. Then, the next time it senses that a similar failure is about to occur, the AI will recommend a similar countermeasure, which can reduce maintenance time in the long term. "This shows that the AI is learning and taking in all of these inputs. It continues to get better after its initial implementation," adds French. He emphasizes, however, that users should practice prudence in applying AI: "Not everything turns out to be worthwhile; in some cases, the AI can only predict something a few minutes before it happens, so you can't do anything actionable. Our studies point out what is actionable so that users can target the most effective things to monitor."
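
The "remembered countermeasure" idea can be reduced to a case-based lookup. The sketch below is not Siemens' implementation; it simply stores feature signatures of past anomalies alongside the fixes that worked, then recommends the fix attached to the most similar past case. Sensor names, values and countermeasure labels are hypothetical.

```python
# Nearest-neighbor countermeasure recommendation over a library of past cases.
import numpy as np

# Hypothetical case library: [vibration, bearing temp, motor current] at the
# time of the anomaly, plus the countermeasure that resolved it.
case_features = np.array([
    [7.2, 85.0, 32.0],
    [2.1, 60.0, 48.0],
    [6.8, 88.0, 30.0],
])
case_actions = ["replace bearing", "check pump impeller", "replace bearing"]

def recommend(new_signature, features, actions):
    # Normalize so no single sensor dominates the distance metric.
    scale = features.std(axis=0) + 1e-9
    d = np.linalg.norm((features - new_signature) / scale, axis=1)
    return actions[int(d.argmin())], float(d.min())

action, distance = recommend(np.array([7.0, 86.0, 31.0]), case_features, case_actions)
print(f"recommended countermeasure: {action} (distance {distance:.2f})")
```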

TrendMiner N.V. (Hasselt, Belgium; http://www.trendminer.com) recently introduced its custom-built Anomaly Detection Model, which uses ML optimized for learning normal operating conditions and detecting deviations in new incoming data. This ultimately helps to avoid sub-optimal process operation or downtime by allowing users to react at the first signs of an anomaly, ahead of productivity losses or equipment malfunctions, explains TrendMiner director of products Nick Van Damme. The ML model interfaces with TrendMiner's self-service time-series analytics platform by collecting sensor readings over a user-defined historical timeframe for the process or equipment being analyzed. Process and asset experts further prepare the data by leveraging built-in search and contextualization capabilities to filter out irrelevant data, confirming that the view is an accurate representation of normal, desired operating conditions. This prepared view is then used to train the Anomaly Detection Model to learn the desired process conditions by considering the unique relationships between the sensors, which allows detection of anomalies in other historical data and, more importantly, in new incoming data for a process. "The trained model will return whether a new datapoint is an outlier or not based on a given threshold, and return an anomaly score. The higher the anomaly score, the more likely that the datapoint is an outlier," adds Van Damme. In a batch-process use case, the model was trained to recognize a good batch profile and use that as a benchmark to alert users to deviations. A dashboard (Figure 3) provides visualizations of the operating zones learned by the model, with the latest process data points overlaid (shown in orange in Figure 3). Such a visualization enables users to quickly evaluate current process conditions versus normal operating behavior.
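
The general workflow Van Damme describes (train on a window of confirmed normal operation, then score new points against a threshold) can be sketched with an off-the-shelf algorithm. The example below uses scikit-learn's IsolationForest and synthetic sensor data; it is a generic illustration, not TrendMiner's proprietary model, and the threshold value is an assumption that would be tuned on held-out normal data in practice.

```python
# Generic anomaly-detection sketch: train on sensor data from a period
# confirmed as normal operation, then score new incoming points; higher
# scores indicate more likely outliers.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical "normal" training window: correlated temperature and pressure.
temp = rng.normal(120.0, 2.0, 1000)
pres = 0.5 * temp + rng.normal(0.0, 1.0, 1000)
X_normal = np.column_stack([temp, pres])

model = IsolationForest(contamination="auto", random_state=0).fit(X_normal)

# New incoming points: one consistent with training, one clearly off.
X_new = np.array([[121.0, 61.0], [140.0, 45.0]])
scores = -model.score_samples(X_new)      # flip sign so higher = more anomalous
THRESHOLD = 0.55                          # illustrative; tune on held-out normal data
for x, s in zip(X_new, scores):
    flag = "ANOMALY" if s > THRESHOLD else "normal"
    print(x, f"score={s:.2f}", flag)
```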

FIGURE 3. A visualization engine driven by ML develops a dashboard where current incoming data can be quickly benchmarked against established operating conditions

Another maintenance example, from AspenTech, involves fouling in ethylene furnaces. "Typically, an operator will do periodic cleanouts of coke buildup on the furnaces, but what would be better is to get a better indication of when you actually need to do a cleanout, versus just scheduling it. So what companies are doing is taking the relevant furnace operating data and being able to predict fouling to prevent unplanned downtime. Users can be sure they are cleaning out the furnace before a real operational issue occurs," notes Morse.

On the optimization side, she highlights a case where AspenTech helped a polyethylene producer to streamline transitions between product grades to maximize production value. As catalysts are changed out to accommodate different production slates, there is a transition period during which the resulting product is an off-grade material. The customer applied an AI hybrid-model concept to look at how the reactors were actually performing. This allowed it to decrease the amount of transition material in terms of volume throughput, so it wasn't wasting feedstock making a product it didn't want, and, by narrowing the transition time, to spend more reactor time making the preferred product instead of transition-grade material.

Rockwell Automation, Inc. (Milwaukee, Wis.; http://www.rockwell.com) has also done extensive work using AI to optimize catalyst yield and product selectivity in traditional polymerization processes. "We started using pure neural networks to try to learn polymer reaction coefficients. We lean more and more into the actual reaction kinetics and the material balance around the reactors, trying to control the polymer chain length in the reactor. This is how you can get a specific property, such as melt flow or a melt index, on a polymer," says Pal Roach, oil and gas industry consultant at Rockwell Automation. In one example involving Chevron Phillips, an AI-driven advanced control model was applied to cut transition times between polymer grades by four hours. This change also led to a 50% reduction in product variability. In another case involving a distillation unit for long-chain alcohols, an AI-driven scheme applied to a nonlinear controller helped to cut energy consumption by around 35% and significantly reduce product-quality variability, as well as associated waste. "There are going to be more and more of these types of AI applications coming as the industry refocuses and transitions into greener energy and more environmental safety and governance consciousness," predicts Roach.

Beyond predictive maintenance, companies are also starting to use AI to translate business targets (such as financial, quality or environmental goals) into process-improvement actions. "Maintenance is key, because when you're shut down, you're not making any product and you're losing money. So, once you address that problem, the next question asks how can we run even better? Then you can start looking at process optimization," says Mack. The main challenge in optimization, especially for complex production lines, is the sheer number of correlated process variables with which operators are confronted, combined with high numbers of DCS alarms that cannot all be evaluated. This issue is addressed by business-impact-driven anomaly detection. In the past, when operators would adjust setpoints for process variables, the adjustments were only loosely tied to business objectives, such as product quality. Now, process data can be aligned with specific business targets using AI. "Anomalies we might detect in the data could be affecting quality or throughput. Then, using AI, users can categorize and rank these anomalies and their impact on business goals. The end result is that the process control system, as it sees these issues occurring, will prioritize them based on the business objectives of the company," he says, adding that such an AI engine could similarly be tied to a company's sustainability goals.
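
The ranking step can be pictured as attaching a rough economic score to each detected anomaly and sorting on it. The sketch below is an illustration of that idea, not Siemens' engine; the tags, loss estimates and cost figures are all hypothetical.

```python
# Business-impact-driven ranking: each detected anomaly gets a rough impact
# score so operators see the costliest issues first rather than a flat list
# of DCS alarms.
import pandas as pd

# Hypothetical anomalies with estimated consequences if left unaddressed.
anomalies = pd.DataFrame({
    "tag": ["TI-204 drift", "FI-101 oscillation", "PI-330 spike"],
    "est_throughput_loss_tph": [0.0, 1.5, 0.3],   # tons per hour
    "est_offspec_prob": [0.40, 0.05, 0.10],       # chance of producing off-spec product
})

MARGIN_PER_TON = 120.0    # assumed contribution margin, $/ton
OFFSPEC_COST = 5000.0     # assumed cost of one off-spec batch, $

anomalies["impact_score"] = (
    anomalies["est_throughput_loss_tph"] * MARGIN_PER_TON   # lost margin
    + anomalies["est_offspec_prob"] * OFFSPEC_COST          # expected quality cost
)
print(anomalies.sort_values("impact_score", ascending=False))
```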

As chemical manufacturers increasingly look toward more sustainable feedstock options, bio-based processes, such as fermentation, are reaching larger scales and necessitating more precise and predictive control. "We have used AI on corn-to-bioethanol fermentation optimization and seen yield increases from 2 to 5%, so that means you're getting more alcohol from the same amount of corn. And we've also seen overall production capacity increases as high as 20%," says Michael Tay, advanced analytics product manager at Rockwell Automation. To build the AI model for fermentation, Tay explains that Rockwell began with classic biofermentation modeling, tuning the Michaelis-Menten equations, which predict the enzymatic rate of reaction, as the fundamental architecture. This enabled realtime control of the temperature profile in the fermenter. "You try to keep temperatures high, but then as alcohol concentration increases, you have to cool the reactor more so that the yeast gets more life out of it, because as the alcohol concentration goes up, the yeast performance goes down. The AI is showing dynamic recognition and adaptation of the fermentation profiles, so that's sort of the key to those yield improvements. But you're also getting more alcohol out of every batch," he adds. In addition to temperature-driven optimization, Rockwell has also used AI to improve the enzyme-dosing step in biofermentation processes. "If you have this causally correct model that is based on biological fundamentals, driven by data and AI, then you can optimize your batch yield to ultimately get more out of the yeast, which is your catalyst in the reactor," says Tay. AspenTech is also working on developing accurate AI and simulation models for bio-based processes like fermentation, as well as looking at advanced chemical-recycling models. "We're tuning those processes to be more efficient, and we're approaching predictability, but the feedstock variance will be something that we will be working on constantly," adds Paige Morse.
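
"Tuning the Michaelis-Menten equations" amounts to fitting the kinetic parameters of v = Vmax*S/(Km + S) to observed rates. The sketch below shows that fitting step in Python with synthetic data; it is illustrative only, not Rockwell's fermentation model, and the substrate concentrations and rates are made up.

```python
# Fit Michaelis-Menten parameters (Vmax, Km) to observed reaction-rate data,
# giving a fundamentals-based kinetic core that a hybrid AI model can build on.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, Vmax, Km):
    # Enzymatic rate as a function of substrate concentration S.
    return Vmax * S / (Km + S)

# Hypothetical measurements: substrate concentration (g/L) vs. observed rate.
S_obs = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
v_obs = np.array([0.9, 1.5, 2.3, 3.2, 3.7, 4.0])

(Vmax_fit, Km_fit), _ = curve_fit(michaelis_menten, S_obs, v_obs, p0=[4.0, 2.0])
print(f"Vmax = {Vmax_fit:.2f}, Km = {Km_fit:.2f}")

# The tuned kinetics would then feed the dynamic temperature-profile
# optimization described above (e.g., cooling more as ethanol accumulates).
```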

While AI and other digital tools have historically targeted operational and financial objectives, many chemical companies are increasingly looking at process metrics that specifically consider environmental initiatives, such as reducing emissions and waste. Seeq worked with a CPI company to deploy an automated model of a sulfur oxides (SOx) detector's behavior during the time periods when its range was exceeded. Typically, accurate emissions reporting becomes more challenging when vent-stack analyzers peg out at their limits, necessitating complex, manual calculations and modeling. Seeq's model development required event identification to isolate the data set for the time periods before, during and after a detector range exceedance occurred. "Regression models were fit to the data before and after the exceedance, and then extrapolated forward and backward to generate a continuous modeled signal, which is used to calculate the maximum concentration of pollutant," says Buenemann. The solution also compiled relevant visualizations into a single auto-updating report displaying data for the most recent exceedance event alongside visualizations tracking year-to-date progression toward permit limits, which enabled the company to make proactive process adjustments based on the SOx emissions trajectory.
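
The regression-and-extrapolation step can be sketched in a few lines. The example below is a hedged illustration of the approach Buenemann describes, not Seeq's implementation: it fits simple lines to synthetic analyzer readings before and after the detector pegs at its range limit, extrapolates both into the gap, and estimates the peak from the reconstructed signal. All numbers are invented.

```python
# Reconstruct an analyzer signal across a range exceedance by extrapolating
# regressions fitted to the valid data on either side of the pegged window.
import numpy as np

t = np.arange(0, 60)                       # minutes
LIMIT = 100.0                              # analyzer range limit (hypothetical)
before = (t < 20)
after = (t > 40)
pegged = ~(before | after)                 # period when the detector read flat at LIMIT

signal = np.where(pegged, LIMIT, np.nan)   # measured values in the gap are unusable
signal[before] = 60.0 + 2.0 * t[before]    # synthetic pre-exceedance readings
signal[after] = 200.0 - 2.6 * t[after]     # synthetic post-exceedance readings

# Fit a line to each valid segment, then extrapolate both into the pegged window.
b = np.polyfit(t[before], signal[before], 1)
a = np.polyfit(t[after], signal[after], 1)
rise = np.polyval(b, t[pegged])
fall = np.polyval(a, t[pegged])

# Reconstructed signal: measured where valid; where pegged, take the lower of
# the two extrapolations so the crossing point approximates the peak.
modeled = signal.copy()
modeled[pegged] = np.minimum(rise, fall)
print(f"estimated peak SOx concentration: {modeled[pegged].max():.0f} (range limit {LIMIT:.0f})")
```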

AI plays a major role in reducing waste by helping to ensure product quality, explains MathWorks' Rao, citing the example of Japanese film manufacturer Dexerials, which deployed an AI program for realtime detection of product defects. A deep-learning-based machine-vision system extracts the properties of product defects, such as color, shape and texture, from images, and classifies them according to defect type. The system was put in place to improve upon manual inspection, which was an error-prone process with low accuracy. The AI system not only improved accuracy, but also greatly reduced product and feedstock waste and the frequency of production stoppages.
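
For readers unfamiliar with how such a classifier is built, the sketch below shows the bare bones of a deep-learning defect classifier in PyTorch. It is not Dexerials' system: the class labels, image size and single dummy training step are illustrative assumptions, and a production system would add real labeled images, augmentation and validation.

```python
# Minimal defect-classification sketch: a small convolutional network that maps
# grayscale defect-image crops to one of a few defect classes.
import torch
import torch.nn as nn

NUM_CLASSES = 4   # e.g., scratch, particle, discoloration, no-defect (hypothetical)

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 64), nn.ReLU(),   # 64x64 input -> 16x16 after two pools
    nn.Linear(64, NUM_CLASSES),
)

# One dummy training step on a batch of 64x64 crops to show the loop shape.
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
optimizer.step()
print(f"training loss on dummy batch: {loss.item():.3f}")
```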

Beyond improving day-to-day industrial operations, AI and ML technologies are also enabling advances in the synthesis of new materials and product formulations. In developing ML-powered digital technologies that encompass the chemical knowledge for synthetic processes and materials formulation, IBM (Armonk, N.Y.; http://www.ibm.com) took inspiration from sources very far removed from chemistry: image processing and language translation. "We learned that some of the technologies that have been developed for image processing were actually applicable in the context of materials formulation, so we took those concepts and brought them into the chemical space, allowing us to reduce the dimensionality of chemical problems," explains Teo Laino, distinguished researcher at IBM Research Europe. IBM is partnering with Evonik Industries AG (Essen, Germany; http://www.evonik.com) to apply such a scheme to aid in optimizing polymer formulations. "Quite often, when companies are working on formulating materials, such as polymers, the amount of data is relatively sparse compared to the dimensionality of the problem. The use of technologies that reduce the size of the problem means that there are fewer degrees of freedom, which are easier to match with available data. This is optimal, because users can make good use of data and can really see sensible benefits," he adds. Typically, optimizing a material to meet specific property requirements could take months, but IBM's platform for this inverse-design process can significantly decrease that time, he says.

In designing a cognitive solution for chemical synthesis, IBM trained digital architectures that are normally used for translating between languages to create a digital solution that can optimize synthetic routes for molecules (Figure 4). "By starting with technologies typically used for natural language processing, we recast the problem of predicting the chemical reactivity between molecules as a translation problem between different languages," explains Laino. Notably, the ML scheme has been validated in a large number of test cases since IBM first made the platform (IBM RXN for Chemistry; rxn.res.ibm.com) freely available online in 2018. "This is one of the most complicated tasks in the materials industry today, and it is where ML can help to greatly speed up the design process. You can reduce the number of tests and trials and go more directly to the domain of the material formulation that is going to satisfy your requirements," says Laino.

FIGURE 4. AI can be used to quickly determine synthetic routes for new molecules

"We built a community of more than 25,000 users that have been using the models almost 4 million times. You can use our digital models for driving and programming commercial automation hardware, and you can run chemical synthesis from home wherever you have a web browser. It's a fantastic way of providing a work-from-home environment, even for experimental chemists," says Laino. IBM calls this technology IBM RoboRXN (Figure 5) and is using its ML synthesis capabilities for in-house research related to designing novel materials for atmospheric carbon-capture applications. IBM's ML platform has also been adopted by Diamond Light Source (Oxfordshire, U.K.; http://www.diamond.ac.uk), the U.K.'s national synchrotron science facility, to operate its fully autonomous chemical laboratory. "They are coupling their own automated lab hardware with IBM's predictive platform to drive their chemical-synthesis experiments," adds Laino.

Some of IBM's other notable projects include its ten-year relationship with the Cleveland Clinic for deployment of AI for advancing life-sciences research and drug-design chemistry, and a collaboration with Mitsui Chemicals, Inc. (Tokyo; http://www.mitsuichemicals.com) to develop an AI-empowered blockchain platform promoting plastics circularity and materials traceability.

FIGURE 5. Open-source AI platforms enable experiments to be run remotely, bringing a new level of autonomous operations into chemistry laboratories

AI and ML are also proving to be effective technologies for accelerating the product-development cycle. Dow Polyurethanes (Midland, Mich.; http://www.dow.com/polyurethanes) and Microsoft collaborated to create the Predictive Intelligence platform for product formulation and development. The platform harnesses materials-science data captured from decades of formulations and experimental trials and applies AI and ML to rapidly develop optimal product formulations for customers, explains Alan Robinson, North America commercial vice president, Dow Polyurethanes. "Predictive Intelligence allows us to not only discover the chemistry and what a formulation needs to look like, but now we can also look at how we simulate trials. In the past, we'd be running numerous trials that take place over a period as long as 18 months, and now we can do that with a couple clicks of a button," says Robinson.

The demands of end-use polyurethane applications mean that finding the best chemistry for a particular product can be quite complex. "In a typical year we're releasing hundreds of new products, and in a typical formulation, there might be a dozen components that are individually mixed at different levels in different orders. We also have to think of all of the different tooling and equipment that the materials will be subject to, as well as the kinetics that have to be played out. So, the challenge was how to take all the kinetics, rheology and formulation data and create a system that could move us forward," explains Dave Parrillo, vice president of R&D, Dow Industrial Intermediates & Infrastructure.

To build such a complex platform, Dow relied on theory-based neural networks that incorporate critical correlations for kinetics and rheology. "In a typical neural network, you feed it lots of data, which it learns from, and behind the scenes, it's tuning its knobs and weighing different influences. We can now influence those knobs with theoretical correlations so that the system not only learns, but gets smarter over time, and also starts to explore spaces where we might not have as much data. It folds theoretical, empirical, semi-empirical and experimental information into a single tool," says Parrillo. One of the first major applications that Dow is trialing for the platform is polyurethane mattresses, with multiple applications to follow in 2022.
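
One common way to realize the theory-informed idea Parrillo describes is a hybrid model in which a known correlation supplies the backbone prediction and a small neural network learns only the residual, so behavior stays sensible where data are sparse. The PyTorch sketch below is a hedged illustration of that pattern, not Dow's Predictive Intelligence code; the power-law correlation, feature names and training data are all invented.

```python
# Hybrid "theory + residual" model: trainable theoretical correlation plus a
# small neural correction fitted to data.
import torch
import torch.nn as nn

class HybridModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Trainable parameters of a hypothetical power-law correlation relating
        # formulation inputs to a foam property.
        self.log_k = nn.Parameter(torch.zeros(1))
        self.exponent = nn.Parameter(torch.ones(1))
        # Small data-driven correction on top of the theory term.
        self.residual = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, x):
        # x[:, :1] = catalyst level, x[:, 1:] = isocyanate index (made-up features)
        theory = torch.exp(self.log_k) * x[:, :1] ** self.exponent
        return theory + self.residual(x)

model = HybridModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
x = torch.rand(32, 2) + 0.5                    # dummy formulation inputs
y = 2.0 * x[:, :1] ** 1.5 + 0.1 * x[:, 1:]     # dummy measured property
for _ in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```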

Customers might be looking at a number of parameter constraints, from hardness and density to airflow and viscoelastic recovery. "We've actually asked the AI engine to give us a series of formulations and then benchmark those formulations in the laboratory, and the accuracy is extraordinarily high," emphasizes Parrillo. The Predictive Intelligence platform will be available to customers beginning later this year.

FIGURE 6. AI can be used to rapidly and accurately validate pharmaceutical products for defects, which reduces manual inspection requirements

Once a product formulation is developed and manufacturing has begun, inspection and validation are key. Stevanato Group (Padua, Italy; http://www.stevanatogroup.com) recently launched an AI platform focused on visual inspection of biopharmaceutical products, looking at both particle and cosmetic defects (Figure 6). "AI can improve overall inspection performance in terms of detection rate and false-rejection rate. AI can help to reduce false rejects and costly interventions to parameterize the machine during production," explains Raffaele Pace, engineering vice president of operations at Stevanato Group. Recently, trials of the automatic inspection platform have produced promising results, including the ability to reduce falsely rejected products tenfold, with up to 99.9% accuracy, using deep-learning (DL) techniques. "Unlike traditional rule-based systems, DL models can generalize their predictions and be more flexible regarding variations," adds Pace. He also mentions that such advanced inspection performance can help to reduce the number of "gray" items, which are flagged on the production line but not rejected outright. Typically, such items require manual re-inspection, which adds time to the process. "This helps the entire process become more lean and have less waste, while maintaining and improving quality," he continues. The company is currently working to enhance detection accuracy for both liquid and lyophilized products, and is also developing an initiative to create pre-trained neural networks that could then adapt to specific defects and drugs. Producing such models will entail training the system with thousands of images, notes Pace.

Mary Page Bailey
