Heavy industries, particularly cement, steel and chemicals, are among the top greenhouse gas emitters, contributing about 25% of global CO2 emissions. Many of their processes rely on high-temperature heat that is primarily generated by burning fossil fuels. Fighting climate change requires lowering heavy industry emissions, yet these industries face tremendous challenges in doing so. Replacing equipment is rarely a viable route to reduce emissions, as these industries are capital intensive, with asset lifecycles of over 40 years. Alternative fuels come with their own challenges: limited availability and the difficulty of managing processes on fuel mixes. Meanwhile, meeting the goals of the Paris Agreement on climate change will require these industries to reduce annual emissions by 12-16% by 2030. Generative AI, when applied to industrial processes, can improve production yield, reduce quality variability and lower specific energy consumption, thereby reducing both operational costs and emissions.
Higher variability in processes and operations results in higher specific energy consumption (SEC) and higher emissions. This variability comes from material inconsistency (raw materials are extracted from the earth), varying weather and machine conditions, and the human inability to operate processes at peak efficiency 24 hours a day, every day of the week. AI can predict future variability in a process and its resulting impact on yield, quality and energy consumption. For example, if we can predict the quality of the clinker in advance, we can optimize heat energy and combustion in the cement kiln so that quality clinker is produced with minimum energy. Such optimization reduces energy consumption and, in turn, both energy emissions and process emissions.
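As a toy illustration of this kind of predictive modelling, the sketch below fits a one-step-ahead predictor of a quality index from lagged process variables on synthetic, kiln-like data. All variable names, coefficients and noise levels are invented for illustration and are not taken from any real plant or from IBM's models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly kiln data: quality responds with a lag to fuel changes.
n = 500
fuel = 10 + np.cumsum(rng.normal(0, 0.05, n))  # slowly drifting fuel rate
quality = np.empty(n)
quality[0] = 40.0
for t in range(1, n):
    quality[t] = 0.8 * quality[t - 1] + 0.9 * fuel[t - 1] + rng.normal(0, 0.2)

# One-step-ahead linear predictor from lagged quality and fuel.
X = np.column_stack([quality[:-1], fuel[:-1]])
y = quality[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(f"one-step-ahead RMSE: {rmse:.3f}")
```

With quality predicted an hour ahead, an operator (or an optimizer) has time to adjust fuel and combustion before off-spec clinker is produced, which is the mechanism the paragraph above describes.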
Foundation models make AI more scalable by consolidating model training, cutting its cost and effort by up to 70%. Their most common use today is in natural-language processing (NLP). Adapted appropriately, however, foundation models enable organizations to model complex industrial processes accurately, creating a digital twin of the process. These digital twins capture multivariate relationships between process variables, material characteristics, energy requirements, weather conditions, operator actions and product quality. With them, we can simulate complex operating conditions to find accurate operating set points for process “sweet spots.” For example, a cement kiln digital twin would recommend the fuel, air, kiln speed and feed settings that minimize heat energy consumption while still producing the right quality of clinker. Applying these optimized set points to the process yields efficiency improvements and energy reductions beyond what has been achieved before. The improved efficiency and SEC translate not only into EBITDA value, but also into reduced energy and process emissions.
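The “sweet spot” search can be sketched in miniature: fit a surrogate model of quality from historical data (synthetic here), then search candidate set points for the lowest fuel rate that still meets a quality target. The variables, ranges, units and target below are all hypothetical stand-ins, not a real digital twin.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history: fuel rate (t/h) and kiln speed (rpm) vs. a quality index.
fuel = rng.uniform(8, 14, 200)
speed = rng.uniform(3.0, 4.5, 200)
quality = 2.0 * fuel + 5.0 * speed + rng.normal(0, 0.5, 200)

# Fit a simple linear surrogate: quality ~ w0 + w1*fuel + w2*speed.
X = np.column_stack([np.ones_like(fuel), fuel, speed])
w, *_ = np.linalg.lstsq(X, quality, rcond=None)

# Search candidate set points: minimize fuel subject to a quality target.
target = 40.0
candidates = [(f, s) for f in np.linspace(8, 14, 61)
              for s in np.linspace(3.0, 4.5, 31)]
feasible = [(f, s) for f, s in candidates
            if w[0] + w[1] * f + w[2] * s >= target]
best_fuel, best_speed = min(feasible, key=lambda fs: fs[0])

print(f"recommended fuel ~ {best_fuel:.2f} t/h at kiln speed {best_speed:.2f} rpm")
```

A production digital twin would replace the linear surrogate with a far richer model and the grid search with a proper optimizer, but the shape of the workflow — model, simulate, recommend set points — is the same.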
Optimize industrial production with Foundation Models
Heavy industry has been optimizing processes with AI models for several years. Typically, regression models are used to capture process behavior, with each model covering one part of the process. Stitched together, this group of 10-20 models represents the overall behavior of the process, and an optimizer orchestrates them, like a conductor leading an orchestra, to generate optimized operating-point recommendations for the plant. However, this approach cannot capture process dynamics such as ramp-ups and ramp-downs, especially during disruptions. And training and maintaining dozens of regression models is not easy, making the approach a bottleneck for accelerated scaling.
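A minimal sketch of this stitched-models approach: two invented per-section regression models are chained, and a grid-search optimizer searches the shared control variables. Every coefficient, unit and range below is made up for illustration.

```python
import numpy as np

# Hypothetical per-section surrogates: each regression covers one part
# of the process, as in the stitched-models approach described above.
def preheater_model(feed, fuel):
    """Predict preheater exit temperature (deg C) from feed and fuel rates."""
    return 820 + 3.0 * fuel - 1.5 * feed

def kiln_model(exit_temp, kiln_speed):
    """Predict specific heat consumption (kcal/kg clinker)."""
    return 760 - 0.1 * (exit_temp - 850) + 12.0 * (4.0 - kiln_speed) ** 2

def orchestrate(feed):
    """Chain the section models and grid-search fuel and speed for minimum heat."""
    best = None
    for fuel in np.linspace(9, 13, 41):
        for speed in np.linspace(3.2, 4.6, 29):
            heat = kiln_model(preheater_model(feed, fuel), speed)
            if best is None or heat < best[0]:
                best = (heat, fuel, speed)
    return best

heat, fuel, speed = orchestrate(feed=200)
print(f"min heat {heat:.1f} kcal/kg at fuel {fuel:.1f} t/h, speed {speed:.2f} rpm")
```

Each new section of the process adds another model to maintain and another coupling for the orchestrator to handle, which is exactly why this style of solution becomes a bottleneck at scale.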
Today, foundation models are used mostly in natural language processing. They use the transformer architecture to capture long-range relationships between words (tokens, in generative AI terminology) in a body of text. These relationships are encoded as vectors, which are then used to generate content for a specific context (say, a rental agreement). The accuracy of the content generated from these vectors is impressive, as ChatGPT has demonstrated. What if we could represent time series data as a sequence of tokens? What if we could use the parallelized transformer architecture to encode multivariate time series data and capture long- and short-term relationships between variables?
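One common way to turn a multivariate time series into transformer tokens is to slice each channel into fixed-length patches and project each patch into the model dimension, as in patch-based approaches such as PatchTST. The sketch below shows only that tokenization step, with random weights standing in for a learned embedding layer; the shapes and sizes are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy multivariate series: 3 sensor channels, 96 time steps.
series = rng.normal(size=(3, 96))

patch_len, d_model = 16, 32
n_patches = series.shape[1] // patch_len  # 96 // 16 = 6 patches per channel

# Slice each channel into non-overlapping patches ("tokens").
patches = series.reshape(3, n_patches, patch_len)

# Linearly project each patch into the model dimension; the random matrix
# here stands in for a learned embedding layer.
W = rng.normal(size=(patch_len, d_model)) / np.sqrt(patch_len)
tokens = patches @ W  # shape: (channels, n_patches, d_model)

print(tokens.shape)
```

Once the series is a sequence of such tokens, the transformer's attention mechanism can relate patches far apart in time, which is how long- and short-term relationships between process variables get captured.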
IBM Research, in collaboration with IBM Consulting, has adapted the transformer architecture for time series data, with promising results. Using this technology, we can model an entire industrial process, say a cement kiln, with just one foundation model. The foundation model is trained for a process domain and captures the behavior of an entire asset and process class. For instance, a cement mill foundation model can capture the behavior of cement mills of several capacities. Every subsequent mill we deploy to therefore needs only fine-tuning of the “Cement Mill Foundation Model” rather than a full training process from scratch. This cuts model training and deployment time by half, making the technology viable for large-scale rollouts. We have observed that these foundation models are seven times as accurate as regression models. On top of that, they capture process dynamics, as they perform multivariate forecasting with good accuracy.
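The economics of fine-tuning versus full retraining can be illustrated with a deliberately simple stand-in: "pretrain" a shared linear model across several mill capacities, then adapt to a new mill by fitting only a small per-mill offset from a handful of samples. The data and model are synthetic and far simpler than a real foundation model, but the workflow — shared backbone, cheap per-site adaptation — is analogous.

```python
import numpy as np

rng = np.random.default_rng(3)

def make_mill_data(scale, n=300):
    """Hypothetical mill of a given capacity: power draw vs. feed rate."""
    feed = rng.uniform(50, 150, n) * scale
    power = 0.04 * feed + 1.5 * scale + rng.normal(0, 0.3, n)
    return feed, power

# "Pretrain": fit a shared slope and intercept across three mill capacities.
feeds, powers = [], []
for scale in (0.8, 1.0, 1.3):
    f, p = make_mill_data(scale)
    feeds.append(f)
    powers.append(p)
feed_all = np.concatenate(feeds)
power_all = np.concatenate(powers)
X = np.column_stack([feed_all, np.ones_like(feed_all)])
(slope, intercept), *_ = np.linalg.lstsq(X, power_all, rcond=None)

# "Fine-tune" on a new mill: freeze the shared slope and fit only a per-mill
# offset from a small sample, instead of retraining from scratch.
f_new, p_new = make_mill_data(1.1, n=20)
offset = float(np.mean(p_new - slope * f_new))

rmse = float(np.sqrt(np.mean((slope * f_new + offset - p_new) ** 2)))
print(f"fine-tuned RMSE on new mill: {rmse:.3f}")
```

The fine-tuning step touches one parameter and 20 samples rather than the whole model and the whole history, which is what makes per-site rollout fast.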
Generative AI powered future of heavy industry
Generative AI is poised to transform industrial production to an unforeseen level. It offers a way to rein in industrial emissions and increase productivity with minimal CAPEX impact and a positive EBITDA impact. IBM is engaging with several clients to bring this technology to the production floor and is seeing up to a 5% increase in productivity and up to a 4% reduction in specific energy consumption and emissions. We form a joint innovation team with the client teams and together train and deploy these models for use cases ranging from supply chain, production, asset and quality optimization to planning optimization. We have started deploying this technology in a large steel plant in India, a cement plant in Latin America and CPG manufacturing in North America.
Ultimately, it’s about people: the operators in the plant must embrace it, the process engineers should love it, and plant management must value it. That can only be achieved with effective collaboration and change management, which we focus on throughout the engagement. Let’s partner together on ushering in an era in which we can grow our production capacity without compromising on our sustainability ambitions, and create a better, healthier world for generations to come.
Source: ibm.com