DIGITAL TECHNOLOGIES IN COGNITIVE PRODUCTION PLANTS
“SPIRE-06” is (presently) an informal discussion group of the projects funded under the H2020 DT-SPIRE-06-2019 topic, which focuses on digital technologies for improved performance in cognitive production plants. It includes representatives from the CAPRI, COGNIPLANT, COGNITWIN, FACTLOG, HyperCOG and INEVITABLE projects. The group is exploring ways in which the projects can collaborate, share information and organise joint activities of mutual interest, both for the projects themselves and for the wider community of interest/practice on cognitive production plants. It is expected that, in the near future, the group will become a formal cluster of these projects.
Increasing the efficiency of a biomass-powered lime kiln
The COGNIPLANT approach uses three layers. The first involves sensing and data virtualisation, connecting and structuring the data from sensors and production equipment. The second layer is where data analytics happens, using data mining and big-data processing to extract information about the processes and the overall performance of the plant. Finally, the “Co-Decide” layer hosts the digital twin, the heart of the COGNIPLANT project, which makes it possible to simulate the operation of the plants and then improve their performance.
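The three layers can be pictured as a simple pipeline. The sketch below is purely illustrative, assuming made-up channel names and a made-up KPI; it is not COGNIPLANT's actual software, only a minimal way to see how data flows from sensing through analytics to the decision layer.

```python
# Illustrative sketch of the three COGNIPLANT layers.
# All function names, channels and numbers are invented for the example.

def sense(raw_readings):
    """Layer 1: structure raw sensor data into named channels."""
    return {name: value for name, value in raw_readings}

def analyse(channels):
    """Layer 2: extract a simple performance indicator,
    here energy used per tonne of output."""
    return channels["energy_kwh"] / channels["output_tonnes"]

def co_decide(kpi, target):
    """Layer 3 ('Co-Decide'): compare the indicator against a
    simulated target and suggest a direction for improvement."""
    return "reduce fuel feed" if kpi > target else "hold settings"

readings = [("energy_kwh", 4200.0), ("output_tonnes", 35.0)]
kpi = analyse(sense(readings))        # 120 kWh per tonne
print(co_decide(kpi, target=110.0))   # -> reduce fuel feed
```

In the real project the middle layer involves big-data analytics and the top layer a full digital twin; the point here is only the separation of concerns between the three layers.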
Fornaci Calce Grigoli, a company in north-eastern Italy that produces quicklime, is one of the four pilot plants of COGNIPLANT. They have three lime kilns at their facility, but COGNIPLANT is focusing on the biggest of the three, a twin-shaft parallel-flow regenerative kiln. In kilns of this type, two interconnected vertical shafts are alternately fired to achieve excellent energy efficiency. The kiln is powered by biomass (sawdust), and the process is slow, with quicklime extracted around 30-40 hours after limestone is added to the kiln.
At the start of the project, the team identified three ways to improve the production process. The first is improving lime quality, especially for larger-sized lime, which has a higher chance of containing residual carbon dioxide that can affect product quality. The second is increasing combustion efficiency through improved monitoring, and the third is reducing the number of kiln stops through predictive maintenance. All three were translated into KPIs that are monitored via sensors, and the resulting data is fed to the digital twin, enabling better prediction of lime quality.
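As a toy illustration of a quality KPI of this kind, the sketch below flags batches whose predicted residual CO2 exceeds a threshold, with larger stone sizes penalised since, as noted above, larger lime has a higher chance of retaining carbon dioxide. All numbers and names are invented, not the project's actual model.

```python
# Hypothetical quality KPI check; constants are illustrative only.

def residual_co2_pct(base_pct, stone_size_mm, size_penalty=0.02):
    """Toy model: predicted residual CO2 grows with stone size."""
    return base_pct + size_penalty * stone_size_mm

def quality_ok(batch, max_co2_pct=2.0):
    """True if the batch's predicted residual CO2 is within spec."""
    return residual_co2_pct(*batch) <= max_co2_pct

batches = [(1.0, 40), (1.0, 60)]          # (base %, stone size in mm)
print([quality_ok(b) for b in batches])   # -> [True, False]
```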
A cognitive automation platform for asphalt production
The process industry in the EU is facing a shortage of raw materials, increasing energy prices and environmental constraints that demand greater adaptability, flexibility and automation. The CAPRI project is working towards the digital transformation of the process industry by developing cognitive solutions: it is developing and testing a cognitive automation platform (CAP) and transforming traditional process factories into cognitive plants by integrating cognitive tools across all industrial automation levels.
The CAP is modular and scalable, so that advanced applications can be developed and integrated on top of it, and replicated for other SPIRE sectors. It includes a toolbox of cognitive solutions for planning, operation, control and sensing. Overall, the CAPRI system is being tested in three use case sites covering the asphalt, steel, and pharmaceutical industries.
In the asphalt use case, a number of cognitive solutions are being implemented, covering planning of production, predictive maintenance of the baghouse, control of the asphalt drum, a sensor for bitumen content in the recycled asphalt, and a sensor for measuring the amount of filler present in the cold aggregates.
Predicting ladle degradation in the steelmaking process
Nenad Stojanovic of Nissatech explained how they are using their system to predict ladle degradation in the harsh conditions of steelmaking. The process industry is characterised by dynamic situations that require predicting the future behaviour of a system even as that system constantly changes. Equipment degradation is a common problem: there is a need to predict the status of a tool attached to a machine while it is being used. In the steel industry, harsh conditions, high-dimensional spaces and various other unknown phenomena make this particularly difficult.
Sidenor is a market leader in the European special steel long product industry as well as an important supplier of cold finished products in the European market, and it acts as a use case in several of the projects within this cluster. COGNITWIN has been working with them to predict how the lining of the ladle, used to hold molten metal in the steel production process, changes during the dynamic processes that occur during production. Normally, this evaluation is done manually, using visual observations or by measuring the thickness of the refractory bricks in the lining, but COGNITWIN aims to put the decision on an objective, data-driven footing. This will be done by creating a digital twin of the ladle to model its behaviour, providing predictions of brick thickness after a given number of heats, a recommendation on whether the ladle should be used again, and a prediction of how many more uses the ladle can withstand.
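The kind of prediction described above can be sketched very crudely as a remaining-useful-life estimate. The example below assumes, purely for illustration, a roughly constant wear rate per heat inferred from past thickness measurements; COGNITWIN's digital twin models far more of the ladle's dynamics than this.

```python
# Hypothetical remaining-use estimate for a ladle lining.
# All thicknesses and thresholds are invented example values.

def wear_rate(thicknesses_mm):
    """Average mm of refractory lost per heat, from a sequence of
    thickness measurements taken after successive heats."""
    losses = [a - b for a, b in zip(thicknesses_mm, thicknesses_mm[1:])]
    return sum(losses) / len(losses)

def remaining_heats(current_mm, min_safe_mm, rate_mm_per_heat):
    """Predicted number of further heats before the lining reaches its
    minimum safe thickness; 0 means the ladle should be relined."""
    if rate_mm_per_heat <= 0:
        raise ValueError("wear rate must be positive")
    return max(0, int((current_mm - min_safe_mm) / rate_mm_per_heat))

history = [180.0, 176.5, 173.2, 169.8]   # brick thickness after each heat
rate = wear_rate(history)                 # ~3.4 mm per heat
print(remaining_heats(history[-1], min_safe_mm=150.0,
                      rate_mm_per_heat=rate))   # -> 5 more heats
```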
Monitoring and optimisation of the electric arc furnace process
At present, few process measurements are taken in the electric arc furnace (EAF) process due to the difficulty of taking measurements at such high temperatures, which means that some important process values cannot be measured continuously, or at all. As such, most operators currently rely on indirect measurements and intuition built through experience when actuating the EAF.
The idea behind the project is that existing production sites can be retrofitted with the supervisory control systems to improve performance by increasing process efficiency, equipment reliability, productivity and many other KPIs while at the same time reducing greenhouse gas emissions. The systems will use first principle modelling, advanced data analytics, and data mining to create digital twins, soft sensors, decision support systems and more.
Existing sensor systems in the use case site will be used for data acquisition. This data will then be transferred from the process databases into the digital environments of the project (Siemens Edge and Siemens MindSphere), where it can be used in advanced digital solutions for process improvement. There will be solutions and tools for operational data analysis, as well as others for the monitoring, simulation, and optimisation of the EAF process.
Theoretical models will provide operators with estimates of unmeasured process values and allow them to simulate different operational scenarios and examine their influence on performance. These approaches could in future be applied to other industrial processes that exhibit similar characteristics, such as wastewater treatment plants and cell growth for biological medicines.
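A model that estimates an unmeasured value from measured ones is often called a soft sensor. The sketch below is a toy example of the idea, assuming an invented energy balance with made-up constants; the project's actual first-principle models are far richer.

```python
# Illustrative soft sensor: estimate an unmeasured steel bath
# temperature from measured electrical energy input.
# All constants are invented for the example.

def estimate_bath_temp_c(energy_kwh, charge_tonnes,
                         start_temp_c=25.0,
                         kwh_per_tonne_per_degree=0.21,
                         efficiency=0.75):
    """Toy energy balance: the useful fraction of the electrical
    energy raises the charge temperature in proportion to its mass."""
    useful_kwh = energy_kwh * efficiency
    rise = useful_kwh / (charge_tonnes * kwh_per_tonne_per_degree)
    return start_temp_c + rise

# With these illustrative numbers the operator would see ~1525 degC:
print(round(estimate_bath_temp_c(energy_kwh=42000, charge_tonnes=100)))
```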
Optimisation of the steelmaking process through a hyperconnected and cognitive cyber-physical architecture
Coordinator Beatriz Chicote began her talk by introducing HyperCOG's innovative node architecture, which enables every aspect of the manufacturing process to interact and communicate in real time, aiding intelligent decision making and cognitive systems and providing interconnectivity and interoperability. HyperCOG has developed 11 node types so far, designed to optimise the plant's scheduled production process by automating the solution of on-line production planning problems caused by unexpected events, a job currently done by humans at the plant.
The current steelmaking process at Sidenor is carried out in several steps, starting with the raw material (scrap), which passes through an electric arc furnace and into a ladle before secondary metallurgy and grading, casting and product formation. Planning the sequences of these stages is currently done offline by humans and determined by available heat. Changing from one sequence to another leads to downtime where production is stopped. HyperCOG's node architecture is now being used to demonstrate how this can be optimised, solving both offline and online production planning problems. The idea is to create a digital twin of the process to model effective interventions in real time.
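To make the sequencing problem concrete, the sketch below shows a deliberately naive approach, assuming invented grade labels: group heats of the same steel grade together so that sequence changes, and hence changeover downtime, are minimised. HyperCOG's actual solvers handle on-line replanning under unexpected events, which this toy example does not attempt.

```python
# Toy sequencing sketch; grades and heat IDs are invented.

def sequence_heats(heats):
    """Group heats of the same grade together, preserving the first
    appearance order of each grade, to reduce changeovers."""
    order = []
    for grade, _ in heats:
        if grade not in order:
            order.append(grade)
    # Python's sort is stable, so heats keep their order within a grade.
    return sorted(heats, key=lambda h: order.index(h[0]))

def changeovers(sequence):
    """Count grade changes (each one costing downtime) in a sequence."""
    return sum(1 for a, b in zip(sequence, sequence[1:]) if a[0] != b[0])

heats = [("A", 1), ("B", 2), ("A", 3), ("C", 4), ("B", 5)]
plan = sequence_heats(heats)
print(changeovers(heats), "->", changeovers(plan))   # 4 -> 2
```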
A monitoring tool is being developed at the Sidenor pilot site, where it is being used to configure and verify the HyperCOG architecture. The tool allows the partners to monitor the state of the nodes, the communication between them and the data being collected. This enables the configuration of the nodes along the process and the creation of new nodes where necessary. Dynamic modelling of the process is underway and algorithms are being developed.
Cognitive digital twins in an oil refinery use-case
The FACTLOG project is developing the concept of a cognitive factory as an ensemble of independent but connected enhanced cognitive twins (ECTs) that can self-learn, effectively detect and react to anomalies and disruptions, maintain both a local and a global view of operations, and are capable of reasoning for optimisation. By connecting various digital twins with different optimisation approaches, it enhances the overall cognition capabilities.
In an oil refinery, crude oil comes into large debutanisers, then flows through pipes to deethanisers, and eventually the refined product ends up in a final LPG (liquid petroleum gas) tank. Faults or changes in different parts of the process can lead to different levels of quality at the end. To help return the process to an optimal state, factors such as temperatures and pressures can be fine-tuned. At any given time, the FACTLOG system looks to identify non-optimal production and works out how to return it to optimal production using the controls available to it.
The cognitive digital twins monitor the production processes, with all product performance indicators based on energy consumption and productivity. They capture and predict events or anomalies based on product qualities and energy consumption, and quantify the impact of those events on the current plan, providing a time-dependent estimate of the impact on product tanks and the pattern of anomalies. Finally, they propose alternatives for handling the events and evaluate them, making recommendations to alleviate the impact on final or intermediate production tanks and globally optimising processes to improve energy efficiency and productivity.
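The monitor-detect-evaluate loop described above can be sketched in miniature. Everything below is invented for illustration (thresholds, tonnages, action names); it is not FACTLOG's API, only the shape of the loop: detect an anomaly, quantify its accumulated impact, then pick the cheapest corrective alternative.

```python
# Purely illustrative monitor -> detect -> evaluate loop.

def detect_anomaly(readings, low, high):
    """Return indices of readings outside the acceptable quality band."""
    return [i for i, r in enumerate(readings) if not (low <= r <= high)]

def impact_tonnes(anomaly_indices, tonnes_per_reading=2.5):
    """Rough time-dependent impact: off-spec output accumulated
    for as long as the anomaly persists."""
    return len(anomaly_indices) * tonnes_per_reading

def best_alternative(alternatives):
    """Pick the corrective action with the lowest estimated cost."""
    return min(alternatives, key=lambda a: a[1])[0]

quality = [0.95, 0.96, 0.71, 0.68, 0.94]   # e.g. LPG purity samples
bad = detect_anomaly(quality, low=0.90, high=1.00)
print(impact_tonnes(bad))                   # 5.0 tonnes off-spec
print(best_alternative([("raise reboiler temp", 3.0),
                        ("reduce feed rate", 5.5)]))
```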