Getting costs and data volumes under control
A deep understanding of processes and systematic quality control are required to minimize waste. The welding processes are monitored with emission-based sensor technology, 3D imaging via triangulation, camera and OCT systems, and even computed tomography (CT) scans both on and off the production lines. Process monitoring generates vast amounts of data and drives up costs. Both are problematic, especially in regulated industries, as Christoph Hauck, Chief Technology and Sales Officer at toolcraft AG, reported. He formulated specific demands for AI: it should determine the ideal parameter configuration for predictive process planning in order to prevent errors before they occur. "Real-time error detection in the process, which minimizes the need for expensive CT scans, would also be very helpful to the industry," he explained. For additive processes, he would like to see automatic parameter adjustment to new powder batches, geometries, and machine variants, adaptive real-time control, and first-time-right production so that all quality requirements are met right away, even for a batch size of one. And to implement load-optimized designs in powder bed processes, companies need process strategies that build flawless structures along the load paths and otherwise run in speed mode, producing AM components faster and more cost-effectively.
The examples show that applications are becoming more complex, components more valuable, and the variety of processes in which AI is used to optimize production based on data is growing. At the same time, industrial applications demand traceability. Hauck was not the only one to hint at how unsatisfactory the status quo is: companies generate large amounts of data, but too often it ends up in silos. Where it is used at all, the insights gained are often minimal. Causalities remain unclear, errors are assessed manually and subjectively, and the effort involved in testing and process qualification is getting out of hand. This is precisely where AI comes in.
Data strategy and cloud platforms
However, a common theme throughout all of the presentations was that adapting AI to the respective processes is a challenging, interdisciplinary task that requires care and strategic clarity. This begins with the IT infrastructure. Operating their own data centers and storage hardware is hardly profitable or practical for most companies. Cloud-based high-performance computing and cloud-based data platforms make process, sensor, machine, and test data usable where it is needed – in cloud-based simulations or on production-oriented edge computers. The coexistence of centralized and decentralized infrastructure and of heterogeneous data formats must be managed strategically so that the value hidden in raw data can be exploited. AI tools help to harmonize the raw data and thereby make it usable. "The more this generated data can actually be used for AI and machine learning models, the better and more realistic they become. It is a self-reinforcing process in which the data platform becomes the central driver of innovation in the company," said Kiene. As long as data remains locked in silos, however, this process never gets going.
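What such harmonization looks like depends on the platform. As a minimal sketch, assuming hypothetical field names, units, and sources rather than any system mentioned here, heterogeneous machine and sensor records might be mapped onto one common schema before they are fed to AI models:

```python
# Minimal sketch: normalizing heterogeneous sensor/machine records into one schema.
# Field names, units, and sources are illustrative assumptions only.
from datetime import datetime, timezone

TARGET_SCHEMA = ("timestamp_utc", "machine_id", "laser_power_w", "temperature_c")

def normalize_record(raw: dict, source: str) -> dict:
    """Map one raw record from a known source into the common schema."""
    if source == "machine_plc":        # hypothetical PLC export: epoch ms, power in kW
        return {
            "timestamp_utc": datetime.fromtimestamp(raw["ts_ms"] / 1000, tz=timezone.utc),
            "machine_id": raw["machine"],
            "laser_power_w": raw["power_kw"] * 1000.0,
            "temperature_c": None,
        }
    if source == "pyrometer":          # hypothetical sensor log: ISO time, temperature in °C
        return {
            "timestamp_utc": datetime.fromisoformat(raw["time"]).astimezone(timezone.utc),
            "machine_id": raw["station"],
            "laser_power_w": None,
            "temperature_c": raw["temp_c"],
        }
    raise ValueError(f"unknown source: {source}")

# Example: two records from different silos end up in the same schema
records = [
    normalize_record({"ts_ms": 1717000000000, "machine": "LMD-01", "power_kw": 2.4}, "machine_plc"),
    normalize_record({"time": "2024-05-29T16:26:40+02:00", "station": "LMD-01", "temp_c": 1510.0}, "pyrometer"),
]
assert all(set(r) == set(TARGET_SCHEMA) for r in records)
```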
Photonics is a case in point. It faces particular challenges: heterogeneous processes, data availability that leaves room for improvement, and process understanding confined to a few specialists. Thomas Koschke from BCT Steuerungs- und DV-Systeme and Max Zimmermann from Fraunhofer ILT described these challenges vividly at the conference. To train AI for the parameterization and control of robot-assisted laser metal deposition (LMD) processes, they had to create the data basis themselves. "For good LMD results, you have to set many parameters, which often interact with each other. If, for example, the feed rate and laser power are not matched, there is a risk of overheating or of the powder not melting properly. Both affect quality," explained Koschke, referring to just two of the many parameters. Since testing all parameter combinations is not practical, AI is intended to support process setup first and inline process monitoring later.
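The article does not disclose how the partners' AI actually searches the parameter space. As an illustration of the general idea only, a surrogate model can learn quality from a handful of test runs and propose the next promising feed-rate/laser-power combination instead of testing every variant; the quality scores and process window below are hypothetical:

```python
# Sketch of surrogate-assisted parameter search (not the partners' actual method):
# a Gaussian process learns quality from a few test runs and proposes the next
# promising (feed rate, laser power) pair instead of testing all combinations.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical measured trials: [feed rate mm/s, laser power W] -> quality score (0..1),
# e.g. derived from melt-pool monitoring or CT evaluation of test welds.
X_tried = np.array([[5.0, 800.0], [10.0, 1200.0], [15.0, 1600.0], [10.0, 2000.0]])
y_quality = np.array([0.55, 0.82, 0.70, 0.40])

# In practice the inputs should be scaled to comparable ranges before fitting.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_tried, y_quality)

# Candidate grid over an illustrative admissible process window
feed, power = np.meshgrid(np.linspace(3, 20, 30), np.linspace(600, 2200, 30))
candidates = np.column_stack([feed.ravel(), power.ravel()])

# Upper-confidence-bound acquisition: prefer high predicted quality and high uncertainty
mean, std = gp.predict(candidates, return_std=True)
next_trial = candidates[np.argmax(mean + 1.0 * std)]
print("next trial: feed %.1f mm/s, power %.0f W" % (next_trial[0], next_trial[1]))
```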
When it comes to the details, the exact time stamp counts
However, the road to series production readiness has been rocky. The AI had to learn error detection and process control, and generating this training data is where the devil was in the details. To visualize the data, BCT software mapped images, temperature measurements, laser power, voltage, and other sensor data onto the topology of the component. Problems became apparent as soon as the data had to be processed into uniform formats, scales, and standards: time stamps were incorrect, a misaligned nozzle distorted the data, and there were other inconsistencies. "It all comes down to the details when collecting data," Koschke warned. "You can't expect good prediction results with poor data," Zimmermann added.
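The specific tooling BCT uses is not described in detail. Purely as an illustration of why exact time stamps matter, the following minimal sketch, with hypothetical column names and sampling rates, aligns camera frames and machine signals onto one common time base; an undetected clock offset between sensors would silently pair each image with the wrong machine state:

```python
# Sketch: merging multi-sensor streams onto a common time base (column names and
# values are illustrative, not the actual BCT data model).
import pandas as pd

frames = pd.DataFrame({                      # camera frames, ~100 Hz
    "t": pd.to_datetime(["2024-05-29 10:00:00.000", "2024-05-29 10:00:00.010",
                         "2024-05-29 10:00:00.020"]),
    "melt_pool_area_px": [410, 435, 780],
}).sort_values("t")

machine = pd.DataFrame({                     # PLC signals, ~1 kHz
    "t": pd.to_datetime(["2024-05-29 10:00:00.001", "2024-05-29 10:00:00.011",
                         "2024-05-29 10:00:00.019"]),
    "laser_power_w": [1200.0, 1205.0, 1850.0],
}).sort_values("t")

# Nearest-neighbour join within a 5 ms tolerance: each image gets the machine
# state recorded closest in time, provided both clocks actually agree.
aligned = pd.merge_asof(frames, machine, on="t",
                        direction="nearest", tolerance=pd.Timedelta("5ms"))
print(aligned)
```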
There was another problem: some anomalies were not caused by process errors but by faulty sensor configurations. "Before the data model can be constructed from in-situ data and ex-situ data from metallographic analyses and CT scans, all inconsistencies must be clarified," Zimmermann explained. Once all data is precisely assigned in time and space, however, the AI model performs excellently. Because labeling is also time-consuming, the partners had AI and process experts work hand in hand: the process experts labeled a small set of images, the AI trained on this data then evaluated entire data sets, and the experts corrected its output as necessary. With this human-in-the-loop approach, the team created the data basis needed to develop the AI further toward series production.
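How the partners implemented their loop is not detailed in the article. As a minimal sketch of the general human-in-the-loop pattern, assuming hypothetical per-image features and a generic classifier, experts label a small seed set, the model pre-labels the rest, and only low-confidence cases go back to the experts:

```python
# Sketch of a generic human-in-the-loop labeling loop (illustrative, not the
# partners' implementation): experts label a seed set, the model pre-labels the
# rest, and only uncertain samples are routed back for expert review.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 16))                 # hypothetical per-image features
expert_labels = (features[:40, 0] > 0).astype(int)    # small expert-labeled seed set

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features[:40], expert_labels)

# Pre-label the unlabeled pool and measure prediction confidence
proba = model.predict_proba(features[40:])
confidence = proba.max(axis=1)
auto_labeled = confidence >= 0.8                      # accepted without review
needs_review = ~auto_labeled                          # routed back to the process experts

print(f"auto-labeled: {auto_labeled.sum()}, flagged for expert review: {needs_review.sum()}")
# Corrected labels from the experts would be added to the training set and the
# model retrained, iterating until the data basis is good enough for series use.
```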