
Design of Additive Manufacturing Processes: AI-Based Real-Time Power Prediction Models
Additive Manufacturing (AM) processes are widely used to create complex three-dimensional products by depositing material layer by layer, in contrast to subtractive manufacturing, where material is removed to shape the final product. In wire-arc directed energy deposition (wa-DED), a generative scheme builds parts from the base up using the power supplied by a welding arc. Optimizing wa-DED processes requires accurate prediction of welding power, deposition speed, and thermal history; these quantities are typically estimated from material properties, part geometry, the deposition strategy, and the limitations of the printing machine. Traditionally, experimental methods, numerical simulations, physics-based models, and empirical approaches have been used to determine optimal AM process parameters. In recent years, however, data science techniques have also been applied to the design, optimization, and control of AM processes, enabling powerful prediction and control schemes built from existing data.

In this work, AI-based data models are developed to predict the welding power for multi-layer wa-DED wall structures. The models are trained on databases generated from validated numerical simulation results, and combine data solvers with interpolation technologies that disentangle data trends in multi-dimensional search spaces. First, a snapshot matrix of process scenarios covering variations in welding power across layers was defined, and verified finite element (FE) simulations were performed to create a small database. This database was then used to build the initial data model from the best combination of data solvers and interpolators.
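The abstract does not name the specific data solvers or interpolators used; as an illustrative sketch only, the step "snapshot matrix of FE scenarios → interpolation-based real-time data model" could look like the following, here using a radial-basis-function interpolator. All scenario coordinates (layer index, deposition speed) and power values are invented placeholders, not data from the study.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical snapshot matrix: each row is one process scenario,
# parameterized here by (layer index, deposition speed in mm/s).
scenarios = np.array([
    [1, 5.0], [1, 8.0],
    [5, 5.0], [5, 8.0],
    [10, 5.0], [10, 8.0],
], dtype=float)

# Illustrative welding power values (W) standing in for the outputs
# of the validated FE simulations, one per scenario.
power = np.array([2100.0, 2500.0, 1900.0, 2300.0, 1750.0, 2150.0])

# Interpolation-based data model over the multi-dimensional
# search space; evaluating it is cheap enough for real-time use.
model = RBFInterpolator(scenarios, power, kernel="thin_plate_spline")

# Real-time query: predict welding power for an unseen scenario.
query = np.array([[7.0, 6.5]])
predicted_power = model(query)
print(predicted_power)
```

Because the RBF interpolant (with no smoothing) passes exactly through the snapshot data, the model reproduces the FE database at the sampled scenarios while providing fast estimates in between.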
After validating this data model, normalized prediction errors were calculated, and a second-generation data model was created by applying corrective schemes to these errors. The new data model was then validated and used to create a generative database.
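One way to read the corrective scheme, sketched below under assumptions not stated in the abstract: evaluate the first-generation model on validation scenarios, compute the normalized errors against the FE reference, interpolate the error field itself, and use it to correct subsequent predictions. All arrays and names are hypothetical illustrations.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical validation scenarios (layer index, deposition speed),
# with FE reference power and first-generation model predictions.
val_x = np.array([[2, 6.0], [4, 7.0], [8, 5.5], [9, 7.5]], dtype=float)
fe_power = np.array([2050.0, 2180.0, 1820.0, 2120.0])
gen1_pred = np.array([2110.0, 2150.0, 1900.0, 2060.0])

# Normalized error of the first-generation data model.
norm_err = (gen1_pred - fe_power) / fe_power

# Corrective model: interpolate the normalized-error field
# over the same search space.
err_model = RBFInterpolator(val_x, norm_err, kernel="thin_plate_spline")

def gen2_predict(x, gen1_value):
    """Second-generation prediction: first-generation output
    divided out by the interpolated normalized error."""
    return gen1_value / (1.0 + err_model(x))

# At the validation points the correction recovers the FE reference.
corrected = gen2_predict(val_x, gen1_pred)
print(np.round(corrected, 1))
```

The corrected model could then be evaluated densely over the search space to populate a generative database of process scenarios.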