SIM-AM 2025

Multi-Camera Fault Detection in Fused Deposition Modelling

  • Kilambi, Shanthalakshmi (TU Delft)
  • Tournoy, Aster (TU Delft)
  • Amani, Muhamad (TU Delft)
  • Jovanova, Jovana (TU Delft)
  • Caglar, Baris (TU Delft)
  • Masania, Kunal (TU Delft)


Fused Deposition Modelling (FDM) is one of the most popular 3D printing technologies due to its affordability and accessibility. However, FDM frequently experiences printing errors that waste time, materials, and energy. These defects are especially common among inexperienced users, contributing to a failure rate of around 20%. The high failure rate can be attributed to the open-loop nature of 3D printing. Various methodologies have been explored to detect and correct common mistakes in FDM, including the analysis of image data, temperature, acoustic emissions, and electrical resistance. Among these approaches, image data analysis has proven particularly effective, with studies employing either single- or multi-camera setups. Despite these advancements, several gaps remain in current methodologies. Many studies focus on specific defect types, such as delamination, warping, or extrusion errors, and lack a comprehensive system covering all common error types. Additionally, many existing detection systems are not designed for in situ use, limiting their ability to detect printing errors in real time. We propose a fault detection system that uses two side cameras and a nozzle camera, making it easily accessible to any 3D printer user. It aims to identify a wide range of errors in situ by employing two YOLOv8 classification networks that classify images captured during printing into common 3D printing error classes. One network is trained to detect errors from the nozzle camera, while the other identifies errors from the side cameras. The nozzle and side camera networks achieve overall accuracies of 97.7% and 97.6%, respectively. This high prediction accuracy is achieved by training the networks on over 60,000 meticulously hand-labelled images. The three-camera system detects 12 types of 3D printing errors, surpassing the capabilities of state-of-the-art optical approaches. This is made possible by leveraging the complementary strengths of the different camera perspectives: the nozzle-mounted view is sensitive to extrusion-related anomalies, while the side views facilitate the detection of defects such as delamination and warping. By combining the classifications from these high-accuracy models, our unified system robustly detects and classifies the most common 3D printing errors in situ and in real time.
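For readers who want to prototype a comparable pipeline, the sketch below shows how two YOLOv8 classification models could be run on frames from a nozzle camera and two side cameras, with their top-1 predictions gathered into a single fault report using the ultralytics Python package. The weight filenames, image paths, class names, and the simple confidence-threshold fusion are illustrative assumptions and are not taken from the abstract.

```python
from ultralytics import YOLO

# Hypothetical weight files: one classifier per camera perspective (not from the paper).
nozzle_model = YOLO("nozzle_classifier.pt")  # YOLOv8-cls model for the nozzle camera
side_model = YOLO("side_classifier.pt")      # YOLOv8-cls model for the two side cameras

def classify_frame(model, image_path):
    """Classify one frame and return its top-1 label with the associated confidence."""
    result = model(image_path, verbose=False)[0]
    top1 = result.probs.top1
    return result.names[top1], float(result.probs.top1conf)

# Hypothetical frame paths captured during a print.
nozzle_pred = classify_frame(nozzle_model, "frames/nozzle.jpg")
side_preds = [classify_frame(side_model, p)
              for p in ("frames/side_left.jpg", "frames/side_right.jpg")]

# Naive fusion (assumed, not the authors' method): report any confident non-nominal class.
faults = {label for label, conf in [nozzle_pred, *side_preds]
          if label != "nominal" and conf > 0.5}
print("detected faults:", faults or "none")
```

In a real-time setting, the same two models would be invoked on a rolling stream of frames from each camera, with the fusion step replaced by whatever decision logic the printer controller uses to pause or correct the print.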