Session: 01-05-02: AI-Driven Modeling and Simulation for Aerospace Structures
Paper Number: 187303
Physics-Guided Neural Solver Architecture Design for Scalable Nonlinear Structural Responses Prediction
In recent years, machine learning approaches have been actively explored as alternatives to conventional numerical solvers. While such models can achieve high accuracy on their training distribution, their performance often depends heavily on the availability and distribution of labeled data, limiting generalization across problem settings, resolutions, and loading conditions. To improve physical consistency, physics-informed learning methods have been developed. These approaches enforce physical laws over the solution domain as additional loss terms, yet the underlying network architectures typically remain black boxes. This style of physics embedding is non-intrusive, making it easy to implement with any off-the-shelf neural architecture. Its effectiveness deteriorates, however, when the governing physics is unknown, incomplete, or difficult to access: without a physics-based loss, the model falls back on data alone and gains little insight into the physics itself.
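To make the loss-term style of physics embedding concrete, the following is a minimal sketch (not the authors' method) for a 1D Poisson problem -u'' = f: the PDE residual is evaluated by finite differences and added to the supervised data loss as a soft penalty, while the predictor itself stays a black box. The function name `physics_informed_loss` and the dense-data assumption are illustrative.

```python
import numpy as np

def physics_informed_loss(u_pred, u_data, f, h, w_phys=1.0):
    """Data loss plus a PDE-residual penalty for -u'' = f on a 1D grid.

    u_pred : predicted solution at all grid nodes
    u_data : labeled data (assumed dense here for simplicity)
    f      : source term sampled on the same grid
    h      : grid spacing
    """
    # Standard supervised (data) term
    data_loss = np.mean((u_pred - u_data) ** 2)

    # Physics term: enforce the discrete residual -u'' - f ~ 0
    # via a centered second difference at interior nodes.
    lap = (u_pred[:-2] - 2 * u_pred[1:-1] + u_pred[2:]) / h**2
    phys_loss = np.mean((-lap - f[1:-1]) ** 2)

    # The physics acts only as an extra loss term; the network
    # architecture itself is untouched (non-intrusive embedding).
    return data_loss + w_phys * phys_loss

# Example: u(x) = sin(pi x) solves -u'' = pi^2 sin(pi x)
x = np.linspace(0.0, 1.0, 101)
h = x[1] - x[0]
u_exact = np.sin(np.pi * x)
f = np.pi**2 * np.sin(np.pi * x)
loss_exact = physics_informed_loss(u_exact, u_exact, f, h)
loss_wrong = physics_informed_loss(np.zeros_like(x), u_exact, f, h)
```

Note that when the physics term is dropped (`w_phys=0`, or no residual available), the loss reduces to the purely data-driven case the paragraph describes.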
From a solver perspective, predicting nonlinear structural responses governed by partial differential equations is not merely a direct mapping from inputs to outputs, but a progressive error-correction process driven by residuals across multiple spatial scales. Classical multigrid methods exploit this structure by decomposing errors by frequency and resolving them hierarchically, leading to efficient and robust convergence. This perspective suggests that physics insight can be embedded directly into the solver architecture by organizing the learning process around residual evaluation and multiscale correction, rather than being imposed only through an explicit loss term on the solution. Building on this viewpoint, this study proposes a Hierarchical Laplacian Convolutional Neural Solver (HL-CNS). The approach is intrusive: the neural architecture (and its kernels) must be specifically designed and cannot be used directly with existing off-the-shelf networks. In exchange, it can learn even when the physics is only partially known or unknown, such as a material's nonlinear constitutive relationship in structural analysis.
At each resolution level of the proposed HL-CNS framework, the structural response is updated via residual evaluation and correction, with the underlying differential operator implicitly represented by a learnable Laplacian convolution. Instead of performing repeated iterative smoothing, HL-CNS employs residual-driven relaxation and correction mechanisms to propagate information across spatial scales within a hierarchical structure. By embedding multiscale error correction directly into the network architecture, the solver captures essential characteristics of classical multigrid methods while retaining the flexibility of data-driven learning. As a result, inference is carried out using a single fixed-depth hierarchical update, avoiding iterative refinement during prediction and enabling efficient and scalable evaluation.
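The shape of such a single fixed-depth update can be sketched as follows. This is an illustrative reading of the description above, not the authors' architecture: the function name `hlcns_level_update`, the two-level depth, the scalar relaxation weights, and the nearest choices of restriction and prolongation are all assumptions, and the kernel here is untrained (initialized near a discrete Laplacian), so the output is not a meaningful prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_same(u, kernel):
    """'Same' 1D convolution with zero padding, standing in for a
    learnable Laplacian convolution layer."""
    pad = len(kernel) // 2
    up = np.pad(u, pad)
    return np.array([up[i:i + len(kernel)] @ kernel
                     for i in range(len(u))])

def hlcns_level_update(f, kernel, alpha_fine, alpha_coarse):
    """One fixed-depth, two-level residual-correction update:
    relax, evaluate the residual through the learnable kernel,
    restrict, correct on the coarse level, prolong, and add back.
    No iterative refinement is performed at inference time."""
    u = alpha_fine * f                  # residual-driven relaxation step
    r = f - conv1d_same(u, kernel)      # residual via learnable Laplacian conv
    r_c = r[::2]                        # restriction to the coarser level
    e_c = alpha_coarse * r_c            # coarse-level correction (learned scaling here)
    e = np.zeros_like(u)                # prolongation by linear interpolation
    e[::2] = e_c
    e[1::2] = 0.5 * (e_c[:-1] + e_c[1:])
    return u + e

# Untrained weights, for shape illustration only: in practice the kernel
# and the relaxation/correction weights would be learned from data.
kernel = np.array([-1.0, 2.0, -1.0]) + 0.01 * rng.normal(size=3)
f = np.sin(np.pi * np.linspace(0.0, 1.0, 65))
u = hlcns_level_update(f, kernel, alpha_fine=0.5, alpha_coarse=0.25)
```

The point of the sketch is structural: the multiscale residual-correction loop lives inside the forward pass itself, so inference is one hierarchical sweep rather than an iterative solve.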
Numerical studies of nonlinear thermal diffusion and elastoplastic deformation demonstrate that HL-CNS consistently outperforms conventional data-driven neural architectures, including ResNet, fully convolutional networks, U-Net, and DeepONet, under comparable model sizes. Across all considered benchmarks, HL-CNS achieves the best or near-best prediction accuracy with compact and low-variance error distributions, indicating robust generalization across problems with different dimensions and underlying physics. In contrast, the benchmark models exhibit strongly task-dependent behavior and tend to degrade when the operator type or output structure changes. These results suggest that embedding solver-inspired hierarchical structures into neural architectures yields more robust and reliable performance than black-box off-the-shelf neural networks for structural response prediction.
Presenting Author: Xuandong Lu, Arizona State University
Presenting Author Biography: Xuandong Lu received his master’s degree in civil engineering from Central South University, Changsha, China, in 2023. He is now a Ph.D. student at Arizona State University, Tempe, AZ, USA. His research interests include high-dimensional dynamics modeling, physics-guided structural response prediction, uncertainty quantification and propagation, and structural health monitoring and damage identification.
Paper Type
Technical Presentation Only
