Tensor-decomposition-based A Priori Surrogate (TAPS) modeling for ultra large-scale simulations

Date

2025-03-18

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International

Abstract

A data-free, predictive scientific AI model, the Tensor-decomposition-based A Priori Surrogate (TAPS), is proposed for tackling ultra large-scale engineering simulations with significant speedup, memory savings, and storage gain. TAPS can effectively obtain surrogate models for high-dimensional parametric problems with an equivalent zetta-scale (10^21) number of degrees of freedom (DoFs). TAPS achieves this by directly obtaining reduced-order models through solving the governing equations in multiple independent variables simultaneously, such as spatial coordinates, parameters, and time. The paper first introduces an AI-enhanced finite element-type interpolation function, the convolution hierarchical deep-learning neural network (C-HiDeNN), combined with tensor decomposition (TD). Subsequently, the generalized space-parameter-time Galerkin weak form and the corresponding matrix form are derived. Through the choice of TAPS hyperparameters, an arbitrary convergence rate can be achieved. To demonstrate the capabilities of this framework, TAPS is then used to simulate a large-scale additive manufacturing process, achieving around 1,370x speedup, 14.8x memory savings, and 955x storage gain compared with the finite difference method on 3.46 billion spatial DoFs. As a result, the TAPS framework opens a new avenue for many challenging ultra large-scale engineering problems, such as additive manufacturing and integrated circuit design, among others.
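
To illustrate why a separated space-parameter-time representation yields such large memory and storage savings, the following minimal Python sketch evaluates a rank-R, CP-style decomposition u(x, p, t) ~ sum_m X_m(x) P_m(p) T_m(t) with simple 1D piecewise-linear interpolation per dimension. This is only an assumed, illustrative analogue of the idea; it does not reproduce the paper's C-HiDeNN-TD interpolants or the space-parameter-time Galerkin solver, and all grids, ranks, and names below are hypothetical.

    import numpy as np

    def separated_eval(grids, modes, x, p, t):
        """Evaluate u(x, p, t) from 1D mode functions sampled on 1D grids.

        grids: dict of 1D node arrays, e.g. {"x": ..., "p": ..., "t": ...}
        modes: dict of (n_nodes, R) arrays of nodal mode values per dimension
        """
        u = 0.0
        R = modes["x"].shape[1]
        for m in range(R):
            # each factor is a cheap 1D interpolation instead of a full-grid lookup
            fx = np.interp(x, grids["x"], modes["x"][:, m])
            fp = np.interp(p, grids["p"], modes["p"][:, m])
            ft = np.interp(t, grids["t"], modes["t"][:, m])
            u += fx * fp * ft
        return u

    # Storage comparison: a full tensor on an (nx, np_, nt) grid vs. its rank-R factors.
    nx, np_, nt, R = 1000, 200, 500, 8
    full_dofs = nx * np_ * nt            # 10^8 entries for the full grid
    separated_dofs = R * (nx + np_ + nt) # ~1.4 * 10^4 entries for the factors
    print(full_dofs, separated_dofs)

The design point is that the number of unknowns grows roughly additively in the grid sizes of each independent variable (times the rank) rather than multiplicatively, which is what makes equivalent zetta-scale DoF counts tractable in a reduced-order form.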