HiDeNN-TD: Reduced-order hierarchical deep learning neural networks
Date
2022-02-01
Citation of Original Publication
Zhang, Lei, Ye Lu, Shaoqiang Tang, and Wing Kam Liu. “HiDeNN-TD: Reduced-Order Hierarchical Deep Learning Neural Networks.” Computer Methods in Applied Mechanics and Engineering 389 (February 1, 2022): 114414. https://doi.org/10.1016/j.cma.2021.114414.
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
This paper presents a tensor decomposition (TD) based reduced-order model of the hierarchical deep-learning neural networks (HiDeNN). The proposed HiDeNN-TD method retains the advantages of both the HiDeNN and TD methods. Its automatic mesh adaptivity makes HiDeNN-TD more accurate than the finite element method (FEM) and than conventional proper generalized decomposition (PGD) and TD, while using only a fraction of the FEM degrees of freedom. This work focuses on the theoretical foundation of the method. Accordingly, the accuracy and convergence of the method are studied both theoretically and numerically, with comparisons to FEM, PGD, TD, HiDeNN, and deep neural networks. In addition, we show theoretically that the PGD/TD solution converges to the FEM solution as the number of modes increases, and that the PGD/TD solution error is the sum of the mesh discretization error and the mode reduction error. The proposed HiDeNN-TD achieves high accuracy with orders of magnitude fewer degrees of freedom than FEM, and hence has a high potential for fast, highly accurate computations on large-scale engineering and scientific problems. As a trade-off between accuracy and efficiency, we also propose a highly efficient solution strategy called HiDeNN-PGD. Although less accurate than HiDeNN-TD, HiDeNN-PGD still provides higher accuracy than PGD/TD and FEM at only a small additional cost over PGD.
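As background for the error statement above, PGD/TD-type methods seek a separated-variable (finite-rank) representation of the solution field. A generic sketch for a 2D field is shown below; the symbols $X_m$, $Y_m$, and the FEM reference solution $u_h$ are illustrative notation, not taken from the paper:

```latex
% Rank-M separated representation characteristic of PGD/TD-type methods
u(x,y) \;\approx\; u_M(x,y) \;=\; \sum_{m=1}^{M} X_m(x)\, Y_m(y)

% Error split consistent with the abstract (triangle inequality),
% where u_h denotes the FEM solution on a mesh of size h:
\| u - u_M \| \;\le\;
  \underbrace{\| u - u_h \|}_{\text{mesh discretization error}}
  \;+\;
  \underbrace{\| u_h - u_M \|}_{\text{mode reduction error}}
```

As $M$ increases, the mode reduction term vanishes and $u_M$ recovers the FEM solution, which is the convergence behavior the abstract describes.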