
An open-source scalable architecture for finite elements with neural network based constitutive models (FENN)
Benjamin Alheit 1,2, Mathias Peirlinck 2,*, Sid Kumar 1,*
1 : Dept. of Materials Science & Engineering, Delft University of Technology
2 : Dept. of BioMechanical Engineering, Delft University of Technology
* : Corresponding author

Significant progress has been made in the development of neural networks (NNs) for the constitutive modeling of (anisotropic) hyperelastic solids, with applications ranging from biomechanics to soft robotics. Examples include constitutive artificial neural networks (CANNs) [1], input convex neural networks (ICNNs) [2, 3], and neural ordinary differential equations (NODEs) [4]. These neural network-based constitutive models (NNCMs) are far more flexible than traditional constitutive models while preserving important attributes such as objectivity, material symmetry, and polyconvexity. The ultimate goal of these constitutive models is to be deployed in a scalable finite element method (FEM) framework for analyzing large-scale mechanical boundary value problems.
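As an illustrative aside (not part of the abstract itself), these attributes are typically built in by construction: objectivity and isotropic material symmetry follow from expressing the strain energy through invariants of the right Cauchy-Green tensor, e.g.

```latex
W(\mathbf{F}) = \psi\!\left(I_1, I_2, J\right), \qquad
\mathbf{C} = \mathbf{F}^{\mathsf{T}}\mathbf{F}, \qquad
I_1 = \operatorname{tr}\mathbf{C}, \quad
I_2 = \tfrac{1}{2}\!\left[(\operatorname{tr}\mathbf{C})^2 - \operatorname{tr}(\mathbf{C}^2)\right], \quad
J = \det\mathbf{F},
```

where, for instance, the ICNN approach [2, 3] additionally obtains polyconvexity by making $\psi$ convex in its (polyconvex) arguments.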


We, however, demonstrate that integrating NNCMs into a conventional FEM framework destroys their compute scalability due to a mismatch in how NNCMs and FEM capitalize on hardware parallelization. To address this, we reimagine a scalable FEM architecture that takes synergistic advantage of the single-instruction-multiple-data (SIMD) architecture of NN frameworks, maintaining computational performance despite the additional floating-point operations that NNCMs introduce relative to traditional constitutive models. We demonstrate efficient compute scaling of the framework with increasing size and complexity of the mechanical boundary value problem. The architecture is realized in C++ using the deal.II finite element library [5] and the PyTorch deep learning library [6], and features distributed-memory parallelism. With the launch of an open-source codebase, we invite the solid mechanics community to adopt our code and start harnessing the power of NNCMs in their research and work.
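The core batching idea can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' C++/deal.II code): a small surrogate "NNCM" with random stand-in weights is evaluated once per quadrature point in a loop, as a conventional FEM assembly would do, and then once over all points gathered into a single batch, which is the SIMD-friendly access pattern NN frameworks are optimized for. Both orderings yield identical results; only the hardware utilization differs.

```python
import numpy as np

# Hypothetical two-layer MLP mapping three strain invariants to a scalar
# strain energy density; the weights are random stand-ins for a trained NNCM.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 16)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((16, 1)), rng.standard_normal(1)

def energy(x):
    """Evaluate the surrogate energy for a batch of invariant vectors x, shape (n, 3)."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Invariants at 10,000 quadrature points of a hypothetical mesh.
invariants = rng.standard_normal((10_000, 3))

# Conventional FEM coupling: one tiny NN inference per quadrature point
# (many small calls -> poor utilization of SIMD/vector hardware).
per_point = np.concatenate([energy(q[None, :]) for q in invariants])

# Batched coupling: gather all quadrature points and run a single inference.
batched = energy(invariants)

assert np.allclose(per_point, batched)
```

In the actual architecture this gather-evaluate-scatter step would sit inside the element assembly loop, with the batched inference dispatched to the deep learning backend.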


References:
[1] Kevin Linka et al. "Constitutive artificial neural networks: A fast and general approach to predictive data-driven constitutive modeling by deep learning". In: Journal of Computational Physics 429 (Mar. 2021), p. 110010. issn: 0021-9991. doi: 10.1016/j.jcp.2020.110010.
[2] Prakash Thakolkaran et al. "NN-EUCLID: Deep-learning hyperelasticity without stress data". In: Journal of the Mechanics and Physics of Solids 169 (Dec. 2022), p. 105076. issn: 0022-5096. doi: 10.1016/j.jmps.2022.105076.
[3] Faisal As'ad, Philip Avery, and Charbel Farhat. "A mechanics-informed artificial neural network approach in data-driven constitutive modeling". In: International Journal for Numerical Methods in Engineering 123.12 (June 2022), pp. 2738–2759. issn: 1097-0207. doi: 10.1002/nme.6957.
[4] Vahidullah Tac, Francisco Sahli Costabal, and Adrian B. Tepole. "Data-driven tissue mechanics with polyconvex neural ordinary differential equations". In: Computer Methods in Applied Mechanics and Engineering 398 (Aug. 2022), p. 115248. issn: 0045-7825. doi: 10.1016/j.cma.2022.115248.
[5] Daniel Arndt et al. "The deal.II Library, Version 9.5". In: Journal of Numerical Mathematics 31.3 (Sept. 2023), pp. 231–246. issn: 1569-3953. doi: 10.1515/jnma-2023-0089.
[6] Adam Paszke et al. "PyTorch: An Imperative Style, High-Performance Deep Learning Library". Dec. 2019. doi: 10.48550/arXiv.1912.01703. arXiv: 1912.01703.

