
Implicit automatic differentiation and implicit layers for constitutive modeling
Jérémy Bleyer¹
¹ Laboratoire Navier, Ecole des Ponts, Univ Gustave Eiffel, CNRS, Marne-la-Vallée, France

The intersection of computational mechanics and machine learning is advancing rapidly, particularly in the domain of constitutive modeling. Constitutive models, being inherently data-driven, share structural similarities with neural network layers. Beyond data-driven learning, there is significant potential in adapting traditional constitutive models for seamless integration with machine learning frameworks.

Automatic differentiation (AD) has revolutionized machine learning by enabling the computation of derivatives for complex functions constructed from simpler operations. In this talk, we will explore how AD can enhance constitutive modeling, focusing on two critical applications: efficient tangent operator computation and material parameter sensitivity.
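To make the composition principle behind AD concrete, the following is a minimal sketch of forward-mode AD using dual numbers: each elementary operation propagates a value together with its derivative, so the derivative of an arbitrary composite expression emerges automatically. The `Dual` class and the example function are illustrative constructions, not part of any library mentioned in the talk.

```python
import math

class Dual:
    """Minimal forward-mode AD: carry a value and its derivative together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        # product rule, applied locally to one elementary operation
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule for one elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# Derivative of f(x) = x*sin(x) + 3x at x = 2, obtained by seeding der = 1:
x = Dual(2.0, 1.0)
y = sin(x) * x + 3 * x
# y.der now holds f'(2) = sin(2) + 2*cos(2) + 3, assembled purely from local rules
```

Reverse-mode AD, which machine learning frameworks favor, applies the same local-rule idea but accumulates derivatives backward through the computation graph.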

First, AD can automate the derivation of consistent tangent operators, which are crucial for the efficient integration of constitutive models into large-scale simulations. We will examine strategies for obtaining these operators, including directly applying AD to unrolled algorithms and leveraging implicit differentiation for greater efficiency. Additionally, AD enables the computation of material parameter sensitivities, facilitating more effective model calibration and optimization.
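As an illustration of the implicit-differentiation route, the sketch below computes the consistent tangent of a 1D elastoplastic return mapping with linear isotropic hardening. Rather than differentiating through the Newton iterations (the unrolled approach), it applies the implicit function theorem to the converged residual, which for this model recovers the classical closed-form tangent EH/(E+H). All parameter values and function names are hypothetical, chosen only for illustration.

```python
# Hypothetical material parameters: Young's modulus, hardening modulus, yield stress
E, H, sig_y = 200e3, 10e3, 250.0
p_n = 0.0  # accumulated plastic strain at the previous time step

def residual(dp, eps):
    # return-mapping residual: corrected stress minus current yield stress
    # (tensile loading assumed, so no sign bookkeeping)
    sig_trial = E * (eps - p_n)
    return sig_trial - E * dp - (sig_y + H * (p_n + dp))

def solve_dp(eps, tol=1e-10):
    # Newton iterations on the scalar residual (converges in one step here,
    # since the residual is linear in dp for linear hardening)
    dp = 0.0
    for _ in range(50):
        r = residual(dp, eps)
        dr_ddp = -(E + H)          # derivative of the residual w.r.t. dp
        dp_new = dp - r / dr_ddp
        if abs(dp_new - dp) < tol:
            return dp_new
        dp = dp_new
    return dp

def stress_and_tangent(eps):
    dp = solve_dp(eps)
    sig = E * (eps - p_n) - E * dp
    # Implicit function theorem on residual(dp, eps) = 0:
    #   d(dp)/d(eps) = -(dr/d(dp))^{-1} * dr/d(eps) = E / (E + H)
    dr_ddp = -(E + H)
    dr_deps = E
    ddp_deps = -dr_deps / dr_ddp
    Ct = E * (1.0 - ddp_deps)      # consistent tangent d(sig)/d(eps) = E*H/(E+H)
    return sig, Ct

eps = 2e-3  # strain beyond the elastic limit (sig_y / E = 1.25e-3)
sig, Ct = stress_and_tangent(eps)
```

The key point is that the tangent is obtained from derivatives of the residual at the converged state only, independently of how many solver iterations were performed; in an AD framework the residual derivatives themselves would also be generated automatically rather than written by hand.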

Finally, we will discuss a novel architecture for data-driven constitutive models that employs implicit layers. This approach departs from conventional large-scale feed-forward neural networks, aiming to develop machine-learning-based constitutive models that adhere to thermodynamic principles, integrate domain knowledge, and demonstrate superior performance on unseen data. These models not only align with established theoretical frameworks but are also easier to train and generalize effectively.
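To convey the mechanics of an implicit layer in the simplest possible setting, the toy sketch below defines a scalar layer whose output is the fixed point of z = tanh(a·z + x), and differentiates it via the implicit function theorem rather than by backpropagating through the solver iterations. This is a generic illustration of the implicit-layer principle under assumed forms, not the architecture presented in the talk.

```python
import numpy as np

a = 0.5  # hypothetical layer parameter; |a| < 1 makes the map a contraction

def layer(x, tol=1e-12):
    """Implicit layer: return z* solving z = tanh(a*z + x) by fixed-point iteration."""
    z = 0.0
    for _ in range(200):
        z_new = np.tanh(a * z + x)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z_new

def layer_grad(x):
    # Implicit differentiation of r(z, x) = z - tanh(a*z + x) = 0 at the solution:
    #   dz/dx = -(dr/dz)^{-1} * dr/dx = s / (1 - a*s),  with s = 1 - tanh(.)^2
    z = layer(x)
    s = 1.0 - np.tanh(a * z + x) ** 2
    return s / (1.0 - a * s)

x = 0.3
g_implicit = layer_grad(x)
g_fd = (layer(x + 1e-6) - layer(x - 1e-6)) / 2e-6  # finite-difference check
```

Because the gradient is evaluated at the solution only, memory cost does not grow with solver depth, and the layer's output can be constrained to satisfy a residual encoding physical structure, which is what permits building thermodynamic consistency directly into the architecture.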

