In shape optimization, a widely used method pioneered by Hadamard involves applying the classical gradient descent approach to identify a (possibly local) optimum. At each iteration, the shape is updated via a small perturbation of the identity. A key challenge lies in computing the shape gradient, particularly when the cost function depends on the solution of a partial differential equation (PDE). In practice, this gradient can only be computed approximately, following one of two main approaches: "discretize-then-optimize" and "optimize-then-discretize".
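For instance, writing $J$ for the cost function and $\Omega_k$ for the current shape (notation introduced here only for illustration), one common way to formalize the update and the underlying shape derivative is
\[
\Omega_{k+1} = (\mathrm{Id} + \tau_k V_k)(\Omega_k),
\qquad
dJ(\Omega)[V] = \lim_{t \to 0^+} \frac{J\big((\mathrm{Id} + t V)(\Omega)\big) - J(\Omega)}{t},
\]
where $V_k$ is a descent direction, i.e. $dJ(\Omega_k)[V_k] < 0$, and $\tau_k > 0$ is a small step size.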
The "discretize-then-optimize" approach applies the gradient method to the discretized cost function but may result in degenerate discretizations, artificially lowering the cost function or producing solutions dependent on the discretization refinement. Conversely, the "optimize-then-discretize" approach discretizes the gradient method itself, but it may lead to non-converging algorithms, particularly if stopping criteria are set too strictly. This talk focuses on the "optimize-then-discretize" framework, with an emphasis on cost functions dependent on solutions to the Laplace equation.
The "optimize-then-discretize" approach is not unique due to multiple equivalent expressions for the continuous shape gradient. According to the Hadamard structure theorem, the shape gradient can be expressed as a boundary integral depending only on the normal velocity of the convection field. This boundary-based expression is commonly used in academia. Alternatively, the distributed shape gradient—highlighted in Hiptmair et al. (2015)—offers improved precision in some cases, albeit at the expense of violating the Hadamard structure theorem, raising challenges such as the choice of convection field extension. Recent work by Gong et al. (2021) demonstrates that boundary-based gradients can achieve similar precision by reconstructing the flux on the boundary.
In this talk, we propose a novel discretized shape gradient formulation based on the mortar method. Like the boundary-based and flux-reconstructed approaches, it satisfies the Hadamard structure theorem. We also compare the precision of the different approaches for cost functions that may depend on the gradient of the primal solution. Several numerical examples and optimization problems will be presented.
This work is a collaboration with Timothée Devictor and Marc Josien.
References:
- Hiptmair, R., Paganini, A. & Sargheini, S. (2015). Comparison of approximate shape gradients. BIT Numerical Mathematics, 55, 459-485.
- Gong, W. & Zhu, S. (2021). On discrete shape gradients of boundary type for PDE-constrained shape optimization. SIAM Journal on Numerical Analysis, 59, 1510-1541.