When E is convex with respect to both of its arguments, discretization yields a convex unconstrained minimization problem with a partially separable objective function. It can be solved by Newton's method, or by variants of BFGS or conjugate gradients combined with line searches or trust regions. The linear algebra for the evaluations and updates in partitioned BFGS methods should exploit the partial separability and the corresponding sparsity. Of particular interest are methods that work even if E is only once Lipschitz continuously differentiable, for example because it arises as the convex hull of a nonconvex energy density.
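As a concrete illustration (a minimal sketch; the particular discretized energy and all function names are my own choices, not from the text), the following minimizes a convex, partially separable objective by steepest descent with an Armijo backtracking line search. Each term of the objective couples at most two neighbouring unknowns, so the gradient can be assembled term by term, which is the structure a partitioned BFGS method would also exploit.

```python
# Sketch: steepest descent with Armijo backtracking on the discretized
# convex energy  f(u) = 1/2 sum_i (u[i+1]-u[i])^2 + 1/2 sum_i (u[i]-g[i])^2.
# The energy, names, and parameters are illustrative, not from the text.

def objective_and_gradient(u, g):
    n = len(u)
    f = 0.0
    grad = [0.0] * n
    for i in range(n - 1):          # coupling terms: involve only u[i], u[i+1]
        d = u[i + 1] - u[i]
        f += 0.5 * d * d
        grad[i] -= d
        grad[i + 1] += d
    for i in range(n):              # data terms: involve only u[i]
        r = u[i] - g[i]
        f += 0.5 * r * r
        grad[i] += r
    return f, grad

def minimize(g, tol=1e-8, max_iter=10000):
    u = [0.0] * len(g)
    for _ in range(max_iter):
        f, grad = objective_and_gradient(u, g)
        gnorm2 = sum(c * c for c in grad)
        if gnorm2 < tol * tol:
            break
        t = 1.0                     # Armijo backtracking line search
        while True:
            trial = [ui - t * ci for ui, ci in zip(u, grad)]
            ft, _ = objective_and_gradient(trial, g)
            if ft <= f - 1e-4 * t * gnorm2:
                break
            t *= 0.5
        u = trial
    return u
```

Replacing the steepest-descent direction by a quasi-Newton direction built from per-term (partitioned) curvature updates gives the BFGS variants mentioned above.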
When E is convex with respect to the Jacobian Du but not with respect to the deformation u itself, the integral functional can still be convex, although certain natural discretizations may not be. This convexity gap is mostly of theoretical interest.
When E is not even convex with respect to Du, there may be no solution in function space, since the integral functional need not be weakly lower semicontinuous. Then E should be replaced by its so-called quasiconvex envelope, which we will for simplicity identify with the convex envelope. For a once continuously differentiable function E, the convex envelope can be evaluated by combining several global minimization runs over the given fixed domain with local minimizations of a mixed multiphase energy. The latter problem has a set of bilinear constraints and simple inequalities, which allow the application of reduced or projected gradient methods.
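In one dimension the convex envelope of a sampled energy density is just the lower convex hull of its graph, which gives a cheap way to see the relaxation at work (a hedged sketch; the double-well density used in the usage note and all names are my own, not from the text):

```python
# Sketch: pointwise convex envelope of a 1-D energy density from samples
# {(p_i, E(p_i))}, via the lower convex hull (monotone-chain construction).
# Names and the sampling strategy are illustrative, not from the text.

def lower_convex_hull(points):
    """Lower convex hull of points, processed in order of abscissa."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    hull = []
    for p in sorted(points):
        # Pop while the last turn is clockwise or collinear.
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
            hull.pop()
        hull.append(p)
    return hull

def envelope(points, p):
    """Evaluate the piecewise-linear convex envelope at abscissa p."""
    hull = lower_convex_hull(points)
    for (x1, y1), (x2, y2) in zip(hull, hull[1:]):
        if x1 <= p <= x2:
            t = (p - x1) / (x2 - x1)
            return (1 - t) * y1 + t * y2
    raise ValueError("p outside sampled range")
```

For the double-well density W(p) = (p^2 - 1)^2 sampled on [-2, 2], the envelope vanishes on [-1, 1] (the nonconvex well is bridged by a linear segment) and agrees with W on the convex tails, mirroring the multiphase mixing described above.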
In the course we will consider these simple NLP solvers as well as global optimization schemes based on the branch-and-bound principle. For absolute certainty regarding global optimality one has to employ interval methods, which we will consider at the end.
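The two ingredients combine naturally: interval arithmetic supplies rigorous lower bounds on each subdomain, and branch and bound discards subdomains whose bound cannot beat the incumbent. A minimal one-dimensional sketch (the test function, tolerance, and all names are my own choices, not from the text):

```python
# Sketch: branch and bound with interval lower bounds, certifying the global
# minimum of f(x) = x^4 - 4x^2 + x on an interval to tolerance tol.
# All names and the example function are illustrative, not from the text.
import heapq

def f_point(x):
    return x**4 - 4*x**2 + x

def f_interval(lo, hi):
    """Enclosure [lower, upper] of f on [lo, hi], bounded monomial-wise."""
    def pow_bounds(lo, hi, k):      # exact range of x**k on [lo, hi]
        a, b = lo**k, hi**k
        if k % 2 == 0 and lo < 0.0 < hi:
            return 0.0, max(a, b)
        return min(a, b), max(a, b)
    l4, h4 = pow_bounds(lo, hi, 4)
    l2, h2 = pow_bounds(lo, hi, 2)
    return l4 - 4*h2 + lo, h4 - 4*l2 + hi

def branch_and_bound(lo, hi, tol=1e-6):
    best = f_point((lo + hi) / 2)   # incumbent upper bound
    heap = [(f_interval(lo, hi)[0], lo, hi)]
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best - tol:
            continue                # box provably cannot improve by > tol
        m = (a + b) / 2
        best = min(best, f_point(m))
        for c, d in ((a, m), (m, b)):
            lb = f_interval(c, d)[0]
            if lb < best - tol:
                heapq.heappush(heap, (lb, c, d))
    return best                     # true minimum lies in [best - tol, best]
```

Every discarded box carries a certified lower bound above `best - tol`, so on termination the returned value is within `tol` of the global minimum; this is the kind of guarantee that heuristic multistart methods cannot provide.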
References: Bonnans et al., Troeltzsch, Deuflhard, Nesterov & Nemirovski, Neumaier, Balakrishnan, Griewank, Griewank & Rabier, Grone et al., Dacorogna, Nocedal & Wright
Part I : Local Optimization
Part II: Global Optimization