Advanced Optimization Algorithms


Appointment

Lectures:
Discussion:
Begin: April 20
Language: English


Caveat

The course is now partly aimed at students in the research training group GK 1128 on "Analysis, Numerics, and Optimization of Multiphase Systems". Accordingly, the content has been somewhat modified from the original announcement.


Summary

We will focus on optimization tasks arising from the general calculus of variations problem: Minimize the integral of an energy density E(u,Du) over a displacement function u on a fixed domain.
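
In symbols, and with boundary conditions on u left unspecified here, the model problem reads

    minimize  J(u) = ∫_Ω E(u(x), Du(x)) dx

over admissible displacement functions u, where Ω denotes the fixed domain and Du the Jacobian of u.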

When E is convex with respect to both of its arguments, discretization yields a convex unconstrained minimization problem with a partially separable objective function. It can be solved by Newton's method and by variants of BFGS or conjugate gradients, in combination with line searches or trust regions. The linear algebra for the step calculation and the updating by partitioned BFGS methods should make use of the partial separability and the corresponding sparsity. Of particular interest are methods that work even if E is only once Lipschitz continuously differentiable, for example because it arises as the convex hull of a nonconvex energy density.
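
As a minimal sketch, one might discretize a one-dimensional instance and hand it to a general-purpose quasi-Newton solver. The convex density E(u, p) = p²/2 + u²/2, the boundary values, and the use of scipy's L-BFGS-B in place of a partitioned BFGS update are all assumptions for illustration:

    # Sketch: discretize J(u) = integral of E(u, u') over [0, 1] with
    # piecewise linear elements and minimize the resulting partially
    # separable sum of element energies.  E(u, p) = p^2/2 + u^2/2 and
    # the boundary values u(0) = 0, u(1) = 1 are made-up examples.
    import numpy as np
    from scipy.optimize import minimize

    n = 100                                  # number of elements
    h = 1.0 / n                              # mesh width
    x = np.linspace(0.0, 1.0, n + 1)

    def objective(u_inner):
        # Reassemble u including the fixed boundary values.
        u = np.concatenate(([0.0], u_inner, [1.0]))
        du = np.diff(u) / h                  # u' is constant on each element
        um = 0.5 * (u[:-1] + u[1:])          # midpoint value of u per element
        # Partially separable objective: one convex term per element.
        return h * np.sum(0.5 * du**2 + 0.5 * um**2)

    u0 = x[1:-1]                             # linear interpolant as initial guess
    res = minimize(objective, u0, method="L-BFGS-B")
    print("J(u*) =", res.fun)

Since no gradient is supplied, scipy falls back on finite differences; a partitioned implementation would instead assemble element gradients and curvature updates directly, exploiting the sparsity mentioned above.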

When E is convex with respect to the Jacobian Du but not with respect to the deformation u itself, the integral functional can still be convex, but certain natural discretizations may not be. This convexity gap is mostly of theoretical interest.

When E is not even convex with respect to Du, there may not be any solution in function space, as the integral functional need not be weakly lower semi-continuous. Then E should be replaced by its so-called quasi-convex envelope, which we will for simplicity identify with the convex envelope. The evaluation of the convex envelope of a once continuously differentiable function E can be performed by combining several global minimization runs over the given fixed domain with local minimizations of a mixed multiphase energy. The latter problem has a set of bilinear constraints and simple inequalities, which allow the application of reduced or projected gradient methods.
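
As a minimal one-dimensional illustration (not the multiphase algorithm of the course), the convex envelope of a sampled density can be read off as the lower convex hull of its graph; the double-well density W(p) = (p² − 1)² is an assumed example:

    # Sketch: convex envelope of a sampled 1-D density as the lower
    # convex hull of its graph (Andrew's monotone chain, lower half).
    import numpy as np

    p = np.linspace(-2.0, 2.0, 401)
    w = (p**2 - 1.0)**2                      # nonconvex double-well density

    def lower_convex_hull(xs, ys):
        hull = []
        for xi, yi in zip(xs, ys):
            # Drop the last hull point while it lies on or above the
            # chord from hull[-2] to the new point.
            while len(hull) >= 2:
                (x1, y1), (x2, y2) = hull[-2], hull[-1]
                if (x2 - x1) * (yi - y1) - (y2 - y1) * (xi - x1) <= 0.0:
                    hull.pop()
                else:
                    break
            hull.append((xi, yi))
        return hull

    hx, hy = zip(*lower_convex_hull(p, w))
    envelope = np.interp(p, hx, hy)          # piecewise linear envelope values
    print("envelope at p = 0:", envelope[200])   # ~0, whereas W(0) = 1

On [-1, 1] the envelope is the zero function, replacing the nonconvex well by a mixture of the two phases p = -1 and p = 1.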

In the course we will consider these simple NLP solvers as well as global optimization schemes based on the branch-and-bound principle. For absolute certainty regarding global optimality one has to employ interval methods, which we will consider at the end.
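
As a toy sketch of how branch and bound and interval bounds fit together (this is not verified computing: no directed rounding is used, and the objective f(x) = x⁴ − 3x² + x is an assumed example):

    # Sketch: 1-D branch and bound; a box whose interval lower bound
    # exceeds the incumbent value cannot contain a global minimizer
    # and is discarded.

    def even_pow_range(lo, hi, k):
        # Range of x**k over [lo, hi] for even k.
        return (0.0 if lo <= 0.0 <= hi else min(lo**k, hi**k)), max(lo**k, hi**k)

    def f(x):
        return x**4 - 3.0 * x**2 + x

    def f_range(lo, hi):
        # Natural interval extension of f on [lo, hi].
        l4, u4 = even_pow_range(lo, hi, 4)
        l2, u2 = even_pow_range(lo, hi, 2)
        return l4 - 3.0 * u2 + lo, u4 - 3.0 * l2 + hi

    def branch_and_bound(lo, hi, tol=1e-6):
        best = min(f(lo), f(hi))             # incumbent upper bound
        work = [(lo, hi)]
        while work:
            a, b = work.pop()
            if f_range(a, b)[0] > best:      # box exclusion
                continue
            m = 0.5 * (a + b)
            best = min(best, f(m))           # improve the incumbent
            if b - a > tol:
                work += [(a, m), (m, b)]
        return best

    print(branch_and_bound(-2.0, 2.0))       # ~ -3.51, attained near x = -1.3

A verified variant would evaluate the interval extension with outward rounding, so that the computed lower bounds are guaranteed despite floating-point error.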

References: Bonnans et al., Troeltzsch, Deuflhard, Nesterov & Nemirovski, Neumaier, Balakrishnan, Griewank, Griewank & Rabier, Grone et al., Dacorogna, Nocedal & Wright


Table of contents (Preliminary)


    Part I: Local Optimization

  1. Unconstrained Optimization (in Hilbert Space)
    1. Existence and Uniqueness of Minimizers
    2. Line Searches and Trust Regions
    3. Global Convergence of Descent Methods
    4. Conjugate Gradients and BFGS
    5. Local Convergence Rates
  2. Partial Separability Structure
    1. Convex Decompositions
    2. Partitioned Updating
    3. Chordal Graph Lemma
    4. Frontal Decomposition
  3. Solving Nonlinear Equations (in Banach Space)
    1. Newton-Kantorovich Theorem
    2. Mesh-Independence Result
    3. Broyden Methods
    4. Adjoint Quasi-Newton
  4. Simply Constrained Optimization
    1. Equality and Sign Constraints
    2. Karush-Kuhn-Tucker Conditions
    3. Reduced Gradient Methods

    Part II: Global Optimization

  5. Convex Hull Problem
    1. Dual Characterization
    2. Regularity Properties
  6. Unconstrained Global Optimization
    1. Branch and Bound
    2. Outer Relaxation
    3. Lower Bound Construction
  7. Verified Computing
    1. Interval Propagation
    2. Interval Newton
    3. Box Exclusion

Jan Riehme
Last modified: Thu Apr 21 12:44:18 CEST 2005