
Introductory Talks by Junior Professors Daniel Walter and Sven Wang

  • When: 09.01.2024, starting at 15:00
  • Where: RUD 25, Humboldt-Kabinett

Next Tuesday (09.01.2024), following the meeting of the Institute Council (expected to begin at 15:00), Sven Wang and Daniel Walter will give short introductory talks (20-25 minutes each) on their fields of research in the Humboldt-Kabinett.


On the relation of sparsity and convex geometry in nonsmooth optimization (D. Walter)

Abstract: The incorporation of convex but nonsmooth regularization functionals into infinite-dimensional minimization problems has become a cornerstone of modern approaches to optimal control, variational regularization of inverse problems, and machine learning. This is attributed to the observation that the correct choice of the minimization space as well as of the regularization functional promotes desired structural features in the obtained reconstructions; examples include the staircasing effect in image denoising problems or the super-resolution effects obtained by Radon-norm regularization. However, these desirable effects come at a cost: new difficulties arise both in the theoretical analysis of the resulting minimization problems and in their practical realization. In this short talk, we present a novel approach towards facilitating the numerical solution of such problems by exploiting the geometry of the generalized unit ball associated with the regularization functional.
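As a rough illustration of how the geometry of a regularizer's unit ball can be exploited (a generic textbook device, not the speaker's actual method): in a conditional gradient (Frank-Wolfe) step, one only minimizes a linear functional over the ball, and for the ℓ1 ball this minimum is always attained at a signed coordinate vertex, so the iterates are convex combinations of few extreme points and hence automatically sparse. The sketch below, with hypothetical data A, b and radius tau, shows this for an ℓ1-ball-constrained least-squares problem.

```python
import numpy as np

def frank_wolfe_l1(A, b, tau, n_iter=200):
    """Minimize 0.5*||A x - b||^2 over the l1 ball ||x||_1 <= tau.

    Each step solves a linear minimization over the ball; for the l1
    ball the minimizer is an extreme point +-tau * e_i, so the iterate
    is a convex combination of few vertices and hence sparse.
    """
    x = np.zeros(A.shape[1])
    for k in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        i = np.argmax(np.abs(grad))       # vertex minimizing <grad, s>
        s = np.zeros_like(x)
        s[i] = -tau * np.sign(grad[i])
        x += 2.0 / (k + 2.0) * (s - x)    # standard step size 2/(k+2)
    return x

# Toy usage with synthetic data (illustrative only):
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200); x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = frank_wolfe_l1(A, b, tau=3.5)
print("nonzeros in estimate:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

Since each iteration adds at most one vertex, the iterate after k steps has at most k nonzero entries; analogous extreme-point structure for more general unit balls (e.g. of Radon norms) is what such geometric approaches build on.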


The rich interplay of statistics and differential equations (S. Wang)

Abstract: Many complex systems in physics, engineering and other fields are described by partial differential equations (PDEs) or stochastic differential equations (SDEs). When noisy and incomplete data are collected from such systems, the reconstruction of unknown parameters or hidden states of the system poses significant statistical and algorithmic challenges. Using examples from inverse problems and from parameter estimation in SDEs, we will present some common methods for solving such problems, and discuss the theoretical guarantees they enjoy. Our focus will be on Bayesian methods.
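To make the Bayesian viewpoint concrete (a standard toy example, not material from the talk): for a linear forward map G, a Gaussian prior on the unknown x, and Gaussian observation noise, the posterior is again Gaussian with an explicit mean and covariance, and the posterior mean serves as the reconstruction. A minimal sketch, with all dimensions and the map G chosen purely for illustration:

```python
import numpy as np

# Toy Bayesian linear inverse problem: y = G x + noise.
# Prior: x ~ N(0, C);  noise ~ N(0, sigma^2 I).  All choices illustrative.
rng = np.random.default_rng(1)
n, m, sigma = 100, 30, 0.05
G = rng.standard_normal((m, n)) / np.sqrt(n)   # hypothetical forward map
C = np.eye(n)                                  # prior covariance
x_true = rng.standard_normal(n)
y = G @ x_true + sigma * rng.standard_normal(m)

# Gaussian conjugacy gives the posterior in closed form:
# Sigma_post = (G^T G / sigma^2 + C^{-1})^{-1},  mean = Sigma_post G^T y / sigma^2
Sigma_post = np.linalg.inv(G.T @ G / sigma**2 + np.linalg.inv(C))
x_mean = Sigma_post @ (G.T @ y) / sigma**2
print("relative error of posterior mean:",
      np.linalg.norm(x_mean - x_true) / np.linalg.norm(x_true))
```

For nonlinear forward maps or PDE-constrained problems no such closed form exists, which is where the sampling methods and theoretical guarantees mentioned in the abstract come in.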

On the other hand, (stochastic) differential equations are also used to formulate powerful frameworks for generative modeling in machine learning. If time permits, we will also discuss a statistical framework for such models.
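For orientation (a generic illustration of the SDE machinery behind diffusion-type generative models, not taken from the talk): the forward process gradually turns data into noise, for instance via the Ornstein-Uhlenbeck SDE dX_t = -X_t dt + sqrt(2) dW_t, which can be simulated with the Euler-Maruyama scheme; the generative model then learns to reverse this process. A minimal sketch of the forward simulation:

```python
import numpy as np

def euler_maruyama_ou(x0, T=1.0, n_steps=1000, seed=2):
    """Simulate dX_t = -X_t dt + sqrt(2) dW_t (Ornstein-Uhlenbeck).

    This forward 'noising' SDE drives any initial sample towards the
    standard normal; diffusion-type generative models learn its reversal.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.standard_normal(x.shape) * np.sqrt(dt)  # Brownian increment
        x = x - x * dt + np.sqrt(2.0) * dw               # Euler-Maruyama step
    return x

# Many samples started at x0 = 5 end up approximately N(0, 1) for large T:
samples = euler_maruyama_ou(np.full(10000, 5.0), T=5.0)
print("mean ~ 0, var ~ 1:", samples.mean().round(2), samples.var().round(2))
```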

All interested parties are cordially invited!