Humboldt-Universität zu Berlin - Mathematisch-Naturwissenschaftliche Fakultät - Institut für Mathematik

Research Seminar Mathematical Statistics

Stochastics Section


M. REIß, V. SPOKOINY, W. HÄRDLE

Venue:
Weierstrass-Institut für Angewandte Analysis und Stochastik,
Erhard-Schmidt-Raum, Mohrenstrasse 39, 10117 Berlin
Time:
Wednesdays, 10:00 - 12:30

23 October 2013
Dominique Bontemps (Université Paul Sabatier)
Bayesian posterior consistency and contraction rates in the Shape Invariant Model

Abstract:
In this work, we consider the so-called Shape Invariant Model, which corresponds to the estimation of a function f0 subject to a random translation with law g0 in a white noise model. We are interested in this model when the law of the deformations is unknown. We aim to recover the law of the process P(f0,g0) as well as f0 and g0. From this perspective, we adopt a Bayesian point of view and find priors on f and g such that the posterior distribution concentrates around P(f0,g0) at a polynomial rate as n goes to infinity. We obtain a logarithmic posterior contraction rate for the shape f0 and the distribution g0. We also derive logarithmic lower bounds for the estimation of f0 and g0 in a frequentist paradigm.
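
A minimal sketch of one standard formulation of such a randomly shifted white noise model (the precise parametrisation used in the talk may differ): for curves $j = 1, \dots, n$ one observes

$dY_j(t) = f_0(t - \tau_j)\,dt + \sigma\, dW_j(t), \quad t \in [0,1], \quad \tau_j \sim g_0 \ \text{i.i.d.},$

where the $W_j$ are independent Brownian motions and $\sigma$ is a noise level; the aim is to recover $f_0$, $g_0$ and the law $P_{(f_0,g_0)}$ of the observed process from $Y_1, \dots, Y_n$.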

30 October 2013
Denis Belomestny (Universität Essen)
On an inverse problem of financial mathematics with error in the operator

Abstract:
In this talk we consider a calibration problem for the so-called Markov Functional Models (MFM) from a statistical point of view. It is shown that at each step of the calibration procedure one has to solve a nonlinear inverse problem with an operator that was estimated in the previous step. We propose a regularisation method and derive the optimal convergence rates.
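
As a generic illustration of the "error in the operator" issue only (not the regularisation method proposed in the talk), here is a linear toy problem in which the forward operator is available solely through an estimate A_hat from a previous step and Tikhonov regularisation is applied with that estimate; all names and parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Toy forward problem: discretised integration operator and a smooth signal.
n = 50
A = np.tril(np.ones((n, n))) / n
f_true = np.sin(np.linspace(0, np.pi, n))
g = A @ f_true + 1e-3 * rng.standard_normal(n)        # noisy data

# Only an estimate of the operator is available (error in the operator).
A_hat = A + 1e-3 * rng.standard_normal((n, n))

# Tikhonov regularisation with the estimated operator; alpha would in practice
# be chosen in a data-driven way, balancing both noise sources.
alpha = 1e-2
f_hat = np.linalg.solve(A_hat.T @ A_hat + alpha * np.eye(n), A_hat.T @ g)
print(np.max(np.abs(f_hat - f_true)))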

6 November 2013
Stéphane Gaïffas (École Polytechnique)
Link prediction in graphs with time-evolving features

Abstract:
We consider the problem of link prediction in time-evolving graphs. We assume that certain graph features, such as the node degree, follow a vector autoregressive (VAR) model, and we propose to use this information to improve the accuracy of prediction. Our strategy involves a joint optimization procedure over the space of adjacency matrices and VAR matrices: the penalty on the adjacency matrix takes both sparsity and low-rank structure into account, while the penalty on the VAR matrices encodes sparsity. The analysis involves oracle inequalities that illustrate the trade-offs in the choice of smoothing parameters when modeling the joint effect of sparsity and low rank. The estimate is computed efficiently using proximal methods through a generalized forward-backward algorithm, and evaluated through numerical experiments. The first part of the talk will be a general presentation of the problem and of our results; the second part will describe some technical tools used in this work.
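
A hedged sketch of the two proximal operators underlying such sparse-plus-low-rank penalties, namely entrywise soft-thresholding for the $\ell_1$ norm and singular value thresholding for the nuclear norm; this is only a building block, not the generalized forward-backward algorithm of the talk, and the function names are illustrative.

import numpy as np

def prox_l1(X, t):
    # Proximal operator of t * ||X||_1: entrywise soft-thresholding.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def prox_nuclear(X, t):
    # Proximal operator of t * ||X||_* : soft-thresholding of singular values.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

# Toy usage: shrink a noisy matrix towards sparse and towards low-rank structure.
rng = np.random.default_rng(1)
A_noisy = rng.standard_normal((20, 20))
A_sparse = prox_l1(A_noisy, 0.5)
A_lowrank = prox_nuclear(A_noisy, 2.0)
print(np.mean(A_sparse != 0), np.linalg.matrix_rank(A_lowrank))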

13 November 2013
Mathieu Sart (Université Sophia Antipolis, Nice)
Estimation of the transition density of a Markov chain

20 November 2013
Victor-Emmanuel Brunel (ENSAE/ParisTech, France)
Adaptive estimation of convex polytopes

Abstract:
We consider a sample of $n$ i.i.d. random variables, uniformly distributed in some unknown polytope $P$ in $\mathbb{R}^d$. We propose a maximum likelihood estimator whose accuracy, defined as the expectation of the Lebesgue measure of its symmetric difference with the true polytope, is at most of the order $r(\ln n)/n$, when $r$, the number of vertices of $P$, is known. Using a concentration inequality for this estimator, we develop a procedure in order to estimate $P$ adaptively with respect to $r$, based on a model selection method. The adaptive estimator achieves the same rate as for the case of known $r$.
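
A small numerical illustration of the symmetric-difference loss, under the assumption that $P = [0,1]^2$ (so $r = 4$) and using the convex hull of the sample rather than the vertex-constrained maximum likelihood estimator studied in the talk; the $r(\ln n)/n$ scaling is nevertheless visible.

import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
for n in (100, 1000, 10000):
    pts = rng.uniform(size=(n, 2))              # uniform sample in the unit square
    hull = ConvexHull(pts)
    # The hull is contained in P, so the symmetric difference with P has
    # Lebesgue measure area(P) - area(hull); ConvexHull.volume is the area in 2D.
    sym_diff = 1.0 - hull.volume
    print(n, sym_diff, sym_diff * n / np.log(n))   # last column roughly constant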

27 November 2013
Marc Hoffmann (Université Paris-Dauphine)
Statistical estimation of a growth-fragmentation model observed on a genealogical tree

Abstract:
We model the growth of a cell population by a piecewise deterministic Markov branching tree. Each cell splits into two offspring at a division rate $B(x)$ that depends on its size $x$. The size of each cell grows exponentially in time, at a rate that varies from one individual to another. We show that the mean empirical measure of the model satisfies a growth-fragmentation-type equation when both size and growth rate are taken as state variables. We construct a nonparametric estimator of the division rate $B(x)$ based on the observation of the population over different sampling schemes of size $n$ on the genealogical tree. Our estimator nearly achieves the rate $n^{-s/(2s+1)}$ in squared-loss error asymptotically. When the growth rate is assumed to be identical for every cell, we recover the classical growth-fragmentation model, and our estimator improves on the rate $n^{-s/(2s+3)}$ obtained in a related framework through indirect observation schemes. Our method is tested numerically and implemented on Escherichia coli data.
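
A minimal simulation sketch of such a piecewise deterministic branching tree, assuming an affine division rate $B(x) = b\,x$, equal splitting of the size at division, and purely illustrative parameters (the model in the talk may differ); the split time of each cell is obtained by inverting its integrated division rate against an exponential clock.

import numpy as np

rng = np.random.default_rng(0)

def split_time(x0, growth, b):
    # Size grows as x0 * exp(growth * t); with B(x) = b * x the integrated rate
    # is b * x0 * (exp(growth * t) - 1) / growth, inverted at an Exp(1) variable.
    e = rng.exponential()
    return np.log(1.0 + growth * e / (b * x0)) / growth

def simulate(generations=5, b=1.0):
    # Each cell: (size at birth, individual growth rate); daughters get half the
    # size at division and draw their own growth rate.
    cells = [(1.0, rng.uniform(0.5, 1.5))]
    sizes_at_division = []
    for _ in range(generations):
        next_gen = []
        for x0, v in cells:
            t = split_time(x0, v, b)
            x_div = x0 * np.exp(v * t)
            sizes_at_division.append(x_div)
            next_gen += [(x_div / 2, rng.uniform(0.5, 1.5)),
                         (x_div / 2, rng.uniform(0.5, 1.5))]
        cells = next_gen
    return np.array(sizes_at_division)

print(simulate().mean())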

4 December 2013
Jia Li (Duke University, Durham)
tba

11 December 2013
Matt Wand (University of Technology Sydney)
Variational approximations in statistics I

Abstract:
Variational approximations facilitate approximate inference for the parameters in complex statistical models and provide fast, deterministic alternatives to Monte Carlo methods. However, much of the contemporary literature on variational approximations is in Computer Science rather than Statistics, and uses terminology, notation, and examples from the former field. In this series of lectures we explain variational approximation in statistical terms. In particular, we illustrate the ideas of variational approximation using examples that are familiar to statisticians.
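
As a textbook-style illustration (not necessarily one of the lecture examples): mean-field variational Bayes for the mean and precision of normal data under a conjugate Normal-Gamma prior, using the standard coordinate-ascent updates; the data and hyperparameter values are illustrative only.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=200)      # synthetic data
N, xbar = len(x), x.mean()

# Prior: mu | tau ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0).
mu0, lam0, a0, b0 = 0.0, 1e-2, 1e-2, 1e-2

# Mean-field factorisation q(mu, tau) = q(mu) q(tau), coordinate-ascent updates.
E_tau = 1.0
for _ in range(50):
    # q(mu) = N(mu_N, 1/lam_N)
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # q(tau) = Gamma(a_N, b_N)
    a_N = a0 + (N + 1) / 2
    sq = np.sum((x - mu_N) ** 2) + N / lam_N      # E_q[sum_i (x_i - mu)^2]
    b_N = b0 + 0.5 * (sq + lam0 * ((mu_N - mu0) ** 2 + 1.0 / lam_N))
    E_tau = a_N / b_N

print("E_q[mu] =", mu_N, " E_q[tau] =", E_tau)    # compare with 2.0 and 1/1.5**2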

18 December 2013
Matt Wand (University of Technology Sydney)
Variational approximations in statistics II

Abstract:
See the abstract of Part I (11 December 2013); this lecture continues the series.

8 January 2014
N.N.

15 January 2014
N.N.

22 January 2014
Thomas Kneib (Universität Göttingen)
tba

29 January 2014
Axel Bücher (Ruhr-Universität Bochum)
tba

5 February 2014
N.N.

12 February 2014
Bo Markussen (University of Copenhagen)
tba

Anyone interested is cordially invited to attend.

Link to the Berliner Kolloquium für Wahrscheinlichkeitstheorie

For inquiries please contact:

Ms Andrea Fiebig
Humboldt-Universität zu Berlin, Institut für Mathematik,
Unter den Linden 6, 10099 Berlin, Germany
fiebig@mathematik.hu-berlin.de
Phone: +49 (30) 2093 5860, Fax: +49 (30) 2093 5848