Humboldt-Universität zu Berlin - Mathematisch-Naturwissenschaftliche Fakultät - Institut für Mathematik

Forschungsseminar Mathematische Statistik

For the Statistics Section


A. Carpentier, S. Greven, W. Härdle, M. Reiß, V. Spokoiny

 

Venue

Weierstrass-Institut für Angewandte Analysis und Stochastik
HVP 11 a, R. 313 (please note the room change!)
Mohrenstrasse 39
10117 Berlin

 

Time

Wednesdays, 10:00 - 12:00


Program

 
16 October 2024
Botond Szabo (Bocconi Milan)
Privacy constrained semiparametric inference
Abstract: For semi-parametric problems, differentially private estimators are typically constructed on a case-by-case basis. In this work we develop a privacy-constrained semi-parametric plug-in approach that can be used generally, over a collection of semi-parametric problems. We derive minimax lower and matching upper bounds for this approach and provide an adaptive procedure in the case of irregular (atomic) functionals. Joint work with Lukas Steinberger (Vienna) and Thibault Randrianarisoa (Toronto, Vector Institute).
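A rough illustration of the plug-in idea (the notation below is schematic and not taken from the talk): for a functional \theta(f) of an unknown nuisance f, a privacy-constrained plug-in estimator takes the form
\[ \widehat{\theta} \;=\; \theta\bigl(\widehat{f}_{\mathrm{priv}}\bigr), \]
where \widehat{f}_{\mathrm{priv}} is a differentially private estimator of f, so that a single private nuisance estimate can serve a whole collection of semi-parametric problems.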
 
23 October 2024
Weining Wang (University of Groningen)
Conditional Nonparametric Variable Screening Test via Neural Network Factor Regression
Abstract: We propose a conditional variable screening test for non-parametric regression. To render our test effective when facing predictors of high or even diverging dimension, we assume that the observed predictors arise from a factor model in which the factors are latent but low-dimensional. Our test statistics are based on the estimated partial derivative of the regression function with respect to the screening variable, conditioning on the extracted proxies for the factors. Hence, our test reveals how much predictors contribute to the non-parametric regression after accounting for the factors. Our derivative estimator is the convolution of a deep neural network regression estimator with a smoothing kernel. We demonstrate that, even when the neural network can scale up as the sample size grows, it is important, unlike when estimating the regression function itself, to smooth the partial derivative of the neural network estimator in order to recover the desired convergence rate for the derivative. Moreover, our screening test achieves asymptotic normality under the null after finely centering our test statistics, as well as consistency against local alternatives under mild conditions. We demonstrate the performance of our test in a simulation study and two real-world applications.
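As a schematic illustration of the smoothing step (the symbols below are ours, not the speaker's): writing \widehat{m}_{NN} for the neural network regression fit, K_h for a smoothing kernel with bandwidth h, and x_1 for the screening variable, a convolution-type derivative estimator has the form
\[ \widehat{\partial_1 m}(x) \;=\; \frac{\partial}{\partial x_1} \int K_h(x_1 - u)\, \widehat{m}_{NN}(u, x_{-1})\, \mathrm{d}u \;=\; \int \partial_1 K_h(x_1 - u)\, \widehat{m}_{NN}(u, x_{-1})\, \mathrm{d}u, \]
so that the differentiation acts on the smooth kernel rather than directly on the raw network output.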
30 October 2024
Olga Klopp (ESSEC Business School, Paris)
Adaptive density estimation under low-rank constraints
Abstract: In this talk, we address the challenge of bivariate probability density estimation under low-rank constraints for both discrete and continuous distributions. For discrete distributions, we model the target as a low-rank probability matrix. In the continuous case, we assume the density function is Lipschitz continuous over an unknown compact rectangular support and can be decomposed into a sum of K separable components, each represented as a product of two one-dimensional functions. We introduce an estimator that leverages these low-rank constraints, achieving significantly improved convergence rates. We also derive lower bounds for both the discrete and continuous cases, demonstrating that our estimators achieve minimax-optimal convergence rates up to logarithmic factors.
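In the notation of the abstract (with symbols chosen here purely for illustration), the continuous low-rank assumption states that the bivariate density can be written as
\[ f(x, y) \;=\; \sum_{k=1}^{K} u_k(x)\, v_k(y), \]
the continuous analogue of a discrete distribution whose probability matrix has low rank.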

06 November 2024
Vladimir Spokoiny (WIAS and HU Berlin)
Regression estimation and inference in high dimension
Abstract: The talk discusses a general non-asymptotic and non-minimax approach to statistical estimation and inference, with applications to nonlinear regression. The main results provide finite-sample Fisher and Wilks expansions for the maximum-likelihood estimator with an explicit remainder in terms of the effective dimension of the problem.
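As a rough guide to what such expansions assert (the notation below is schematic and not taken from the talk): with \theta^* the target parameter, D^2 a deterministic curvature matrix of Fisher type, and \xi = D^{-1} \nabla L(\theta^*) the normalized score, a finite-sample Fisher expansion controls
\[ \bigl\| D (\widehat{\theta} - \theta^*) - \xi \bigr\|, \]
while a Wilks expansion controls
\[ \bigl| 2 \{ L(\widehat{\theta}) - L(\theta^*) \} - \| \xi \|^2 \bigr|, \]
with remainder terms that are explicit in the effective dimension of the problem rather than relying on asymptotics in the sample size.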

 

13 November 2024 (takes place in Adlershof)
Colloquium on the occasion of Vladimir Spokoiny's 65th birthday (WIAS/HU)
For further information: https://wias-berlin.de/workshops/Spokoiny2024/

20 November 2024
Bernhard Stankewitz (Universität Potsdam)

Abstract:

27 November 2024
Julien Chhor (Toulouse School of Economics)
Abstract:

 

04 December 2024
N. N. ()

Abstract:

 

11 December 2024
N. N. ()

Abstract:

18 December 2024
N. N. ()
Abstract:

08 January 2025
N. N. ()
Abstract:

15 January 2025
Xiaorui Zuo (NUS Singapore)
Cryptos have Rough Volatility and Correlated Jumps
Abstract:
22 January 2025
Vincent Rivoirard (Université Dauphine, Paris)

 

Abstract:

 

29 January 2025

Davy Paindaveine ()

 

Abstract:

 

05 February 2025

Sophie Langer (University of Twente)

 

Abstract:

 

12 February 2025

Judith Rousseau (University of Oxford / Paris Dauphine-PSL University)

 

Abstract:

 


Anyone interested is cordially invited to attend.

For questions, please contact:

Ms Marina Filatova

Email: marina.filatova@hu-berlin.de
Phone: +49 30 2093 45460
Humboldt-Universität zu Berlin
Institut für Mathematik
Unter den Linden 6
10099 Berlin, Germany