[Statlist] Research Seminar in Statistics | *FRIDAY 22 SEPTEMBER 2023* | GSEM, University of Geneva

gsem-support-instituts at unige.ch
Tue Sep 19 14:57:26 CEST 2023


Dear all,

We are pleased to invite you to our next Research Seminar, organized by Professor Sebastian Engelke on behalf of the Research Center for Statistics
< https://www.unige.ch/gsem/en/research/institutes/rcs/team/ >.

FRIDAY 22 SEPTEMBER 2023, at 11:15 am, Uni Mail M 5220 & Online
Zoom research webinar: https://unige.zoom.us/j/92924332087?pwd=U1U1NFk4dTFCRHBMeWYrSDBQcXBiQT09
Meeting ID: 929 2433 2087
Passcode: 399192

Marginally Calibrated Deep Distributional Regression
Nadja KLEIN, TU Dortmund University, Germany
< https://www.tu-dortmund.de/en/university/newly-appointed-professors/prof-nadja-klein/?tabindex=1&cHash=86de4c608b376e02c4ea40944475dfee >

ABSTRACT:
Deep neural network (DNN) regression models are widely used in applications requiring state-of-the-art predictive accuracy. However, until recently there has been little work on accurate uncertainty quantification for predictions from such models. We add to this literature by outlining an approach to constructing predictive distributions that are "marginally calibrated": the long-run average of the predictive distributions of the response variable matches the observed empirical margin. Our approach considers a DNN regression with a conditionally Gaussian prior for the final-layer weights, from which an implicit copula process on the feature space is extracted. This copula process is combined with a non-parametrically estimated marginal distribution for the response. The end result is a scalable distributional DNN regression method with marginally calibrated predictions, and our work complements existing methods for probability calibration.

The approach is first illustrated using the application of end-to-end learning in autonomous driving. Then, we show the usefulness of our approach in likelihood-free inference, where distributional deep regression is used to estimate marginal posterior distributions. In a complex ecological time series example, we employ the implicit copulas of convolutional networks, and show that marginal calibration results in improved uncertainty quantification.  Our approach also avoids the need for manual specification of summary statistics, a requirement that is burdensome for users and typical of competing likelihood-free inference methods.
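The calibration step described in the abstract can be sketched numerically: latent draws on the Gaussian copula scale are pushed through the standard normal CDF and then through the quantile function of a non-parametric (empirical) estimate of the response margin. The sketch below is illustrative only; the data, names, and the stand-in for the DNN's implicit copula draws are assumptions, not the authors' implementation.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training responses (distribution chosen for illustration).
# The sorted sample acts as the non-parametrically estimated marginal.
y_train = rng.gamma(shape=2.0, scale=1.5, size=5000)
y_sorted = np.sort(y_train)

def empirical_quantile(u):
    """Invert the empirical CDF of the training responses."""
    idx = np.clip((u * len(y_sorted)).astype(int), 0, len(y_sorted) - 1)
    return y_sorted[idx]

def std_normal_cdf(z):
    """Standard normal CDF, applied elementwise."""
    return 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))

# Stand-in for draws on the latent Gaussian scale of the implicit copula;
# a real implementation would obtain these from the DNN's final-layer
# posterior. On the copula scale each draw has a standard normal margin.
z = rng.standard_normal(100000)

# Marginal calibration: map latent draws to (0, 1) via the normal CDF,
# then through the empirical marginal quantile function.
y_pred = empirical_quantile(std_normal_cdf(z))

# Averaged over draws, the predictive margin matches the empirical margin.
print(abs(y_pred.mean() - y_train.mean()) < 0.1)
```

Because the copula controls only the dependence on the features while the margin is fixed to the empirical one, averaging the predictive distributions over draws reproduces the observed marginal distribution of the response.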

REFERENCES:
Nadja Klein, David J. Nott, Michael Stanley Smith: Marginally Calibrated Deep Distributional Regression. J. Comput. Graph. Stat. 30(2): 467-483 (2021)
https://www.tandfonline.com/doi/abs/10.1080/10618600.2020.1807996?journalCode=ucgs20

Clara Hoffmann, Nadja Klein: Marginally Calibrated Response Distributions for End-to-End Learning in Autonomous Driving. Ann. Appl. Stat. 17(2): 1740-1763 (2023)
https://doi.org/10.1214/22-AOAS1693


View the Research Seminar agenda: < https://www.unige.ch/gsem/en/research/seminars/rcs/ >

Regards,


Marie-Madeleine

Marie-Madeleine Novo
Assistant to the Research Institutes
gsem-support-instituts at unige.ch


