Contributions to the theoretical study of variational inference and robustness

Abstract: This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and the design of efficient algorithms for computing them in an online fashion, and it investigates estimators based on the Maximum Mean Discrepancy (MMD) as learning rules that are robust to model misspecification.

In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention had been paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure it, with particular attention to mixture models and deep neural networks. We also provide theoretical justification for the ELBO maximization strategy, a model selection criterion that is widely used in the variational Bayes community and is known to work well in practice.

Moreover, Bayesian inference provides an attractive online-learning framework for analyzing sequential data, and it offers generalization guarantees that hold even under model misspecification and in the presence of adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed; do such methods preserve the generalization properties of Bayesian inference? We show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical results rely on the convexity of the variational objective, but we argue that they should hold more generally, and we present empirical evidence in support of this claim. This work provides theoretical justification for online algorithms that rely on approximate Bayesian methods.

Another question addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation with a minimum distance estimator based on the MMD. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset, and we highlight connections with minimum distance estimators based on the L2 distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, support our findings with numerical simulations, and propose a Bayesian version of the estimator, which we study from both a theoretical and a computational point of view.
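To make the variational setting concrete, here is a minimal sketch of ELBO maximization in a toy conjugate model; it illustrates the general technique only, not the algorithms analyzed in the thesis. A Gaussian variational approximation q = N(m, s^2) is fitted to the posterior of a Gaussian location model by stochastic gradient ascent on a reparameterized Monte Carlo estimate of the ELBO. The learning rate, batch size, and iteration count are arbitrary illustrative choices, and the exact conjugate posterior is computed only to check the fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_i ~ N(theta, 1) with prior theta ~ N(0, 1).
theta_true = 2.0
n = 50
x = rng.normal(theta_true, 1.0, size=n)

# Exact posterior (conjugate normal-normal), used only to check the fit.
post_mean = x.sum() / (n + 1)
post_std = 1.0 / np.sqrt(n + 1)

# Variational family q = N(m, s^2); maximize the ELBO over (m, log s) by
# stochastic gradient ascent with reparameterized samples theta = m + s * eps.
m, log_s = 0.0, 0.0
lr = 1e-3
for t in range(5000):
    s = np.exp(log_s)
    eps = rng.normal(size=32)          # Monte Carlo batch
    theta = m + s * eps
    # d/dtheta log p(x, theta) in this model: sum(x) - (n + 1) * theta
    score = x.sum() - (n + 1) * theta
    grad_m = score.mean()
    # Chain rule through theta = m + s * eps, plus d(entropy)/d(log s) = 1.
    grad_log_s = (score * eps).mean() * s + 1.0
    m += lr * grad_m
    log_s += lr * grad_log_s

print(f"variational fit:  m = {m:.3f}, s = {np.exp(log_s):.3f}")
print(f"exact posterior:  mean = {post_mean:.3f}, std = {post_std:.3f}")
```

In this conjugate example the variational optimum coincides with the exact posterior, so the printed pairs should nearly match; consistency results of the kind studied in the thesis concern models where no such closed form exists.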
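The MMD-based minimum distance estimator mentioned in the last paragraph minimizes the Maximum Mean Discrepancy between the model and the empirical distribution of the data. The sketch below is a hypothetical illustration of that general idea, not the algorithm analyzed in the thesis: it estimates the location of a Gaussian model with a Gaussian kernel by SGD with a decaying step size, on data contaminated by gross outliers. The bandwidth gamma, step sizes, and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data from N(theta_true, 1), contaminated with 5% gross outliers.
theta_true = 1.0
x = rng.normal(theta_true, 1.0, size=200)
x[:10] = 50.0

def k_grad(y, x_, gamma=1.0):
    """d/dy of the Gaussian kernel k(y, x) = exp(-(y - x)^2 / (2 gamma^2))."""
    d = y[:, None] - x_[None, :]
    return -d / gamma**2 * np.exp(-d**2 / (2 * gamma**2))

# Minimize MMD^2(P_theta, data) over theta by SGD. For a location model
# Y = theta + eps, the model-model kernel term E k(Y, Y') does not depend
# on theta, so only the cross term -2 E k(Y, X) contributes to the gradient.
theta = 0.0
lr = 0.5
for t in range(2000):
    eps = rng.normal(size=64)          # reparameterized model samples
    xb = rng.choice(x, size=64)        # data minibatch
    grad = -2.0 * k_grad(theta + eps, xb).mean()
    theta -= lr / np.sqrt(t + 1) * grad

print(f"MMD estimate: {theta:.3f}  (contaminated sample mean: {x.mean():.3f})")
```

Because the Gaussian kernel is bounded, each outlier can shift the MMD objective only by a bounded amount, which is the intuition behind the robustness result: the estimate stays near 1, while the sample mean is dragged toward the contamination.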

Cited literature: 298 references

https://tel.archives-ouvertes.fr/tel-02893465
Submitted on: Wednesday, July 8, 2020
Last modified on: Saturday, June 25, 2022
Long-term archiving on: Monday, November 30, 2020

File

94726_CHERIEF-ABDELLATIF_2020_...
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-02893465, version 1

Citation

Badr-Eddine Cherief-Abdellatif. Contributions to the theoretical study of variational inference and robustness. Statistics [math.ST]. Institut Polytechnique de Paris, 2020. English. ⟨NNT: 2020IPPAG001⟩. ⟨tel-02893465⟩
