Simpler PAC-Bayesian Bounds for Hostile Data

Preprint, working paper. Year: 2016

Abstract

PAC-Bayesian learning bounds are of the utmost interest to the learning community. Their role is to connect the generalization ability of an aggregation distribution $\rho$ to its empirical risk and to its Kullback-Leibler divergence with respect to some prior distribution $\pi$. Unfortunately, most of the available bounds rely on heavy assumptions such as boundedness and independence of the observations. This paper aims at relaxing these constraints and provides PAC-Bayesian learning bounds that hold for dependent, heavy-tailed observations (hereafter referred to as \emph{hostile data}). In these bounds the Kullback-Leibler divergence is replaced with a general version of Csisz\'ar's $f$-divergence. We prove a general PAC-Bayesian bound and show how to use it in various hostile settings.
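For context (not quoted from the paper), the two divergences mentioned in the abstract relate as follows. Csisz\'ar's $f$-divergence, for a convex function $f$ with $f(1)=0$ and $\rho$ absolutely continuous with respect to $\pi$, is defined by
$$ D_f(\rho \,\|\, \pi) = \int f\!\left(\frac{\mathrm{d}\rho}{\mathrm{d}\pi}\right) \mathrm{d}\pi, $$
and the choice $f(x) = x \log x$ recovers the Kullback-Leibler divergence $\mathrm{KL}(\rho \,\|\, \pi) = \int \log\!\left(\frac{\mathrm{d}\rho}{\mathrm{d}\pi}\right) \mathrm{d}\rho$. As a sketch of the kind of bound the paper generalizes, a classical McAllester-style PAC-Bayesian bound, stated here for illustration under the very assumptions the paper removes (i.i.d. observations and a loss bounded in $[0,1]$; the notation $r$ for the empirical risk, $R$ for the risk, and $n$ for the sample size is assumed here, not taken from the paper), reads: with probability at least $1-\delta$, simultaneously for all $\rho$,
$$ \int R \,\mathrm{d}\rho \;\le\; \int r \,\mathrm{d}\rho + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \log\frac{2\sqrt{n}}{\delta}}{2n}}. $$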
Main file: main.pdf (158.27 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01385064, version 1 (20-10-2016)
hal-01385064, version 2 (23-10-2016)
hal-01385064, version 3 (23-05-2019)

Identifiers

  • HAL Id: hal-01385064, version 1

Cite

Pierre Alquier, Benjamin Guedj. Simpler PAC-Bayesian Bounds for Hostile Data. 2016. ⟨hal-01385064v1⟩
