Conference paper, 2024

Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise

Abstract

We study stochastic approximation algorithms with Markovian noise and constant step size $\alpha$. We develop a method based on infinitesimal generator comparisons to study the bias of the algorithm, defined as the expected difference between $\theta_n$ (the value at iteration $n$) and $\theta^*$ (the unique equilibrium of the corresponding ODE). We show that, under some smoothness conditions, this bias is of order $O(\alpha)$. Furthermore, we show that the time-averaged bias equals $\alpha V + O(\alpha^2)$, where $V$ is a constant characterized by a Lyapunov equation; hence $\mathbb{E}[\bar{\theta}_n] \approx \theta^* + \alpha V + O(\alpha^2)$, where $\bar{\theta}_n = (1/n)\sum_{k=1}^n \theta_k$ is the Polyak-Ruppert average. We also show that $\bar{\theta}_n$ concentrates with high probability around $\theta^* + \alpha V$. We illustrate how to combine this result with Richardson-Romberg extrapolation to derive an iterative scheme with a bias of order $O(\alpha^2)$.
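To make the extrapolation step concrete, here is a minimal Python sketch of the Richardson-Romberg idea under illustrative assumptions: the drift $f(\theta, x) = x + 1 - e^{\theta}$, the two-state Markov noise, and all parameter values below are toy choices, not the paper's setup. The sketch runs constant-step stochastic approximation at step sizes $\alpha$ and $2\alpha$ and combines the two Polyak-Ruppert averages as $2\bar{\theta}^{(\alpha)} - \bar{\theta}^{(2\alpha)}$, so that the first-order bias terms $\alpha V$ and $2\alpha V$ cancel, leaving a bias of order $O(\alpha^2)$.

import numpy as np

rng = np.random.default_rng(0)

def polyak_ruppert_average(alpha, n_iter, burn_in=10_000):
    # Constant-step SA: theta_{k+1} = theta_k + alpha * f(theta_k, X_k).
    # Toy choices (not from the paper): X_k is a two-state Markov chain
    # on {-1, +1} with stationary mean 0, and f(theta, x) = x + 1 - exp(theta),
    # so the mean ODE has the unique equilibrium theta* = 0. The curvature
    # of exp makes the O(alpha) bias nonzero even though theta* = 0.
    theta, x, avg = 0.0, 1.0, 0.0
    for k in range(1, burn_in + n_iter + 1):
        if rng.random() > 0.9:                       # flip state w.p. 0.1:
            x = -x                                   # correlated, Markovian noise
        theta += alpha * (x + 1.0 - np.exp(theta))   # SA update
        if k > burn_in:
            avg += (theta - avg) / (k - burn_in)     # running Polyak-Ruppert average
    return avg

alpha, n = 0.05, 1_000_000
bar_a  = polyak_ruppert_average(alpha, n)        # approx. theta* + alpha * V
bar_2a = polyak_ruppert_average(2 * alpha, n)    # approx. theta* + 2 * alpha * V
theta_rr = 2 * bar_a - bar_2a                    # Richardson-Romberg: bias O(alpha^2)
print(f"bar(alpha)={bar_a:+.5f}  bar(2*alpha)={bar_2a:+.5f}  extrapolated={theta_rr:+.5f}")

In this toy model the Monte Carlo noise of a single trajectory can mask the $O(\alpha)$ versus $O(\alpha^2)$ gap; averaging several independent replicas, or lengthening the run, makes the cancellation visible.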
Main file: stochastic-approximation-bias.pdf (1005.29 KB)
Origin: files produced by the author(s)

Dates and versions

hal-04837291, version 1 (13-12-2024)

Cite

Sebastian Allmeier, Nicolas Gast. Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise. NeurIPS 2024 - 38th Annual Conference on Neural Information Processing Systems, Dec 2024, Vancouver, Canada. pp.1-23. ⟨hal-04837291⟩