The Robbins-Siegmund theorem gives conditions on a nonnegative almost supermartingale that ensure its almost sure convergence. It can be seen as a generalization of Doob's martingale convergence theorem for nonnegative supermartingales. A number of almost sure convergence results follow fairly easily from the Robbins-Siegmund theorem, e.g., the strong law of large numbers. In this post we show how it can be used to prove the convergence of a stochastic gradient descent algorithm.
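For reference, a standard statement of the theorem reads roughly as follows (notation varies across sources; this is the form due to Robbins and Siegmund):

```latex
Let $(\mathcal{F}_n)_{n \ge 0}$ be a filtration, and let $V_n$, $\beta_n$,
$\chi_n$, $\eta_n$ be nonnegative $\mathcal{F}_n$-measurable random variables
satisfying
\[
  \mathbb{E}[V_{n+1} \mid \mathcal{F}_n] \le (1 + \beta_n) V_n + \chi_n - \eta_n .
\]
Then, on the event $\bigl\{\sum_n \beta_n < \infty,\ \sum_n \chi_n < \infty\bigr\}$,
the sequence $V_n$ converges almost surely to a finite random variable, and
$\sum_n \eta_n < \infty$ almost surely.
```

The conclusion that $\sum_n \eta_n < \infty$ is what typically drives applications: in the stochastic gradient descent setting, $\eta_n$ is chosen so that its summability forces the gradient norms (suitably weighted) to vanish.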
Last updated on Jan 25, 2026 · 8 min read