
Machine Learning — The Intuition of Hoeffding’s Inequality

Helene
5 min read · Jan 5, 2021


Put simply, probability theory is the mathematical study of uncertainty: it allows us (and thereby also the computer) to reason and make decisions in situations where complete certainty is impossible. Probability theory plays a center-stage role in machine learning theory, as many learning algorithms rely on probabilistic assumptions about the given data. One probability bound in particular has had an enormous impact on machine learning theory: Hoeffding's Bound.

This article aims to explain the inequality behind the bound, the so-called Hoeffding's Inequality, and to build both a mathematical and an intuitive understanding of it. In the end, the inequality is placed in the context of machine learning, specifically how it is used when reasoning about generalization errors and hypotheses. This will only be a brief introduction; the topic will be discussed further in a future series of articles.

The Intuition Behind Hoeffding’s Inequality

Let us first present the famous Hoeffding’s Inequality:
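(The original article shows the formula as an image, which is not reproduced here. The standard two-sided statement, for independent random variables X₁, …, Xₙ with each Xᵢ bounded in [aᵢ, bᵢ] and sample mean X̄, is:)

$$P\left(\left|\bar{X} - \mathbb{E}[\bar{X}]\right| \geq t\right) \;\leq\; 2\exp\left(-\frac{2n^2 t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right)$$

For variables bounded in [0, 1], the right-hand side simplifies to 2·exp(−2nt²).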

In mathematical terms, Hoeffding's inequality gives an upper bound on the probability that the sum of bounded independent random variables deviates from its expected value by more than a certain amount. Now, let us first define the different parts of the equation, and afterward we can try to understand it more intuitively than what was presented above.
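Before unpacking the parts, it can help to see the bound at work numerically. The following sketch (a small simulation of my own, not from the article) estimates the probability that the mean of n fair coin flips deviates from 0.5 by at least ε, and compares it to the Hoeffding bound 2·exp(−2nε²) for variables in [0, 1]:

```python
import math
import random

def hoeffding_bound(n, eps):
    # Two-sided Hoeffding bound for the mean of n i.i.d.
    # random variables bounded in [0, 1]:
    # P(|mean - E[mean]| >= eps) <= 2 * exp(-2 * n * eps^2)
    return 2.0 * math.exp(-2.0 * n * eps ** 2)

def empirical_deviation_prob(n, eps, trials=20_000, seed=0):
    # Monte Carlo estimate of P(|sample mean - 0.5| >= eps)
    # for n fair coin flips (Bernoulli(0.5) variables).
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials

n, eps = 100, 0.1
print(f"empirical deviation probability: {empirical_deviation_prob(n, eps):.4f}")
print(f"Hoeffding upper bound:           {hoeffding_bound(n, eps):.4f}")
```

The empirical probability always stays below the bound; the bound is loose but, crucially, it holds without knowing anything about the distribution beyond its range.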
