Aka concentration of measure inequalities.
Expectation based deviation bound
(Aka Markov’s inequality). If $X \ge 0$, then for every $a > 0$, $\Pr[X \ge a] \le \mathbb{E}[X]/a$.
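A minimal numerical check of this bound (a sketch; the exponential distribution and the threshold $a = 3$ are illustrative choices, not from the notes):

```python
import random

# Empirically check Pr[X >= a] <= E[X]/a for X ~ Exp(1), a = 3.
# (Any nonnegative distribution works; these parameters are illustrative.)
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
a = 3.0
empirical_tail = sum(x >= a for x in samples) / n
markov_bound = (sum(samples) / n) / a  # E[X]/a, estimated from the sample
print(empirical_tail, markov_bound)
assert empirical_tail <= markov_bound
```

Here the true tail is $e^{-3} \approx 0.05$ while the bound is $1/3$; Markov is loose but needs only nonnegativity and a first moment.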
Averaging argument. If $\mathbb{E}[X] \le a$, then $\Pr[X \le a] > 0$: some outcome attains at most the average.
This technique is used repeatedly in other deviation bounds based on variance and moment generating functions.
Variance based deviation bound
(Aka Chebyshev’s inequality). By Markov’s inequality applied to $(X - \mathbb{E}[X])^2$: $\Pr[|X - \mathbb{E}[X]| \ge a] \le \mathrm{Var}[X]/a^2$.
Use in estimation of mean. For the empirical mean $\bar{X}$ of $n$ iid samples with mean $\mu$ and variance $\sigma^2$, $\mathrm{Var}[\bar{X}] = \sigma^2/n$, so $\Pr[|\bar{X} - \mu| \ge \epsilon] \le \sigma^2/(n\epsilon^2)$.
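A sketch of this use: compare the deviation probability of the empirical mean of Uniform(0,1) samples against the Chebyshev bound $\sigma^2/(n\epsilon^2)$ (all parameters below are illustrative):

```python
import random

# Compare the deviation frequency of the empirical mean of n Uniform(0,1)
# samples with the Chebyshev bound sigma^2 / (n * eps^2).
# (n, eps, and the trial count are illustrative assumptions.)
random.seed(1)
mu, sigma2 = 0.5, 1.0 / 12.0   # mean and variance of Uniform(0,1)
n, eps, trials = 200, 0.05, 2_000
deviations = 0
for _ in range(trials):
    mean = sum(random.random() for _ in range(n)) / n
    if abs(mean - mu) >= eps:
        deviations += 1
empirical = deviations / trials
chebyshev = sigma2 / (n * eps * eps)
print(empirical, chebyshev)
assert empirical <= chebyshev
```

The bound ($1/6$ here) again overshoots the observed frequency by an order of magnitude, which motivates the exponential bounds below.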
Exponentially small deviation bounds
General technique
(Chernoff) For any $t > 0$, $\Pr[X \ge a] = \Pr[e^{tX} \ge e^{ta}] \le \mathbb{E}[e^{tX}]/e^{ta}$ by Markov’s inequality; then optimize the bound over $t$.
For random variable sequences
Pairwise independent RVs
Use variance based deviation bounds, since the variance of a sum of pairwise independent RVs is additive (the covariance terms vanish).
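One standard construction (the XOR/subset-parity trick, used here as an illustration rather than taken from the notes) derives $2^k - 1$ pairwise independent uniform bits from $k$ iid uniform bits, and the variance of their sum is exactly additive:

```python
import itertools
import random

# Subset-parity construction: from k iid uniform bits, the parities over
# all nonempty subsets give 2^k - 1 pairwise independent uniform bits.
random.seed(2)
k = 4
subsets = [s for r in range(1, k + 1) for s in itertools.combinations(range(k), r)]

def sample():
    seed = [random.randrange(2) for _ in range(k)]
    return [sum(seed[i] for i in s) % 2 for s in subsets]

# Pairwise independence makes variance additive:
# Var[sum] = sum_i Var[X_i] = (2^k - 1) * 1/4 for uniform bits.
trials = 20_000
sums = [sum(sample()) for _ in range(trials)]
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
print(var, (2 ** k - 1) / 4)
```

The empirical variance should come out close to $15/4 = 3.75$, even though the bits are far from mutually independent (they are all determined by only $k = 4$ seed bits).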
Sum of n-wise independent RVs
Bounds from MGFs.
If $X = \sum_i X_i$ with the $X_i$ independent, then $\mathbb{E}[e^{tX}] = \prod_i \mathbb{E}[e^{tX_i}]$.
So, if each $X_i \in \{0,1\}$ with $\Pr[X_i = 1] = p_i$ and $\mu = \mathbb{E}[X]$, then $\mathbb{E}[e^{tX_i}] = 1 + p_i(e^t - 1) \le e^{p_i(e^t - 1)}$, and optimizing over $t$ gives $\Pr[X \ge (1+\delta)\mu] \le \left( e^{\delta}/(1+\delta)^{1+\delta} \right)^{\mu}$.
If $\delta \in (0,1]$, $\Pr[X \ge (1+\delta)\mu] \le e^{-\delta^2 \mu / 3}$.
If $\delta \in (0,1]$, $\Pr[X \le (1-\delta)\mu] \le e^{-\delta^2 \mu / 2}$.
So, $\Pr[|X - \mu| \ge \delta\mu] \le 2 e^{-\delta^2 \mu / 3}$.
Can be used in both additive and multiplicative forms.
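A sketch checking the multiplicative upper-tail form $\Pr[X \ge (1+\delta)\mu] \le e^{-\delta^2 \mu/3}$ on binomial samples (the parameters $n = 1000$, $p = 0.1$, $\delta = 0.2$ are illustrative assumptions):

```python
import math
import random

# Check the multiplicative Chernoff bound on X ~ Binomial(n, p):
# Pr[X >= (1 + d) * mu] <= exp(-d^2 * mu / 3), mu = n * p.
# (Parameter choices are illustrative.)
random.seed(3)
n, p, d = 1000, 0.1, 0.2
mu = n * p
trials = 5_000
hits = sum(
    sum(random.random() < p for _ in range(n)) >= (1 + d) * mu
    for _ in range(trials)
)
empirical = hits / trials
bound = math.exp(-d * d * mu / 3)
print(empirical, bound)
assert empirical <= bound
```

With these parameters the bound is roughly $0.26$ while the observed frequency is around $0.02$: already exponentially small in $\mu$, unlike the polynomial Chebyshev bound.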
Goodness of empirical mean
Now, for the empirical mean $\bar{X}$ of $n$ iid samples in $[0,1]$, the additive form gives $\Pr[|\bar{X} - \mu| \ge \epsilon] \le 2 e^{-2 n \epsilon^2}$, so $n = O(\log(1/\delta)/\epsilon^2)$ samples suffice for an $\epsilon$-accurate estimate with probability $1 - \delta$.
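A small helper deriving the sample size by solving the additive form $2e^{-2n\epsilon^2} \le \delta$ for $n$ (this parameterization, for $[0,1]$-valued samples, is an assumption of this sketch):

```python
import math

# Sample-size calculator from the additive exponential tail bound
# Pr[|mean - mu| >= eps] <= 2 * exp(-2 * n * eps^2), samples in [0, 1].
# Solving 2 * exp(-2 * n * eps^2) <= delta for n:
def samples_needed(eps: float, delta: float) -> int:
    return math.ceil(math.log(2 / delta) / (2 * eps * eps))

print(samples_needed(0.01, 0.05))  # eps = 1%, 95% confidence -> 18445
```

Note the $\log(1/\delta)$ dependence: halving the failure probability costs only an additive number of samples, versus the multiplicative cost under Chebyshev.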
Code length divergence bound
Let
\pf{Suppose that
So, taking the union bound over all
Using the connection between the code length divergence and the total variation distance:
Additive deviation bounds
See Azuma inequality section.
iid RV: Tightness of the Chernoff bound
(Cramér) Take $X = \sum_{i=1}^n X_i$ with the $X_i$ iid. Then $\Pr[X \ge na] = e^{-n\Lambda^*(a) + o(n)}$, where $\Lambda^*(a) = \sup_{t} \left( ta - \log \mathbb{E}[e^{t X_1}] \right)$ is the Legendre transform of the log-MGF; this is exactly the exponent the Chernoff technique achieves, so the Chernoff bound is tight to first order in the exponent.
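For Bernoulli variables the Legendre transform has a closed form and coincides with the KL divergence $\mathrm{KL}(a \,\|\, p)$; a sketch verifying this numerically (the Bernoulli setup and the values $a = 0.7$, $p = 0.5$ are assumptions for illustration):

```python
import math

# For X_1 ~ Bernoulli(p), the optimal Chernoff exponent
# sup_t (t*a - log E[e^{t X_1}]) equals KL(a || p), the tight
# large-deviation rate from Cramér's theorem.
def log_mgf(t: float, p: float) -> float:
    return math.log(1 - p + p * math.exp(t))

def rate(a: float, p: float) -> float:
    # Legendre transform via the closed-form maximizer
    # t* = log(a(1-p) / ((1-a)p)), valid for 0 < p < a < 1.
    t = math.log(a * (1 - p) / ((1 - a) * p))
    return t * a - log_mgf(t, p)

def kl(a: float, p: float) -> float:
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

a, p = 0.7, 0.5
print(rate(a, p), kl(a, p))
assert abs(rate(a, p) - kl(a, p)) < 1e-9
```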
\part{Probabilistic Analysis Techniques}