1.3 Inequalities
1.3.1 Typical tail bound inequalities
Theorem 1.8 (Markov's Inequality) Let \(X\) be a non-negative RV whose expectation \(\mathbb{E}(X)\) exists. Then, for each \(k > 0\), \[ \mathbb{P}( X > k) \leq \frac{\mathbb{E}(X)}{k}. \]
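As a quick Monte Carlo sanity check (not part of the theorem), one can compare the empirical tail of an Exponential(1) sample, which is non-negative with \(\mathbb{E}(X) = 1\), against the Markov bound; the use of `numpy` here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
# Exponential(1) is non-negative with E(X) = 1.
x = rng.exponential(scale=1.0, size=100_000)

k = 3.0
empirical = np.mean(x > k)   # estimate of P(X > k); true value is e^{-3} ~ 0.05
bound = x.mean() / k         # Markov bound E(X)/k ~ 1/3
```

The bound is loose here (roughly 1/3 versus a true tail of about 0.05), which is typical: Markov's inequality uses only the mean, not the shape of the distribution.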
Theorem 1.9 (Chebyshev's Inequality) Let \(X\) be a RV with mean \(\mu\) and standard deviation \(\sigma\). Then, for each \(k >0\), \[ \mathbb{P}(| X - \mu | > k \sigma ) \leq \frac{1}{k^2}. \]
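A similar numerical check for Chebyshev's inequality, using a normal sample with known \(\mu\) and \(\sigma\) (again, `numpy` and the chosen parameters are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, k = 2.0, 1.5, 2.0
x = rng.normal(loc=mu, scale=sigma, size=100_000)

# Empirical estimate of P(|X - mu| > k*sigma); for a normal, roughly 0.046 at k = 2.
empirical = np.mean(np.abs(x - mu) > k * sigma)
bound = 1 / k**2  # Chebyshev bound, 0.25 at k = 2
```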
1.3.2 Exponential concentration inequalities
Theorem 1.10 (Mill's inequality) Let \(Z \sim N(0,1)\). Then, for each \(t >0\), \[ \mathbb{P}(|Z| > t) \leq \sqrt{\frac{2}{\pi}} \frac{e^{-t^2/2}}{t}. \]
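For a standard normal the two-sided tail is available in closed form as \(\mathbb{P}(|Z| > t) = \operatorname{erfc}(t/\sqrt{2})\), so Mill's bound can be checked directly without simulation (a sketch using only the Python standard library):

```python
import math

t = 2.0
# Mill's bound on P(|Z| > t) for Z ~ N(0,1)
mills = math.sqrt(2 / math.pi) * math.exp(-t**2 / 2) / t
# Exact tail: P(|Z| > t) = erfc(t / sqrt(2))
exact = math.erfc(t / math.sqrt(2))
```

At \(t = 2\) the bound (about 0.054) is already close to the exact tail (about 0.046), and the ratio tends to 1 as \(t \to \infty\): the normal tail decays like a Gaussian, so the exponential bound is essentially sharp.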
Theorem 1.11 (Hoeffding's inequality) Let \(X_1, \dots, X_n\) be independent RVs such that \(\mathbb{E}( X_i ) = 0\) and \(a_i \leq X_i \leq b_i\). Then, for each \(\epsilon >0\) and \(t>0\), we have \[ \mathbb{P}\left( \sum_{i=1}^n X_i \geq \epsilon \right) \leq e^{-t\epsilon} \prod_{i=1}^n e^{t^2(b_i - a_i)^2/8}. \]
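Since the bound holds for every \(t > 0\), one can minimize the exponent over \(t\); this optimization step is what connects Theorem 1.11 to the bound in Exercise 1.22 below:

```latex
% Minimize g(t) = -t\epsilon + \tfrac{t^2}{8}\sum_{i=1}^n (b_i - a_i)^2 over t > 0:
% g'(t) = -\epsilon + \tfrac{t}{4}\sum_{i=1}^n (b_i - a_i)^2 = 0
% \implies t^* = \frac{4\epsilon}{\sum_{i=1}^n (b_i - a_i)^2},
% and substituting t^* back gives
\mathbb{P}\left( \sum_{i=1}^n X_i \geq \epsilon \right)
  \leq \exp\left( - \frac{2\epsilon^2}{\sum_{i=1}^n (b_i - a_i)^2} \right).
```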
Exercise 1.22 Let \(X_1, \dots, X_n\) be independent RVs such that \(\mathbb{E}( X_i ) = 0\) and \(a_i \leq X_i \leq b_i\). Show that for each \(t>0\), we have \[ \mathbb{P}\left( \left| \frac{1}{n}\sum_{i=1}^n X_i \right| \geq t \right) \leq 2 \exp\left( - \frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right). \]
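The two-sided bound of Exercise 1.22 can be checked numerically with Rademacher variables \(X_i \in \{-1, +1\}\), for which \(\mathbb{E}(X_i) = 0\), \(a_i = -1\), and \(b_i = 1\), so \(\sum_i (b_i - a_i)^2 = 4n\) (a simulation sketch; `numpy` and the chosen \(n\), \(t\) are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 50, 20_000
# Rademacher variables: X_i in {-1, +1}, E(X_i) = 0, a_i = -1, b_i = 1.
x = rng.choice([-1.0, 1.0], size=(trials, n))
means = x.mean(axis=1)

t = 0.3
empirical = np.mean(np.abs(means) >= t)          # P(|mean| >= t), estimated
# Here sum (b_i - a_i)^2 = 4n, so the bound simplifies to 2 exp(-n t^2 / 2).
bound = 2 * np.exp(-2 * n**2 * t**2 / (4 * n))
```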
1.3.3 Inequalities for expectations
Theorem 1.12 (Cauchy–Schwarz inequality) Let \(X, Y\) be RVs with finite second moments. Then, \[\mathbb{E}( |XY| ) \leq \sqrt{ \mathbb{E}(X^2) \mathbb{E}(Y^2) }.\]
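A quick empirical check of the Cauchy–Schwarz inequality on simulated data (the distributions chosen are arbitrary, and `numpy` is assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)            # X ~ N(0,1), so E(X^2) ~ 1
y = rng.uniform(-1, 1, size=100_000)    # Y ~ Unif(-1,1), so E(Y^2) ~ 1/3

lhs = np.mean(np.abs(x * y))                   # E|XY|
rhs = np.sqrt(np.mean(x**2) * np.mean(y**2))   # sqrt(E(X^2) E(Y^2))
```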
Theorem 1.13 (Jensen's inequality) Suppose \(g:\mathbb{R}\to \mathbb{R}\) is a convex function. Then, provided both expectations exist, \[ \mathbb{E}g(X) \geq g(\mathbb{E}X). \]
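For example, with the convex function \(g(x) = e^x\) and \(X \sim N(0,1)\), Jensen's inequality says \(\mathbb{E}e^X \geq e^{\mathbb{E}X} = 1\); in fact \(\mathbb{E}e^X = e^{1/2} \approx 1.65\) here. A simulation sketch (`numpy` assumed):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100_000)

lhs = np.mean(np.exp(x))   # E g(X) with convex g = exp; ~ e^{1/2}
rhs = np.exp(np.mean(x))   # g(E X); ~ e^0 = 1
```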