
Useful inequalities in estimation

Here I would like to collect a number of useful inequalities in estimation. This list is by no means complete; I will keep adding to it when appropriate.

(1) Cauchy-Schwarz inequality: Let $(X,\langle\cdot,\cdot\rangle)$ be an inner product space and $||x||=\sqrt{\langle x,x \rangle}$ (think of the $L^2$ norm). Then for all $x,y\in X$, $|\langle x,y \rangle| \leq ||x||\cdot||y||$. This inequality has an obvious geometric interpretation in the $\mathbb{R}^2$ (or $\mathbb{R}^3$) vector space, but let's prove it for the general case.

Proof: If $x=0$, the inequality holds trivially. If $x\neq 0$, let $\hat{x}\equiv \frac{x}{||x||}$, $y_{\parallel}\equiv \langle \hat{x},y \rangle \hat{x}$, and $y_{\perp}\equiv y-y_{\parallel}$. Then $0\leq ||y_{\perp}||^2 = ||y-y_{\parallel}||^2 = ||y-\langle \hat{x},y \rangle\hat{x}||^2 = \langle y-\langle \hat{x},y \rangle\hat{x}, y-\langle \hat{x},y \rangle\hat{x} \rangle = \langle y-\langle \hat{x},y \rangle\hat{x}, y \rangle - \langle...
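
As a minimal illustration of the form in which this inequality usually shows up in estimation (assuming square-integrable random variables, with inner product $\langle X,Y \rangle = \mathrm{E}[XY]$ and norm $||X|| = \sqrt{\mathrm{E}[X^2]}$): Cauchy-Schwarz reads $|\mathrm{E}[XY]| \leq \sqrt{\mathrm{E}[X^2]\,\mathrm{E}[Y^2]}$, and applying it to the centered variables $X-\mathrm{E}[X]$ and $Y-\mathrm{E}[Y]$ gives $|\mathrm{Cov}(X,Y)| \leq \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}$, i.e. the correlation coefficient is bounded by $1$ in absolute value.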