\centerline {\bf CS/MATH 3414 -- Norms and Inner Products}\bigskip
Let $V$ be a real vector space. An {\it inner product} on $V$ is a real-valued function on $V\times V$. The inner product of two vectors $u,v$ is denoted by $(u,v)$ or $\langle u,v \rangle$, and has the properties:
\item {1.} $(u+v,w) = (u,w) + (v,w)$ for all $u,v,w \in V$;
\item {2.} $(\alpha u,v) = \alpha(u,v)$ for all real $\alpha$ and $u,v \in V$;
\item {3.} $(u,v) = (v,u)$ for all $u,v \in V$;
\item {4.} $(u,u) > 0$ for all nonzero $u \in V$.
\medskip\hrule\medskip
\leftline {The following are examples of inner products:}
\item {1.} $V = E^n\equiv\{\hbox{$n$-tuples of real numbers}\}$, $(x,y) = \sum^n_{i=1}x_iy_i$;
\item {2.} $V={\cal P}_2=\{\hbox{polynomials of degree} \le2\}$, $(p,q)= \sum^3_{i=1}p(i)q(i)$;
\item {3.} $V ={\cal P}=\{\hbox{all polynomials}\}$, $(p,q)= \int_{-1}^1 p(x)q(x)\,dx$.
\medskip\hrule
\noindent Two vectors $u,v$ are {\it orthogonal} (think ``perpendicular'') if $(u,v) = 0$. A sequence of vectors $\varphi_1, \varphi_2, \ldots, \varphi_k$ is {\it orthonormal} if
$$(\varphi_i, \varphi_j)=\delta_{ij}=\cases{1,&$i=j$\cr 0,&$i\neq j$\cr}\quad \forall\ i,j.$$
\noindent Example: the vectors $\varphi_0=1$, $\varphi_1=x$, $\varphi_2=3x^2-1$, $\varphi_3=5x^3-3x$ (scalar multiples of the first four Legendre polynomials) are mutually orthogonal with respect to the inner product $(f,g)=\int_{-1}^1 f(x)g(x)\,dx$ on the vector space $\cal P$; they are not orthonormal, since, e.g., $(\varphi_0,\varphi_0)=2$.
\medskip\hrule\medskip
\noindent A {\it norm} on $V$ is a real-valued function on $V$, denoted by $\Vert u \Vert$, with the properties
\item {1.} $\Vert u \Vert \geq 0$ for all $u \in V$, with equality if and only if $u=0$;
\item {2.} $\Vert \alpha u\Vert = \vert \alpha\vert \ \Vert u \Vert$ for all real $\alpha$ and $u \in V$;
\item {3.} $\Vert u+v \Vert \leq \Vert u \Vert + \Vert v \Vert$ for all $u, v \in V$ (the triangle inequality).
\noindent Examples of norms:
\item {1.} $V=E^n$, $\Vert x\Vert_p = \left(\sum_{i=1}^n \vert x_i\vert^p\right)^{1/p}$, $p \geq 1$.
\item {2.} $V=E^n$, $\Vert x \Vert_{\infty} = \max_{1\leq i\leq n}|x_i|$.
\item {3.} $V={\cal P}=\{\hbox{polynomials}\}$, $\Vert p \Vert_{\infty} = \max_{a\leq x \leq b} |p(x)|$, for a fixed interval $[a,b]$.
\item {4.} (IMPORTANT) $\Vert u \Vert = \sqrt {(u,u)}$, where $(u,v)$ is any inner product on any vector space $V$.
\item{5.} $\Vert f \Vert_1 = \int^b_a |f(x)|\,dx$ is a norm on $V=C[a,b]=\{$~continuous functions defined on the closed interval $[a,b]\,\}$.
\medskip\hrule\medskip
\leftline {IMPORTANT FACT (Cauchy--Schwarz inequality): $|(u,v)| \leq \Vert u\Vert \ \Vert v \Vert$, where $\Vert u\Vert = \sqrt{(u,u)}$.}
\medskip\hrule\medskip
\leftline {\it Matrix norms:}
Choose a norm $\Vert \cdot \Vert$ on $E^n$, and let $A$ be an $n\times n$ matrix. The {\it norm of $A$} with respect to the vector norm $\Vert \cdot \Vert$ is
$$\Vert A \Vert = \sup_{x\ne0} {\|Ax\|\over\|x\|} = \max_{\Vert x \Vert = 1} \Vert Ax \Vert.$$
\bigskip
\settabs 3 \columns
\+For vector norm & the corresponding matrix norm is\cr
\+\cr
\+ $\Vert x \Vert_{\infty}$ & $\Vert A \Vert_{\infty} = \max_i \sum^n_{j=1} \vert A_{ij} \vert$ (maximum row sum)\cr
\smallskip
\+ $\Vert x \Vert_1$ & $\Vert A\Vert_1 = \max_j \sum^n_{i=1} \vert A_{ij} \vert$ (maximum column sum)\cr
\smallskip
\+ $\Vert x \Vert_2$ & $\Vert A\Vert_2 = \sqrt{\rho\left(A^tA\right)} =$ square root of the largest eigenvalue of $A^tA$\cr
\vfil\eject
\centerline{\bf Approximation in Normed Linear Spaces}\bigskip
\noindent{\bf Definition}. A vector space $V$ with an inner product $\langle u,v\rangle$ is called an {\it inner product space}. A vector space $V$ with a norm $\|x\|$ is called a {\it normed linear space}. A normed linear space is {\it strictly convex\/} if $\|x\| = \|y\| = \|{1\over2}(x+y)\| = 1 \Rightarrow x=y$.
\medskip\noindent{\bf Theorem}. Every finite dimensional subspace $S$ of a normed linear space $V$ contains a point closest to an arbitrary point $x\in V$. If $V$ is strictly convex, then the closest point in $S$ to $x$ is unique.
\medskip\hrule\medskip
Applications of the above theorem:
\item{1.}$V=C[a,b]$, $S={\cal P}_n=\{$polynomials of degree $\le n\}$.
For $f\in C[a,b]$, there exists a polynomial $P(x)$ of degree $\le n$ which minimizes
$$\displaylines{ \|f-P\|_\infty = \max\limits_{a\le x\le b} |f(x)-P(x)|\cr {\rm or}\hfill\cr \|f-P\|_2 = \left(\int_a^b |f(x)-P(x)|^2dx \right)^{1/2}\cr {\rm or}\hfill\cr \|f-P\|_1 = \int_a^b |f(x)-P(x)|\,dx.\cr}$$
The above theorem guarantees uniqueness of the ``best'' approximation $P$ only for the strictly convex norm $\|\cdot\|_2$.
\item{2.}$V=C[a,b]$, $S=\{$trigonometric polynomials of degree $\le n\}$. For $f\in C[a,b]$, there exists a trigonometric polynomial $$T_n(x)=a_0+\sum_{k=1}^n \left(a_k\cos kx + b_k\sin kx\right)$$ of degree $\le n$ which minimizes $\|f-T_n\|_\infty$, or $\|f-T_n\|_2$, or $\|f-T_n\|_1$.
\item{3.}$V=C[a,b]$, $S=\{$continuous functions which are linear in each subinterval $[a+ih,a+(i+1)h]$, $i=0$, $\ldots$, $n-1$, $h=(b-a)/n\}$. For $f\in C[a,b]$, there exists $p\in S$ which minimizes $\|f-p\|_\infty$, or $\|f-p\|_2$, or $\|f-p\|_1$. Note that this $p$ normally does not interpolate $f$ at the nodes $a+ih$.
\medskip\hrule\medskip
{\def\v{\varphi}\def\la{\left\langle}\def\ra{\right\rangle}
\noindent{\bf Theorem}. Let $V$ be an inner product space, and $S\subset V$ a finite dimensional subspace with orthonormal basis $\v_1$, $\ldots$, $\v_n$. Then for any point $f\in V$, the unique closest point in $S$ to $f$ is given by
$$P_S(f)=\sum_{i=1}^n \la f,\v_i\ra\v_i.$$
$P_S(f)$ is called the {\it projection\/} of $f$ onto $S$, and the $\la f,\v_i\ra$ are called {\it Fourier coefficients}.
Proof. Let $v=\sum_{i=1}^n \alpha_i\v_i$ be an arbitrary point in $S$.
Then
$$\eqalign{\la f-v,f-v\ra &= \la f-P_S(f)+P_S(f)-v,\ f-P_S(f)+P_S(f)-v \ra\cr &= \la f-P_S(f),f-P_S(f)\ra +2\la f-P_S(f), P_S(f)-v\ra + \la P_S(f)-v,P_S(f)-v\ra.\cr}$$
Writing $P_S(f)-v=\sum_{j=1}^n\beta_j\v_j$, where $\beta_j=\la f,\v_j\ra-\alpha_j$, the cross term vanishes:
$$\eqalign{\la f-P_S(f),P_S(f)-v\ra &= \la f-\sum_{i=1}^n \la f,\v_i\ra\v_i,\ \sum_{j=1}^n\beta_j\v_j \ra = \sum_{j=1}^n\beta_j \la f-\sum_{i=1}^n \la f,\v_i\ra\v_i,\ \v_j \ra\cr &=\sum_{j=1}^n\beta_j \la f-\la f,\v_j\ra\v_j,\v_j \ra \quad \hbox{(by orthonormality only the $i=j$ term survives)}\cr &= \sum_{j=1}^n\beta_j \left(\la f,\v_j\ra - \la f,\v_j\ra \la \v_j,\v_j \ra\right) = 0.\cr}$$
Therefore
$$\|f-P_S(f)\|^2 = \la f-P_S(f),f-P_S(f)\ra \le \la f-v,f-v\ra = \|f-v\|^2$$
with equality $\Leftrightarrow\la P_S(f)-v,P_S(f)-v\ra = \|P_S(f)-v\|^2=0 \Leftrightarrow v=P_S(f)$. That is, $P_S(f)$ is the unique closest point in $S$ to $f$. \QED
\medskip
\noindent{\bf Corollary}. $f-P_S(f)$ is orthogonal to $S$.
\noindent{\bf Corollary (Bessel's Inequality)}. $\displaystyle \sum_{i=1}^n \la f,\v_i\ra^2 \le \la f,f\ra$.
\noindent Note: if the basis $\v_1$, $\ldots$, $\v_n$ is merely orthogonal, then
$$P_S(f)=\sum_{i=1}^n {\la f,\v_i\ra\over\la \v_i,\v_i \ra}\v_i$$
and the Fourier coefficients are $\la f,\v_i\ra / \la \v_i,\v_i\ra$. The projection operator $P_S$ is an idempotent linear map $P_S:V\to V$ ($P_S\circ P_S=P_S$), and $\|P_S\|=1$ provided $S\neq\{0\}$.
}
\vfil\eject
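The induced matrix norms tabulated earlier are easy to check numerically. The following is a minimal pure-Python sketch, with an arbitrarily chosen $2\times2$ matrix $A$ (an illustrative assumption, not from the notes): it evaluates $\Vert A\Vert_\infty$ (maximum row sum), $\Vert A\Vert_1$ (maximum column sum), and $\Vert A\Vert_2=\sqrt{\rho(A^tA)}$ via the quadratic formula for the $2\times2$ symmetric matrix $A^tA$, then samples random unit vectors to confirm $\Vert Ax\Vert_\infty\le\Vert A\Vert_\infty$ when $\Vert x\Vert_\infty=1$.

```python
# Sketch: the three induced matrix norms from the table, for an
# illustrative 2x2 matrix A, checked against the definition
#   ||A|| = max_{||x||=1} ||Ax||   by random sampling.
import math, random

A = [[2.0, -1.0],
     [1.0,  3.0]]

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

# ||A||_inf = maximum row sum of |A_ij|;  ||A||_1 = maximum column sum.
norm_inf = max(sum(abs(a) for a in row) for row in A)
norm_1 = max(sum(abs(A[i][j]) for i in range(2)) for j in range(2))

# ||A||_2 = sqrt(largest eigenvalue of A^t A); for a 2x2 symmetric matrix
# the eigenvalues come from the quadratic formula on the characteristic poly.
B = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]                                  # B = A^t A
tr = B[0][0] + B[1][1]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
lam_max = (tr + math.sqrt(tr * tr - 4 * det)) / 2
norm_2 = math.sqrt(lam_max)

# Sanity check: ||Ax||_inf over sampled unit vectors never exceeds ||A||_inf.
random.seed(0)
best = 0.0
for _ in range(2000):
    x = [random.uniform(-1, 1) for _ in range(2)]
    m = max(abs(t) for t in x)
    x = [t / m for t in x]                    # normalize so ||x||_inf = 1
    best = max(best, max(abs(t) for t in matvec(A, x)))
assert best <= norm_inf + 1e-12
```

For this $A$ the row and column sums both give $4$, while $\Vert A\Vert_2=\sqrt{(15+\sqrt{29})/2}\approx3.19$ is strictly smaller, illustrating that the three matrix norms generally differ.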
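The projection theorem and its corollaries can likewise be illustrated numerically. The sketch below works in $E^3$ with the standard inner product (inner-product example 1); the orthonormal basis $\varphi_1,\varphi_2$ and the vector $f$ are hand-picked assumptions for illustration. It computes the Fourier coefficients $\langle f,\varphi_i\rangle$, forms $P_S(f)$, and checks that the residual $f-P_S(f)$ is orthogonal to $S$ and that Bessel's inequality holds.

```python
# Sketch: projection onto a 2-dimensional subspace S of E^3 spanned by an
# (assumed, hand-picked) orthonormal basis phi_1, phi_2.
import math

def ip(u, v):
    """Inner product on E^n: (x, y) = sum_i x_i y_i."""
    return sum(a * b for a, b in zip(u, v))

s = 1 / math.sqrt(2)
basis = [(s, s, 0.0), (s, -s, 0.0)]
# Verify the basis really is orthonormal: (phi_i, phi_j) = delta_ij.
for i, p in enumerate(basis):
    for j, q in enumerate(basis):
        assert abs(ip(p, q) - (1.0 if i == j else 0.0)) < 1e-12

f = (3.0, 1.0, 4.0)
coeffs = [ip(f, p) for p in basis]            # Fourier coefficients <f, phi_i>
proj = [sum(c * p[k] for c, p in zip(coeffs, basis)) for k in range(3)]

# Corollary: the residual f - P_S(f) is orthogonal to S.
resid = [f[k] - proj[k] for k in range(3)]
assert all(abs(ip(resid, p)) < 1e-12 for p in basis)

# Corollary (Bessel): sum of squared Fourier coefficients <= <f, f>.
assert sum(c * c for c in coeffs) <= ip(f, f) + 1e-12
```

Here $S$ is the $xy$-plane, so the projection of $f=(3,1,4)$ is $(3,1,0)$ and the residual $(0,0,4)$ is visibly orthogonal to $S$; the same code works for any orthonormal basis in any $E^n$.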