\centerline{\bf CS/MATH 3414 -- Linear Algebra}\bigskip
\noindent{\bf Definition}. A vector space over the field $F$ (whose elements are called {\it scalars}) consists of a set $V$ (whose elements are called {\it vectors}) together with a binary operation $V\times V \to V$ denoted by $(x,y)\to x+y$, and a binary operation $F\times V \to V$ denoted by $(\lambda,x)\to\lambda x$, with the properties:
\item{(1)}$(x+y)+z=x+(y+z)\quad\forall x,y,z\in V$,
\item{(2)}$x+y=y+x\quad\forall x,y\in V$,
\item{(3)}$\exists$ a unique element $0\in V$ such that $x+0=x\quad\forall x\in V$,
\item{(4)}$\forall x\in V\quad\exists$ a unique element $-x\in V$ such that $x+(-x)=0$,
\item{(5)}$(\lambda+\mu)x=\lambda x+\mu x\quad\forall\lambda,\mu\in F$ and $\forall x\in V$,
\item{(6)}$(\lambda\mu)x=\lambda(\mu x)\quad\forall\lambda,\mu\in F$ and $\forall x\in V$,
\item{(7)}$\lambda(x+y)=\lambda x+\lambda y\quad\forall\lambda\in F$ and $\forall x,y\in V$,
\item{(8)}$0x=0\quad\forall x\in V$,
\item{(9)}$1x=x\quad\forall x\in V$.
\medskip\noindent{\bf Definition}. A {\it subspace\/} $S$ of a vector space $V$ is a subset $S\subset V$ that is itself a vector space under the operations of $V$.
\noindent{\bf Definition}. The vectors $x_1,\ldots,x_k\in V$ are {\it independent\/} if the linear combination $\alpha_1x_1+\cdots+\alpha_k x_k=0$ only when $\alpha_1=\cdots=\alpha_k=0$. The vectors $x_1,\ldots,x_k$ are said to {\it generate\/} (or span) $V$ if for each $x\in V$ $\exists\alpha_1,\ldots,\alpha_k\in F$ such that $x=\alpha_1x_1+\cdots+\alpha_kx_k$. The vectors $x_1,\ldots,x_k$ are a {\it basis\/} for $V$ if they are independent and generate $V$. $V$ is called {\it finite dimensional\/} if $V$ is generated by a finite set of vectors.
\medskip\noindent{\bf Theorem}. Every finite dimensional vector space $V$ has a basis.
\noindent{\bf Theorem}. Every basis of a finite dimensional vector space $V$ has the same number of elements.
\noindent{\bf Definition}.
The {\it dimension\/} dim $V$ of a finite dimensional vector space $V$ is the number of elements in a basis of $V$.
\noindent{\bf Theorem}. Let $x_1,\ldots,x_n\in V$ and dim $V=n$. Then the following are equivalent:
\item{(1)}$x_1,\ldots,x_n$ are a basis for $V$,
\item{(2)}$x_1,\ldots,x_n$ are independent,
\item{(3)}$x_1,\ldots,x_n$ generate $V$.
\medskip\hrule Examples of vector spaces:
\item{1.}$F=Q=\{$rational numbers$\}$, $V=F^n=\{n$-tuples of elements of $F\}$.
\item{2.}$F=E=\{$real numbers$\}$, $V=F^n$.
\item{3.}$F=C=\{$complex numbers$\}$, $V=F^n$.
\item{4.}$F$ a field, $V=F^{m\times n}=\{m\times n$ matrices with elements in $F\}$.
\item{5.}$F$ a field, $V={\cal P}_n(F)=\{$polynomials of degree $\le n$ with coefficients in $F\}$.
\item{6.}$F=E$, $V=C[a,b]=\{$continuous real-valued functions defined on the closed interval $[a,b]\}$.
\smallskip\hrule\medskip
\noindent{\bf Definition}. Let $V$ and $W$ be vector spaces over the field $F$. A function $T:V\to W$ is a {\it linear transformation\/} ({\it homomorphism}) if $T(\alpha x+\beta y)=\alpha T(x)+\beta T(y)$ for all $x,y\in V$ and $\alpha,\beta\in F$. The {\it kernel\/} (null space) of $T$ is $$\hbox{ker }T=\{x\in V\mid T(x)=0\}.$$ The {\it image\/} (range) of $T$ is $$\hbox{im }T=\{T(x)\mid x\in V\}.$$ The identity function $id_V$ on $V$ maps every vector to itself. $T:V\to W$ is {\it invertible\/} if $\exists$ a homomorphism $S:W\to V$ such that $S\circ T=id_V$ and $T\circ S=id_W$.
\medskip\noindent{\bf Theorem}. Let $V$ be a finite dimensional vector space and $T:V\to W$ a homomorphism. Then $$\hbox{dim }V=\hbox{dim (ker }T) + \hbox{dim (im }T).$$
\noindent{\bf Theorem}. Let $a_1,\ldots,a_n$ be a basis for the vector space $V$, and $T:V\to W$ be a homomorphism.
Then the following hold:
\item{(1)}ker $T=\{0\} \Leftrightarrow T(a_1)$, $\ldots$, $T(a_n)$ are independent;
\item{(2)}im $T=W\Leftrightarrow T(a_1)$, $\ldots$, $T(a_n)$ generate $W$;
\item{(3)}$T(a_1)$, $\ldots$, $T(a_n)$ are a basis for $W \Leftrightarrow T$ is invertible.
\def\E#1#2{E^{#1\times#2}}
\bigskip\hrule\smallskip\centerline{\bf Matrix Theory}
A real $m\times n$ matrix $A\in \E mn$ can be regarded as a linear transformation from $E^n$ to $E^m$, sending a vector $x\in E^n$ to the vector $Ax\in E^m$. Thus all of the above theorems apply to matrices.
\noindent{\bf Definition}. For a scalar $A\in \E 11$, $\det A=A$. For $A\in \E nn$, the {\it determinant\/} of $A$ is $$\det A=\sum_{i=1}^n (-1)^{i+1}A_{i1}\,\det A[i,1],$$ where $A[i,j]$ is the submatrix of $A$ obtained by deleting the $i$th row and $j$th column. The number cof~$A_{ij}=(-1)^{i+j}\det A[i,j]$ is called the {\it cofactor\/} of the $i,j$ matrix element $A_{ij}$. The {\it adjoint\/} matrix of $A$, denoted adj~$A$, has as its $i,j$ element the number cof~$A_{ji}$.
\noindent{\bf Lemma}. Let $A\in \E nn$. For any $j$, $1\le j\le n$, $$\det A=\sum_{i=1}^n (-1)^{i+j}A_{ij}\,\det A[i,j] = \sum_{i=1}^n (-1)^{j+i}A_{ji}\,\det A[j,i].$$
\noindent{\bf Lemma}. Let $A,B\in \E nn$ and $I$ denote the $n\times n$ identity matrix. Then
\item{(1)}$\det A=\det A^t$,
\item{(2)}$\det(AB)=\det A\;\det B$,
\item{(3)}$A\,(\hbox{adj }A)=(\hbox{adj }A)\,A=(\det A)I$.
\vfil\goodbreak
\noindent{\bf Theorem}. Let $A\in \E nn$. Then the following are equivalent:
\item{(1)}ker $A=\{0\}$ (columns of $A$ are independent),
\item{(2)}im $A=E^n$ (columns of $A$ generate $E^n$),
\item{(3)}the columns $A_{\cdot1},\ldots,A_{\cdot n}$ of $A$ are a basis for $E^n$,
\item{(4)}the rows $A_{1\cdot},\ldots,A_{n\cdot}$ of $A$ are a basis for $E^n$,
\item{(5)}$\det A\ne0$,
\item{(6)}$A$ is invertible.
\noindent{\bf Definition}. A matrix $A\in\E nn$ is {\it upper triangular}, {\it lower triangular}, or {\it diagonal\/} if $A_{ij}=0$ for $i>j$, $A_{ij}=0$ for $i<j$, or $A_{ij}=0$ for $i\ne j$, respectively.
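The dimension theorem dim $V$ = dim (ker $T$) + dim (im $T$) can be checked numerically for a matrix map $E^4\to E^3$. The sketch below assumes NumPy is available; the example matrix and the zero tolerance are illustrative choices, not part of the notes. The rank (dimension of the image) is computed directly, and the kernel dimension is counted independently from the singular values, so the two sides of the identity are obtained separately.

```python
import numpy as np

# A maps E^4 to E^3; its rank is 2 by construction
# (the third row is the sum of the first two).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

n = A.shape[1]                            # dim of the domain E^n
rank = np.linalg.matrix_rank(A)           # dim (im A)

# dim (ker A) counted from the SVD: each (numerically) zero
# singular value, plus each "missing" singular value when m < n,
# contributes one null-space direction.
s = np.linalg.svd(A, compute_uv=False)
nullity = int(np.sum(s < 1e-8)) + (n - len(s))

print(rank, nullity, rank + nullity == n)
```

Here rank is 2 and nullity is 2, so the identity 2 + 2 = 4 = dim $E^4$ holds as the theorem predicts.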
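The cofactor expansion defining $\det A$ and the adjoint identity $A\,(\hbox{adj }A)=(\det A)I$ of the lemma above can both be exercised in a few lines. This is a teaching sketch assuming NumPy (the helper names `det_cofactor` and `adjoint` and the test matrix are mine); cofactor expansion costs $O(n!)$ and is never used for serious numerical work.

```python
import numpy as np

def det_cofactor(A):
    """Determinant by cofactor expansion along the first column,
    exactly as in the definition (0-based i replaces (-1)^(i+1))."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** i * A[i, 0]
               * det_cofactor(np.delete(np.delete(A, i, 0), 0, 1))
               for i in range(n))

def adjoint(A):
    """adj A: its (i,j) element is the cofactor of A_{ji}."""
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, j, 0), i, 1)  # A[j,i]
            adj[i, j] = (-1) ** (i + j) * det_cofactor(minor)
    return adj

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

d = det_cofactor(A)
# Lemma (3): A (adj A) = (adj A) A = (det A) I.
print(np.allclose(A @ adjoint(A), d * np.eye(3)),
      np.allclose(adjoint(A) @ A, d * np.eye(3)))
```

Since $\det A\ne0$ here, the same identity gives the explicit inverse $A^{-1}=(\hbox{adj }A)/\det A$, connecting statements (5) and (6) of the final theorem.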