Linear Transformation

$$\newcommand{\0}{\textbf 0} \newcommand{\1}{\textbf 1} \newcommand{\x}{\boldsymbol x} \newcommand{\y}{\boldsymbol y} \newcommand{\z}{\boldsymbol z} \newcommand{\u}{\boldsymbol u} \newcommand{\v}{\boldsymbol v} \newcommand{\w}{\boldsymbol w} \newcommand{\L}{\mathcal L} \newcommand{\vn}{\{ {\boldsymbol v}_1, {\boldsymbol v}_2,\ldots,{\boldsymbol v}_n\} } \newcommand{\wn}{\{ {\boldsymbol w}_1, {\boldsymbol w}_2,\ldots,{\boldsymbol w}_n\} } \newcommand{\Ker}{\text{ker}} \renewcommand{\Im}{\text{Im}} \newcommand{\Span}{\text{span}} \newcommand{\Norm}{\unlhd} \newcommand{\Null}{\text{nullity}} \newcommand{\Rank}{\text{rank}} \newcommand{\Hom}{\text{Hom}} $$

1. Linear Transformation

Definition. Linear transformation

Let $V$ and $W$ be vector spaces (over $F$). We call a function $T:V\to W$ a linear transformation from $V$ to $W$ if for all $\x,\y\in V$ and all $c\in F$,

$$\begin{aligned} T(\x+\y)&=T(\x)+T(\y)\\ T(c\x)&=cT(\x) \end{aligned} $$

If $T$ is a linear transformation, we often simply say $T$ is linear. One can check that $T$ is linear iff for any vectors $\{\x_k\}_{k=1}^n\subset V$ and scalars $\{a_k\}_{k=1}^n\subset F$, we have

$$T(\sum_{k=1}^n a_k\x_k)=\sum_{k=1}^na_kT(\x_k) $$

By definition, a linear transformation is precisely a module homomorphism in $\Hom_F(V,W)$.
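As an illustrative sketch (the matrix and vectors below are hypothetical choices, not from the text), the two defining axioms can be checked numerically for a matrix map $T(\x)=A\x$, which is always linear:

```python
# Numeric check of the two linearity axioms for T(x) = A x.
# A, x, y, c are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))          # T: R^3 -> R^2, T(x) = A @ x
T = lambda x: A @ x

x, y = rng.standard_normal(3), rng.standard_normal(3)
c = 2.5

additive = np.allclose(T(x + y), T(x) + T(y))    # T(x+y) = T(x) + T(y)
homogeneous = np.allclose(T(c * x), c * T(x))    # T(cx) = c T(x)
print(additive, homogeneous)                     # True True
```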

Definition. Identity transformation
For vector space $V$ over $F$, the identity transformation $I_V$ is defined as follows

$$\begin{aligned} I_V:V&\to V\\ \v&\mapsto \v \end{aligned} $$

We can write $I$ instead of $I_V$

Definition. Zero transformation
For vector space $V$ and $W$ over $F$, the zero transformation $T_0$ is defined as follows

$$\begin{aligned} T_0:V&\to W\\ \x&\mapsto \0 \end{aligned} $$

Definition. Null space / kernel, range
Let $V$ and $W$ be vector spaces and let $T:V\to W$ be linear. We define the null space $N(T)$ / kernel $\Ker(T)$ of $T$ as

$$\Ker(T):=\{\x\in V\mid T(\x)=\0\}=:T^{-1}(\{\0\}) $$

The range / image of $T$ is defined as $\Im(T):=\{T(\v)\mid \v\in V\}=:T(V)$
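As a concrete sketch (the matrix $A$ below is an illustrative choice), bases for $\Ker(T)$ and $\Im(T)$ of a matrix map $T(\x)=A\x$ can be read off from the singular value decomposition of $A$:

```python
# Sketch: right-singular vectors with (near-)zero singular values span ker(T);
# left-singular vectors with nonzero singular values span Im(T).
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])            # rank 1: the rows are proportional
U, s, Vt = np.linalg.svd(A)             # full SVD: Vt is 3x3
tol = 1e-10
rank = int((s > tol).sum())

kernel_basis = Vt[rank:].T              # basis of ker(T), shape (3, 2)
image_basis = U[:, :rank]               # basis of Im(T), shape (2, 1)

# every kernel basis vector is mapped to 0
print(np.allclose(A @ kernel_basis, 0))  # True
```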

Theorem. Let $V$ and $W$ be vector spaces and let $T:V\to W$ be linear. Then $\Ker(T)$ is a subspace of $V$, and $\Im(T)$ is a subspace of $W$.
Pf. Easy to show using definition of subspaces.

From the perspective of groups, a subgroup $N$ is the kernel of some group homomorphism from $G$ if and only if $N\Norm G$. Since $V$ is abelian, every subgroup is normal; the kernel of a linear map is additionally closed under scalar multiplication, so it is indeed a subspace of $V$.

Theorem. Let $V$ and $W$ be vector spaces and let $T:V\to W$ be linear. If $\beta=\{\v_k\}_{k=1}^n$ is a basis for $V$, then

$$\Im(T)=\Span(T(\beta))=\Span(\{T(\v_1), T(\v_2), \ldots, T(\v_n)\}) $$

Definition. Nullity / rank
Let $V$ and $W$ be vector spaces and let $T:V\to W$ be linear. We define the nullity (denoted by $\Null(T)$) and rank of $T$ (denoted by $\Rank(T)$) as follows

$$\begin{aligned} \Null(T)&:=\dim(\Ker(T))\\ \Rank(T)&:=\dim(\Im(T)) \end{aligned} $$

Theorem. Dimension theorem
Let $V$ and $W$ be vector spaces, let $T:V\to W$ be linear. If $V$ is finite-dimensional, then

$$\Null(T)+\Rank(T)=\dim(V) $$

Pf. Let $\{\u_1,\ldots,\u_k\}$ be a basis for $\Ker(T)$ and extend it to a basis $\{\u_1,\ldots,\u_k,\v_{k+1},\ldots,\v_n\}$ for $V$. We claim $\{T(\v_{k+1}),\ldots,T(\v_n)\}$ is a basis for $\Im(T)$. It spans $\Im(T)$ because each $T(\u_i)=\0$. It is linearly independent because $\sum_{i>k} a_iT(\v_i)=\0$ implies $\sum_{i>k} a_i\v_i\in\Ker(T)$, which forces every $a_i=0$ by the linear independence of the extended basis. Hence $\Rank(T)=n-k=\dim(V)-\Null(T)$.
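A quick numeric illustration (not a proof; the matrix is a hypothetical random choice): for a matrix map $T:\mathbb R^5\to\mathbb R^3$, count kernel directions and rank from the SVD and check that they sum to $\dim(V)=5$:

```python
# Sanity check of nullity(T) + rank(T) = dim(V) for T(x) = A x, A of shape 3x5.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))                  # T: R^5 -> R^3, dim(V) = 5

U, s, Vt = np.linalg.svd(A)                      # full SVD: Vt is 5x5
kernel_vectors = [v for v in Vt if np.allclose(A @ v, 0, atol=1e-10)]
nullity = len(kernel_vectors)                    # rows of Vt mapped to 0
rank = int((s > 1e-10).sum())

print(nullity + rank)                            # 5, i.e. dim(V)
```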

Lemma. Properties of linear transformations between equal-dimensional vector spaces
Let $V$ and $W$ be vector spaces of equal (finite) dimension, and let $T:V\to W$ be linear. TFAE:

  1. $T$ is surjective
  2. $T$ is injective
  3. $\Rank(T)=\dim(V)$
    Pf. First note that $T$ is injective iff $\Ker(T)=\{\0\}$, because
$$T(\x)=T(\y)\iff T(\x-\y)=\0\iff (\x-\y)\in\Ker(T) $$

So by the Dimension Theorem, 2. and 3. are equivalent. Note that $T(V)=\Im(T)$ is a subspace of $W$, so

$$\Rank(T)=\dim(V)\iff \dim(\Im(T))=\dim(W)=\dim(V)\iff \Im(T)=W $$

that is, if a finite-dimensional space has a subspace of the same dimension, then the two are equal.
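As an illustrative sketch (the matrices below are hypothetical examples), for a square matrix the three conditions stand or fall together, and all reduce to full rank:

```python
# For equal dimensions: surjective <=> injective <=> rank = dim(V).
import numpy as np

A = np.array([[2., 0., 1.],
              [1., 1., 0.],
              [0., 0., 3.]])            # invertible (det = 6): all three hold
B = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 0., 1.]])            # singular (rows 1, 2 proportional)

print(np.linalg.matrix_rank(A))         # 3 -> bijective
print(np.linalg.matrix_rank(B))         # 2 -> neither injective nor surjective
```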

Theorem. Existence and uniqueness of a linear transformation determined on a basis
Let $V$ and $W$ be vector spaces over $F$. Suppose $\vn$ is a basis for $V$. For any vectors $\wn$ in $W$, there exists a unique linear transformation $T:V\to W$ such that

$$\forall i\in\{1,2,\ldots,n\}.T(\v_i)=\w_i $$
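A sketch of the construction (the vectors $\w_i$ below are hypothetical choices): when $\beta$ is the standard basis of $F^n$, the unique $T$ with $T(\v_i)=\w_i$ is the matrix whose $i$-th column is $\w_i$:

```python
# Build T: R^3 -> R^2 sending the standard basis vectors e_i to chosen w_i.
import numpy as np

w1, w2, w3 = np.array([1., 0.]), np.array([1., 1.]), np.array([0., 2.])
T = np.column_stack([w1, w2, w3])        # 2x3 matrix; column i is w_i

e1, e2, e3 = np.eye(3)                   # rows of I are the standard basis
print(np.allclose(T @ e1, w1),
      np.allclose(T @ e2, w2),
      np.allclose(T @ e3, w3))           # True True True
```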

Definition. Invariant
Let $V$ be a vector space, $T:V\to V$ is linear. A subspace $W$ of $V$ is said to be $T$-invariant if

$$\forall x\in W. T(x)\in W $$

that is, $T(W)\subset W$
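A small numeric sketch (the matrix is an illustrative choice): a block upper triangular $T$ leaves $W=\Span\{\boldsymbol e_1,\boldsymbol e_2\}\subset\mathbb R^3$ invariant, since the images of the spanning vectors have vanishing third coordinate:

```python
# Check T-invariance of W = span{e1, e2} by mapping a basis of W through T.
import numpy as np

T = np.array([[1., 2., 7.],
              [0., 3., 1.],
              [0., 0., 4.]])            # block upper triangular
W_basis = np.eye(3)[:, :2]              # columns e1, e2 span W

images = T @ W_basis                    # T(e1), T(e2) as columns
in_W = np.allclose(images[2, :], 0)     # third coordinate zero => image lies in W
print(in_W)                             # True
```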

2. Matrix Representation

Definition. Ordered basis
An ordered basis is nothing more than a basis endowed with a specific order.

We fix the order of the basis vectors so that a transformation can be represented unambiguously as a matrix.

Definition. Coordinate vector
Let $\beta=\{\u_1,\u_2,\ldots,\u_n\}$ be an ordered basis for a finite dimensional vector space $V$. For $x\in V$, let $a_1, a_2,\ldots, a_n$ be the unique scalars such that

$$x=\sum_{i=1}^na_i\u_i $$

We define the coordinate vector of $x$ relative to $\beta$, denoted as $[x]_\beta$, by

$$[x]_\beta=\begin{pmatrix}a_1\\a_2\\\vdots \\ a_n\end{pmatrix} $$
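As a sketch (the basis and vector below are illustrative choices): in $F^n$, the coordinate vector $[x]_\beta$ is obtained by solving the linear system $B\boldsymbol a=x$, where the columns of $B$ are the ordered basis vectors:

```python
# Compute [x]_beta by solving B a = x for the coefficient vector a.
import numpy as np

B = np.array([[1., 1.],
              [0., 1.]])                 # ordered basis beta = {(1,0), (1,1)}
x = np.array([3., 2.])

coords = np.linalg.solve(B, x)           # [x]_beta
print(coords)                            # [1. 2.]: x = 1*(1,0) + 2*(1,1)
print(np.allclose(B @ coords, x))        # True
```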

Definition. Addition and scalar multiplication of linear transformations
Let $T, U$ be linear transformations from $V$ to $W$, where $V$ and $W$ are vector spaces over $F$. Let $a\in F$ and let $x\in V$ be an arbitrary vector.
Define $T+U:V\to W$ as

$$T+U:x\mapsto T(x)+U(x) $$

and define $aT$ as

$$aT:x\mapsto a(T(x)) $$
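A quick sketch of the pointwise operations (the matrices are illustrative choices): for matrix maps, $T+U$ and $aT$ are just the matrix sum and scalar multiple:

```python
# Pointwise sum and scalar multiple of linear maps, realized as matrices.
import numpy as np

T = np.array([[1., 0.], [0., 2.]])
U = np.array([[0., 1.], [1., 0.]])
x = np.array([3., 4.])
a = 5.0

sum_ok = np.allclose((T + U) @ x, T @ x + U @ x)   # (T+U)(x) = T(x) + U(x)
scale_ok = np.allclose((a * T) @ x, a * (T @ x))   # (aT)(x) = a T(x)
print(sum_ok, scale_ok)                            # True True
```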

Theorem. The collection of all linear transformations is a vector space
Let $\L(V,W)$ denote the collection of all linear transformations from $V$ to $W$ where $V$ and $W$ are vector spaces over $F$. Then $\L(V, W)$ is a vector space over $F$.

Theorem. A composition of linear transformations is still linear
Let $T:V\to W$ and $U:W\to Z$ be linear, where $V$, $W$, $Z$ are vector spaces over $F$. Then $UT:V\to Z$ defined by $(UT)(x)=U(T(x))$ is linear.

Definition. Invertible transformation
A linear transformation $T:V\to W$ is invertible if there exists a function $U:W\to V$ such that $UT=I_V$ and $TU=I_W$. Such a $U$ is unique; we denote it by $T^{-1}$, and it is itself linear.

Definition. Standard representation
Let $V$ be an $n$-dimensional vector space over $F$ with ordered basis $\beta$. The standard representation of $V$ with respect to $\beta$ is the map $\phi_\beta:V\to F^n$ defined by $\phi_\beta(x)=[x]_\beta$.

Definition. Similar matrices
Let $A$ and $B$ be matrices in $M_{n\times n}(F)$. We say that $A$ is similar to $B$ if there exists an invertible matrix $Q$ such that $A=Q^{-1}BQ$

The relation "is similar to" is an equivalence relation: take $Q=I$ for reflexivity, replace $Q$ by $Q^{-1}$ for symmetry, and multiply the two witnessing matrices for transitivity.
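A numeric sketch (the matrices are hypothetical random choices): if $A=Q^{-1}BQ$, then $A$ and $B$ share similarity invariants such as trace and determinant:

```python
# Construct a pair of similar matrices and compare their invariants.
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
Q = rng.standard_normal((3, 3))          # almost surely invertible
A = np.linalg.inv(Q) @ B @ Q             # A is similar to B

trace_ok = np.isclose(np.trace(A), np.trace(B))
det_ok = np.isclose(np.linalg.det(A), np.linalg.det(B))
print(trace_ok, det_ok)                  # True True
```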